Posts Tagged ‘DevoxxBE2012’
[DevoxxBE2012] Re-imagining the Browser with AngularJS
Misko Hevery and Igor Minar, Google engineers and AngularJS co-leads, re-envisioned client-side development. Misko, an Agile coach with a long record of open-source contributions, partnered with Igor, who focuses on developer tools, to showcase AngularJS’s approach to simplifying web apps.
They posited extending the browser with declarative HTML and JavaScript, reducing code while enhancing readability. AngularJS bridges to future standards like web components and model-driven views.
Misko demonstrated data binding, where models sync with views automatically, eliminating manual DOM manipulation. Directives extend HTML, creating custom elements for reusability.
Igor highlighted dependency injection for modularity, and services for shared logic. Routing enables single-page apps, with controllers managing scopes.
They emphasized testability, with built-in mocking and end-to-end testing.
Declarative UI and Data Binding
Misko illustrated two-way binding: changes in models update views, and vice versa, without boilerplate. This declarative paradigm contrasts with imperative jQuery-style coding.
Directives like ng-repeat generate lists dynamically, while filters format data.
Modularity and Dependency Management
Igor explained modules encapsulating functionality, injected via DI. This promotes clean, testable code.
Services, factories, and providers offer flexible creation patterns.
Routing and Application Structure
ngRoute handles navigation, loading templates and their controllers. Scopes isolate data, with prototypal inheritance providing the hierarchy.
Testing and Future Alignment
Angular’s design facilitates unit and e2e tests, using Karma and Protractor.
They previewed alignment with web components, where directives become custom tags.
In Q&A, they compared Angular to Knockout.js, noting Angular’s broader framework scope versus Knockout’s focus as a library.
Misko and Igor’s presentation framed AngularJS as transformative, anticipating browser evolutions while delivering immediate productivity.
[DevoxxBE2012] Real World Java EE
Adam Bien, a renowned Java consultant, author, and Java Champion with decades of experience in enterprise applications, delivered an impromptu yet insightful session on practical Java EE usage. Adam, known for books like “Real World Java EE Patterns,” shared real-world insights from his freelance projects, emphasizing simplicity and effectiveness over hype.
He started by recounting his journey since 1995, favoring Java EE for its robustness in production. Adam advocated thin WAR deployments, bundling everything into a single archive for easy management, contrasting with complex EAR structures.
Discussing boundaries, Adam promoted domain-driven design, where entities form the core, with services as facades. He cautioned against unnecessary abstractions, like excessive DAOs, favoring direct JPA usage.
Adam highlighted CDI’s power for dependency injection, reducing boilerplate. He demonstrated boundary-control-entity patterns, where boundaries handle transactions, controls manage logic, and entities persist data.
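A minimal sketch of that boundary-control-entity split, using standard Java EE 6 APIs; the class names (OrderBoundary, OrderController, CustomerOrder) are illustrative rather than taken from Adam’s demo, and each class would live in its own file.

```java
import javax.ejb.Stateless;
import javax.inject.Inject;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;

@Entity
public class CustomerOrder {                 // entity: persistent domain state
    @Id @GeneratedValue private Long id;
    private String customer;
    // getters and setters omitted for brevity
}

@Stateless
public class OrderBoundary {                 // boundary: transactional facade exposed to clients
    @Inject OrderController controller;      // CDI injection, no factories or JNDI lookups

    public void placeOrder(CustomerOrder order) {   // runs in a container-managed transaction
        controller.validate(order);
        controller.persist(order);
    }
}

public class OrderController {               // control: plain CDI bean holding the business logic
    @PersistenceContext EntityManager em;    // direct JPA use, no extra DAO layer

    void validate(CustomerOrder order) { /* domain checks */ }
    void persist(CustomerOrder order)  { em.persist(order); }
}
```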
Deployment Strategies and Simplicity
Adam stressed deploying as WARs to containers like GlassFish, avoiding heavy setups. He shared experiences with what were effectively microservices before the term existed, using REST for inter-service communication.
He critiqued over-engineering, like misusing ESBs for simple integrations, preferring lightweight approaches.
Testing and Quality Assurance
Adam advocated comprehensive testing: unit with JUnit, integration with Arquillian, and UI with Selenium/Graphene. He demonstrated embedding containers for realistic tests.
He emphasized boundary testing, simulating real scenarios without mocks where possible.
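As a sketch of what such a test can look like with Arquillian (assuming the hypothetical OrderBoundary classes from the earlier example and a test persistence unit), the deployment is packaged as a thin WAR and the boundary is injected straight into the test:

```java
import javax.inject.Inject;
import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.assertNotNull;

@RunWith(Arquillian.class)
public class OrderBoundaryIT {

    @Deployment
    public static WebArchive createDeployment() {
        // package only what the test needs into a thin WAR;
        // a test persistence.xml would be added the same way for JPA
        return ShrinkWrap.create(WebArchive.class)
                .addClasses(OrderBoundary.class, OrderController.class, CustomerOrder.class)
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Inject
    OrderBoundary boundary;   // injected by the container the test runs in

    @Test
    public void placesAnOrderThroughTheRealBoundary() {
        CustomerOrder order = new CustomerOrder();
        boundary.placeOrder(order);   // exercises the real stack, no mocks
        assertNotNull(boundary);
    }
}
```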
Performance and Scalability
Discussing optimization, Adam noted Java EE’s built-in clustering, but advised measuring first. He shared cases where simple configurations sufficed, avoiding premature scaling.
Community and Best Practices
Adam encouraged open-source contributions, sharing his GitHub projects. He addressed common pitfalls, like session misuse, advocating stateless designs.
In Q&A, he discussed tools like Jenkins for CI and IDEs like IntelliJ.
Adam’s talk reinforced Java EE’s viability for real-world applications through pragmatic, lean practices.
[DevoxxBE2012] Home Automation for Geeks
Thomas Eichstädt-Engelen and Kai Kreuzer, both prominent figures in the open-source home automation scene, presented an engaging exploration of openHAB. Thomas, a senior consultant at innoQ with expertise in Eclipse technologies and OSGi, teamed up with Kai, a software architect at Deutsche Telekom specializing in IoT and smart homes, to demonstrate how openHAB transcends basic home control systems. Their session highlighted the project’s capabilities for geeks, running on affordable devices like the Raspberry Pi while offering advanced features such as presence simulation, sensor data visualization, and integration with calendars.
They began by challenging common perceptions of home automation, often limited to remote light switching or shutter control via smartphones. Kai and Thomas emphasized openHAB’s open-source ethos, allowing extensive customization beyond commercial offerings. The framework’s modular architecture, built on OSGi, enables easy extension to connect with diverse protocols and devices.
A live demo showcased openHAB’s runtime on embedded hardware, illustrating rule-based automation. For instance, they configured scenarios where motion sensors trigger lights or simulate occupancy during absences. Integration with Google Calendar for irrigation scheduling demonstrated practical, intelligent applications.
Thomas and Kai stressed the project’s appeal to Java and OSGi enthusiasts, featuring an Xbase-derived scripting language for defining complex logic. This allows developers to craft rules reacting to events like temperature changes or user inputs.
Core Concepts and Architecture
Kai outlined openHAB’s structure: a core runtime managing bindings to hardware protocols (e.g., Z-Wave, KNX), persistence services for data storage, and user interfaces. Bindings abstract device interactions, making the system protocol-agnostic. Persistence handles logging sensor data to databases like MySQL or InfluxDB for historical analysis.
Thomas highlighted the OSGi foundation, where bundles dynamically add functionality. This modularity supports community-contributed extensions, fostering a vibrant ecosystem.
Advanced Automation and Integration
The duo delved into rule engines, where scripts automate responses. Examples included voice commands via integrations or mobile apps notifying users of anomalies. They showcased charts displaying energy consumption or environmental metrics, aiding in optimization.
Integration with external services, like weather APIs for proactive heating adjustments, illustrated openHAB’s extensibility.
User Interfaces and Accessibility
Kai demonstrated multiple UIs: web-based dashboards, mobile apps, and even voice assistants. The sitemap concept organizes controls intuitively, while HABPanel offers customizable widgets.
Thomas addressed security, recommending VPNs for remote access and encrypted communications.
Community and Future Developments
They noted the growing community, with over 500 installations and active contributors. Future plans include simplified binding creation guides, archetypes for new developers, and enhanced UIs like MGWT.
In Q&A, they discussed hardware support and integration challenges, encouraging participation.
Thomas and Kai’s presentation positioned openHAB as a powerful, developer-friendly platform for innovative home automation, blending Java prowess with real-world utility.
[DevoxxBE2012] Architecture All the Way Down
Kirk Knoernschild, a software developer passionate about modular systems and author of “Java Application Architecture,” explored the pervasive nature of architecture in software. Kirk, drawing from his book on OSGi patterns, challenged traditional views, arguing architecture permeates all levels—from high-level designs to code.
He invoked the “turtles all the way down” anecdote to illustrate architecture’s recursive essence: decisions at every layer impact flexibility. Kirk critiqued ivory-tower approaches, advocating collaborative, iterative practices aligning business and technology.
Paradoxically, architecture aims for both resistance to change and adaptability to it. The temporal dimension, how long a decision must hold, shapes modularity: stable elements form foundations, while volatile ones stay flexible.
Kirk linked SOA’s service granularity to modularity, noting services as deployable units fostering reuse. He emphasized patterns ensuring evolvability without rigidity.
Demystifying Architectural Paradoxes
Kirk elaborated on architecture’s dual goals: stability against volatility. He used examples where over-design stifles agility, advocating minimal upfront planning with evolutionary refinement.
Temporal hierarchies classify decisions by change frequency: strategic (years), tactical (months), operational (days). This guides layering: stable cores support variable extensions.
Granularity and Modularity Principles
Discussing granularity, Kirk warned against extremes: monolithic systems hinder reuse, while overly fine-grained ones increase complexity. Patterns such as abstract modules and dependency injection promote loose coupling.
He showcased OSGi’s runtime modularity, enforcing boundaries via exports/imports, preventing spaghetti code.
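A generic illustration (not taken from Kirk’s slides) of how a bundle manifest expresses those boundaries: only packages listed under Export-Package are visible to other bundles, while everything the bundle needs must be declared under Import-Package.

```
Bundle-SymbolicName: com.example.billing
Bundle-Version: 1.0.0
Export-Package: com.example.billing.api;version="1.0.0"
Import-Package: com.example.customer.api;version="[1.0,2.0)"
```

Implementation packages left off the export list stay internal, so the framework, rather than convention, keeps other modules from reaching into them.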
Linking Design to Temporal Decisions
Kirk connected design principles—SOLID—to temporal aspects: single responsibility minimizes change impact; open-closed enables extension without modification.
He illustrated with code: classes as the smallest modules, packages as mid-level groupings, and OSGi bundles as deployable units.
SOA and Modular Synergies
In SOA, services mirror modules: autonomous, composable. Kirk advocated aligning service boundaries with business domains, using modularity patterns for internal structure.
He critiqued layered architectures fostering silos, preferring vertical slices for cohesion.
Practical Implementation and Tools
Kirk recommended modular frameworks like OSGi or Jigsaw but stressed design paradigms over tools; his pattern catalog aids in designing evolvable systems.
He concluded: multiple communication levels—classes to services—enhance understanding, urging focus on modularity for adaptive software.
Kirk’s insights reframed architecture as holistic, from code to enterprise, essential for enduring systems.
[DevoxxBE2012] 10 Months of MongoDB at Nokia Entertainment Bristol
Tom Coupland, a senior engineer at Nokia Entertainment Bristol with expertise in data-centric applications, shared the journey of adopting MongoDB within his team. Tom, focused on backend services for Nokia’s music app, described how a small group of developers introduced MongoDB, overcoming organizational hurdles to integrate it successfully.
He set the context: a team of about 40 developers building a service-oriented architecture behind mobile clients. This created numerous fine-grained services with distinct persistence needs, prompting exploration beyond traditional relational databases like Oracle or MySQL.
The motivation stemmed from simplicity and speed. Dissatisfied with ORM complexities in Hibernate, they sought alternatives. MongoDB’s schema-less design and JSON-like documents aligned with their data models, reducing mapping overhead.
Tom recounted the adoption process: starting with self-education via books and conferences, then prototyping a service. Positive results—faster development, easier scaling—led to pitching it internally. They emphasized MongoDB’s fit for document-oriented data, like user profiles, over relational joins.
Gaining acceptance involved demonstrating benefits: quicker iterations, no schema migrations during development, and horizontal scaling via sharding. Administrators appreciated operational simplicity, despite initial concerns over maturity.
Initial Exploration and Justification
Tom detailed early experiments: evaluating against Postgres, appreciating MongoDB’s query language and aggregation framework. They addressed CAP theorem trade-offs, opting for consistency over availability for their use cases.
Prototypes showcased rapid schema evolution without downtime, crucial for agile environments.
Implementation and Lessons Learned
In production, they used Java drivers with Jackson for serialization, avoiding ORMs like Morphia for control. Tom discussed indexing strategies, ensuring queries hit indexes via explain plans.
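A rough sketch of that driver-plus-Jackson style, written against the 2.x Java driver API of the period; the collection, field, and UserProfile class are invented for illustration.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.Mongo;
import com.mongodb.util.JSON;

public class UserProfileStore {
    private final ObjectMapper mapper = new ObjectMapper();
    private final DBCollection profiles;

    public UserProfileStore(Mongo mongo) {
        DB db = mongo.getDB("entertainment");
        profiles = db.getCollection("userProfiles");
        // make sure the common lookup hits an index (verify with explain())
        profiles.ensureIndex(new BasicDBObject("userId", 1));
    }

    public void save(UserProfile profile) throws Exception {
        // Jackson maps the domain object to JSON, the driver parses it into a document,
        // keeping full control of the mapping without an ODM such as Morphia
        String json = mapper.writeValueAsString(profile);
        profiles.save((DBObject) JSON.parse(json));
    }

    public UserProfile findByUserId(String userId) throws Exception {
        DBObject doc = profiles.findOne(new BasicDBObject("userId", userId));
        return doc == null ? null : mapper.readValue(doc.toString(), UserProfile.class);
    }
}

// UserProfile is assumed to be a plain POJO with a userId field and a default constructor.
```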
Challenges included data modeling: denormalizing for read efficiency, managing large arrays. They learned to monitor operations, using MMS for insights.
Performance tuning involved selecting shard keys that balance data distribution.
Organizational Integration and Expansion
Convincing peers involved code reviews showing cleaner implementations. Managers saw productivity gains.
Tom noted that the adoption opened doors to further experimentation: JVM languages like Scala and other stores like Neo4j.
He advised evaluating tools holistically, considering added complexities.
In Q&A, Tom clarified that validation happens at the application level and that Morphia was set aside in favor of direct driver control.
His narrative illustrated grassroots adoption driving technological shift, emphasizing simplicity in complex ecosystems.
[DevoxxBE2012] FastOQL – Fast Object Queries for Hibernate
Srđan Luković, a software developer at SOL Software, alongside Žarko Mijailovic and Dragan Milicev from the University of Belgrade, presented a groundbreaking solution to a persistent challenge in Hibernate development. Žarko, a senior Java EE developer and PhD candidate with deep involvement in model-driven frameworks like SOLoist4, led the discussion on FastOQL, a Java library that transforms complex Object Query Language (OQL) statements into highly optimized SQL, addressing Hibernate’s HQL performance bottlenecks in large-scale databases.
The trio began by dissecting the limitations of HQL queries, which often generate inefficient joins when traversing class hierarchies or association tables, leading to sluggish execution on voluminous datasets. FastOQL emerges as a targeted remedy, compiling OQL into minimal-join SQL tailored for Hibernate environments. Srđan illustrated this with examples involving inheritance hierarchies and many-to-many relationships, where FastOQL drastically reduces query complexity without sacrificing the object-oriented expressiveness of OQL.
Žarko delved into the library’s design, emphasizing its derivation from SOL Software’s proprietary persistence layer, ensuring seamless integration as an HQL alternative. Dragan, an associate professor and department chair at the Faculty of Electrical Engineering, provided theoretical grounding, explaining how FastOQL’s strategy leverages specific mappings—like single-table inheritance and association tables—to eliminate unnecessary joins, yielding substantial performance gains in real-world scenarios.
A live demonstration highlighted FastOQL’s prowess: compiling an OQL query spanning multiple entities resulted in SQL with fewer tables and faster retrieval times compared to equivalent HQL. The speakers underscored its focus on prevalent Hibernate mappings, driven by practical observations from blogs, documentation, and industry recommendations. In Q&A, they addressed benchmarking queries, affirming that while initial efforts targeted these mappings for maximal impact, future expansions could encompass others, rooted in FastOQL’s extensible architecture.
FastOQL stands as a beacon for developers grappling with scalable persistence, marrying OQL’s conciseness with SQL’s efficiency to foster maintainable, high-velocity applications in enterprise settings.
Tackling HQL’s Performance Hurdles
Žarko unpacked HQL’s pitfalls, where intricate joins across polymorphic classes inflate query costs. FastOQL counters this by analyzing object structures to prune redundant associations, delivering lean SQL that preserves relational integrity while accelerating data access.
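For context, a generic Hibernate illustration (not FastOQL’s own API) of where those joins come from: a polymorphic query over a joined-subclass hierarchy forces Hibernate to touch every subclass table.

```java
import java.math.BigDecimal;
import java.util.List;
import org.hibernate.Session;

public class PaymentQueries {
    // Payment is an assumed abstract entity with CreditCardPayment and BankTransferPayment
    // subclasses mapped with a joined ("table per subclass") inheritance strategy.
    @SuppressWarnings("unchecked")
    public List<Payment> findLargePayments(Session session) {
        // Hibernate outer-joins every subclass table to materialize each Payment;
        // this is the join growth FastOQL's generated SQL aims to prune.
        return session
                .createQuery("from Payment p where p.amount > :min")
                .setParameter("min", new BigDecimal("1000"))
                .list();
    }
}
```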
OQL Compilation Mechanics
Srđan demonstrated the compilation pipeline, where OQL expressions map directly to optimized SQL via Hibernate’s session factory. This process ensures type-safe queries remain portable, sidestepping the verbosity of native SQL while inheriting Hibernate’s caching benefits.
Real-World Mapping Strategies
Dragan highlighted FastOQL’s affinity for common patterns, such as table-per-class hierarchies and intermediary tables for collections. By prioritizing these, the library achieves dramatic throughput improvements, particularly in inheritance-heavy domains like content management or e-commerce.
Integration and Future Prospects
The presentation touched on FastOQL’s Hibernate-centric origins, with plans to broaden mapping support. Žarko encouraged exploration via SOL Software’s resources, positioning it as a vital evolution for object-relational mapping in demanding environments.
[DevoxxBE2012] What’s New in Groovy 2.0?
Guillaume Laforge, the Groovy Project Lead and a key figure in its development since its inception, provided an extensive overview of Groovy’s advancements. Guillaume, employed by the SpringSource division of VMware at the time, highlighted how Groovy enhances developer efficiency and runtime speed with each iteration. He began by recapping essential elements from Groovy 1.8 before delving into the innovations of version 2.0, emphasizing its role as a versatile language on the JVM.
Guillaume underscored Groovy’s appeal as a scripting alternative to Java, offering dynamic capabilities while allowing modular usage for those not requiring full dynamism. He illustrated this with examples of seamless integration, such as embedding Groovy scripts in Java applications for flexible configurations. This approach reduces boilerplate and fosters rapid prototyping without sacrificing compatibility.
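A small sketch of what that embedding can look like from the Java side, using Groovy’s standard GroovyShell API; the script text and variable names are made up for illustration.

```java
import groovy.lang.Binding;
import groovy.lang.GroovyShell;

public class PricingConfig {
    public static void main(String[] args) {
        // expose Java-side data to the script
        Binding binding = new Binding();
        binding.setVariable("basePrice", 100);

        // a pricing rule kept as a Groovy script, editable without recompiling the application
        String script = "basePrice * 1.21";   // e.g. apply VAT

        GroovyShell shell = new GroovyShell(binding);
        Object total = shell.evaluate(script);

        System.out.println("Total price: " + total);
    }
}
```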
Transitioning to performance, Guillaume discussed optimizations in method invocation and arithmetic operations, which contribute to faster execution. He also touched on library enhancements, like improved date handling and JSON support, which streamline common tasks in enterprise environments.
A significant portion focused on modularity in Groovy 2.0, where the core is split into smaller jars, enabling selective inclusion of features like XML processing or SQL support. This granularity aids in lightweight deployments, particularly in constrained settings.
Static Type Checking for Reliability
Guillaume elaborated on static type checking, a flagship feature allowing early error detection without runtime overhead. He demonstrated annotating classes with @TypeChecked to enforce type safety, catching mismatches in assignments or method calls at compile time. This is particularly beneficial for large codebases, where dynamic typing might introduce subtle bugs.
He addressed extensions for domain-specific languages, ensuring type inference works even in complex scenarios like builder patterns. Guillaume showed how this integrates with IDEs for better code completion and refactoring support.
Static Compilation for Performance
Another cornerstone, static compilation via @CompileStatic, generates bytecode akin to Java’s, bypassing dynamic dispatch for speed gains. Guillaume benchmarked scenarios where this yields up to tenfold improvements, ideal for performance-critical sections.
He clarified that dynamic features remain available selectively, allowing hybrid approaches. This flexibility positions Groovy as a bridge between scripting ease and compiled efficiency.
InvokeDynamic Integration and Future Directions
Guillaume explored JDK7’s invokedynamic support, optimizing dynamic calls for better throughput. He presented metrics showing substantial gains in invocation-heavy code, aligning Groovy closer to Java’s performance.
Looking ahead, he previewed Groovy 2.1 enhancements, including refined type checking for DSLs and complete invokedynamic coverage. For Groovy 3.0, a revamped meta-object protocol, Java 8 lambda compatibility, and a new ANTLR4-based parser were on the horizon.
In Q&A, Guillaume addressed migration paths and community contributions, reinforcing Groovy’s evolution as responsive to user needs.
His session portrayed Groovy as maturing into a robust, adaptable toolset for modern JVM development, balancing dynamism with rigor.
[DevoxxBE2012] Why & How: JSON Validation with JSON Schema and Jackson
Stephane Rondal, co-founder of Arexo Consulting and a Java EE expert, introduced JSON Schema validation using Jackson. Stephane, with decades in software architecture, explained JSON’s ubiquity in web 2.0, mobile, and RESTful services, yet noted it lacks structural validation comparable to what XML Schema provides.
He advocated JSON Schema for defining constraints like types, formats, and cardinalities, ensuring data integrity. Benefits include self-documenting APIs, early error detection, and improved interoperability.
Drawbacks include added complexity, performance overhead, and an evolving specification (draft 3 at the time; later drafts have since superseded it).
Stephane demonstrated schema creation for documents with headers and items, specifying required fields and enums.
Using Jackson’s JsonSchema module, he validated instances, catching issues like type mismatches.
Implementing Validation in Practice
Stephane integrated validation post-parsing, using ObjectMapper and JsonNode for tree traversal. Tests showed valid/invalid responses, with errors reported clearly.
He recommended the Jackson-based validator for its maintenance and spec adherence.
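A condensed sketch of that flow, assuming the Jackson-based json-schema-validator library (the com.github.fge artifact; exact package names vary between versions, and the schema below uses the later draft-04 syntax for brevity); the document structure is a simplified stand-in for his header/items example.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

public class OrderDocumentValidator {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // a simplified schema: a required status constrained to an enum,
        // plus an items array that must not be empty
        JsonNode schemaNode = mapper.readTree(
            "{ \"type\": \"object\","
          + "  \"required\": [\"status\", \"items\"],"
          + "  \"properties\": {"
          + "    \"status\": { \"type\": \"string\", \"enum\": [\"NEW\", \"SHIPPED\"] },"
          + "    \"items\":  { \"type\": \"array\", \"minItems\": 1 }"
          + "  } }");

        // an instance that violates the schema: unknown enum value and empty items
        JsonNode document = mapper.readTree(
            "{ \"status\": \"UNKNOWN\", \"items\": [] }");

        JsonSchema schema = JsonSchemaFactory.byDefault().getJsonSchema(schemaNode);
        ProcessingReport report = schema.validate(document);

        System.out.println("valid? " + report.isSuccess());   // false
        System.out.println(report);                           // lists each violation
    }
}
```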
[DevoxxBE2012] Back to the Future: Taking Arduino Back to Its Java Roots to Move Forward
James Caska, creator of VirtualBreadboard and Muvium, presented on revitalizing Arduino by reconnecting it to its Java origins. James, focused on bridging software and hardware, argued that Arduino’s evolution from Processing has led to fragmentation, and his Muvium V18’O plugin restores Java compatibility for enhanced development.
He traced Arduino’s lineage: from Processing (Java-based), forking to Wiring, then Arduino, and MPIDE. This divergence created “Not-Quite-Java” (NQJ), limiting features like objects and exceptions.
Muvium integrates an Ahead-Of-Time compiler, USB HID programmer, and emulator into Processing, supporting full Java. Benefits include ecosystem ties, teaching suitability, dynamic features, and emulation for testing.
James demonstrated installation, emulation of circuits, and code execution on emulated Arduino, showing seamless virtual-to-real transitions.
He emphasized Java’s advantages for complex projects, with threading and libraries expanding Arduino’s scope.
Historical Context and Evolution Challenges
James outlined Processing’s artistic roots, evolving to Wiring for hardware, then Arduino’s accessibility focus. Forks caused incompatibilities, straying from Java’s power.
Muvium reintroduces Java, compiling to bytecode for microcontrollers like PIC, with potential AVR/ARM ports.
Practical Demonstration and Features
In demos, James showed VBB emulating breadboards, integrating Muvium for Java coding. Features like garbage collection and inheritance simplify writing and sharing reusable code.
Emulation tests code virtually, ideal for education and collaboration.
Future Expansions and Community Call
James discussed multicore support leveraging Java threads, and FPGA integrations for smaller footprints.
He invited contributions to Frappuccino libraries, broadening Arduino’s appeal to Java developers.
James’s talk positioned Muvium as a forward step, merging Arduino’s simplicity with Java’s robustness.
[DevoxxBE2012] When Geek Leaks
Neal Ford, a software architect at ThoughtWorks and author known for his work on enterprise applications, delivered a keynote exploring “geek leaking”—the spillover of deep expertise from one domain into another, fostering innovation. Neal, an international speaker with insights into design and delivery, tied this concept to his book “Presentation Patterns,” but expanded it to broader intellectual pursuits.
He defined “geek” as an enthusiast whose passion in one area influences others, creating synergies. Neal illustrated with examples like Richard Feynman’s interdisciplinary contributions, from physics to biology, showing how questioning fundamentals drives breakthroughs.
Neal connected this to software, urging developers to apply scientific methods—hypothesis, experimentation, analysis—to projects. He critiqued over-reliance on authority, advocating first-principles thinking to challenge assumptions.
Drawing from history, Neal discussed how paradigm shifts, like Galileo’s heliocentrism, exemplify geek leaking by integrating new evidence across fields.
In technology, he highlighted tools enabling this, such as domain-specific languages blending syntaxes for efficiency.
Origins of Intellectual Cross-Pollination
Neal traced geek leaking to Feynman’s life, where physics informed lock-picking and bongo playing, emphasizing curiosity over rote knowledge. He paralleled this to software, where patterns from one language inspire another.
He referenced Thomas Kuhn’s “Structure of Scientific Revolutions,” explaining how anomalies lead to paradigm shifts, akin to evolving tech stacks.
Applying Scientific Rigor in Development
Neal advocated embracing hypotheses in coding, testing ideas empirically rather than debating theoretically. He cited examples like performance tuning, where measurements debunk intuitions.
He introduced the “jeweler’s hammer”—gentle taps revealing flaws—urging subtle probes in designs to uncover weaknesses early.
Historical Lessons and Modern Tools
Discussing the Challenger disaster, Neal showed how Feynman’s simple O-ring demonstration exposed engineering flaws, stressing clarity in communication.
He critiqued poor presentations, linking to Edward Tufte’s analysis of the Columbia shuttle slides, where critical details buried in dense bullet points contributed to the tragedy.
Neal promoted tools like DSLs for expressive code, and polyglot programming to borrow strengths across languages.
Fostering Innovation Through Curiosity
Encouraging geek leaking, Neal suggested exploring adjacent fields, like biology informing algorithms (genetic programming).
He emphasized self-skepticism, quoting Feynman on fooling oneself, and applying scientific method to validate ideas.
Neal concluded by urging first-principles reevaluation, ensuring solutions align with core problems, not outdated assumptions.
His keynote inspired developers to let expertise leak, driving creative, robust solutions.