Posts Tagged ‘DevoxxBE2023’
[DevoxxBE2023] How Sand and Java Create the World’s Most Powerful Chips
Johan Janssen, an architect at ASML, captivated the DevoxxBE2023 audience with a deep dive into the intricate process of chip manufacturing and the role of Java in optimizing it. Johan, a seasoned speaker and JavaOne Rock Star, explained how ASML’s advanced lithography machines, powered by Java-based software, enable the creation of cutting-edge computer chips used in devices worldwide.
From Sand to Silicon Wafers
Johan began by demystifying chip production, starting with silica sand, an abundant resource transformed into silicon ingots and sliced into wafers. These wafers, approximately 30 cm in diameter, serve as the foundation for chips, hosting up to 600 chips per wafer or thousands for smaller sensors. He passed around a wafer adorned with Java’s mascot, Duke, illustrating the physical substrate of modern electronics.
The process involves printing multiple layers—up to 200—onto wafers using extreme ultraviolet (EUV) lithography machines. These machines, requiring four Boeing 747s for transport, achieve precision at the nanometer scale, with transistors as small as three nanometers. Johan likened this to driving a car 300 km and retracing the path with only 2 mm deviation, highlighting the extraordinary accuracy required.
The Role of EUV Lithography
Johan detailed the EUV lithography process, where tin droplets are hit by a 40-kilowatt laser to generate plasma at sun-like temperatures, producing EUV light. This light, directed by ultra-flat mirrors, patterns wafers through reticles costing €250,000 each. The process demands cleanroom environments, as even a single dust particle can ruin a chip, and involves continuous calibration to maintain precision across thousands of parameters.
ASML’s machines, some over 30 years old, remain in use for producing sensors and less advanced chips, demonstrating their longevity. Johan also previewed future advancements, such as high numerical aperture (NA) machines, which will enable even smaller transistors, further enhancing chip performance and energy efficiency.
Java-Powered Analytics Platform
At the heart of Johan’s talk was ASML’s Java-based analytics platform, which processes 31 terabytes of data weekly to optimize chip production. Built on Apache Spark, the platform distributes computations across worker nodes, supporting plugins for data ingestion, UI customization, and processing. These plugins allow departments to integrate diverse data types, from images to raw measurements, and support languages like Julia and C alongside Java.
The platform, running on-premise to protect sensitive data, consolidates previously disparate applications, improving efficiency and user experience. Johan highlighted a machine learning use case where the platform increased defect detection from 70% to 92% without slowing production, showcasing Java’s role in handling complex computations.
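To make the shape of such a Spark job concrete, here is a small illustrative Java sketch (not ASML code; the dataset path, column names, and threshold are invented for the example). It aggregates per-wafer measurements and flags wafers whose mean overlay error exceeds a limit:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.avg;
import static org.apache.spark.sql.functions.col;

public class WaferMetrologyJob {
    public static void main(String[] args) {
        // In a plugin-based platform the session would typically be provided by the framework.
        SparkSession spark = SparkSession.builder()
                .appName("wafer-metrology-sketch")
                .getOrCreate();

        // Hypothetical Parquet dataset of per-wafer alignment measurements.
        Dataset<Row> measurements = spark.read().parquet("/data/wafer-alignment.parquet");

        // Mean overlay error per wafer; flag wafers above a made-up 2.0 nm threshold.
        Dataset<Row> flagged = measurements
                .groupBy(col("waferId"))
                .agg(avg(col("overlayErrorNm")).alias("meanOverlayNm"))
                .filter(col("meanOverlayNm").gt(2.0));

        flagged.show();
        spark.stop();
    }
}
```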
Challenges and Solutions in Chip Manufacturing
Johan discussed challenges like layer misalignment, which can cause short circuits or defective chips. The platform addresses these by analyzing wafer plots to identify correctable errors, such as adjusting subsequent layers to compensate for misalignments. Non-correctable errors may result in downgrading chips (e.g., from 16 GB to 8 GB RAM), ensuring minimal waste.
He emphasized a pragmatic approach to tool selection, starting with REST endpoints and gradually adopting Kafka for streaming data as needs evolved. Johan also noted ASML’s collaboration with tool maintainers to enhance compatibility, such as improving Spark’s progress tracking for customer feedback.
Future of Chip Manufacturing
Looking ahead, Johan highlighted the industry’s push to diversify chip production beyond Taiwan, driven by geopolitical and economic factors. However, building new factories, or “fabs,” costing $10–20 billion, faces challenges like equipment backlogs and the need for highly skilled operators. ASML’s customer support teams, working alongside clients like Intel, underscore the specialized knowledge required.
Johan concluded by stressing the importance of a forward-looking mindset, with ASML’s roadmap prioritizing innovation over rigid methodologies. This approach, combined with Java’s robustness, ensures the platform’s scalability and adaptability in a rapidly evolving industry.
Links:
[DevoxxBE2023] Moving Java Forward Together: Community Power
Sharat Chander, Oracle’s Senior Director of Java Developer Engagement, delivered a compelling session at DevoxxBE2023, emphasizing the Java community’s pivotal role in driving the language’s evolution. With over 25 years in the IT industry, Sharat’s passion for Java and community engagement shone through as he outlined how developers can contribute to Java’s future, ensuring its relevance for decades to come.
The Legacy and Longevity of Java
Sharat began by reflecting on Java’s 28-year journey, a testament to its enduring impact on software development. He engaged the audience with a poll, revealing the diverse experience levels among attendees, from those using Java for five years to veterans with over 25 years of expertise. This diversity underscores Java’s broad adoption across industries, from small startups to large enterprises.
Java’s success, Sharat argued, stems from its thoughtful innovation strategy. Unlike the “move fast and break things” mantra, the Java team prioritizes stability and backward compatibility, ensuring that applications built on older versions remain functional. Projects like Amber and Panama, along with the introduction of virtual threads from Project Loom in Java 21, exemplify this incremental yet impactful approach to innovation.
Balancing Stability and Progress
Sharat addressed the tension between rapid innovation and maintaining stability, a challenge given Java’s extensive history. He highlighted the six-month release cadence, introduced to shorten the delay between an innovation and its availability, allowing developers to adopt new features without waiting years for a major release. This approach, likened to a train arriving every three minutes, minimizes disruption and enhances accessibility.
The Java team’s commitment to trust, innovation, and predictability guides its development process. Sharat emphasized that Java’s design principles—established 28 years ago—continue to shape its evolution, ensuring it meets the needs of diverse applications, from AI and big data to emerging fields like quantum computing.
Community as the Heart of Java
The core of Sharat’s message was the community’s role in Java’s vitality. He debunked the “build it and they will come” myth, stressing that Java’s success relies on active community participation. Programs like the OpenJDK project invite developers to engage with mailing lists, review code check-ins, and contribute to technical decisions, fostering transparency and collaboration.
Sharat also highlighted foundational programs like the Java Community Process (JCP) and Java Champions, who advocate for Java independently, providing critical feedback to the Java team. He encouraged attendees to join Java User Groups (JUGs), noting the nearly 400 groups worldwide as vital hubs for knowledge sharing and networking.
Digital Engagement and Future Initiatives
Recognizing the digital era’s impact, Sharat discussed Oracle’s efforts to reach Java’s 10 million developers through platforms like dev.java. This portal aggregates learning resources, community content, and programs like JEP Café and Sip of Java, which offer digestible insights into Java’s features. The recently launched Java Playground provides a browser-based environment for experimenting with code snippets, accelerating feature adoption.
Sharat also announced the community contributions initiative on dev.java, featuring content from Java Champions like Venkat Subramaniam and Hannes Kutz. This platform aims to showcase community expertise, encouraging developers to submit their best practices via GitHub pull requests.
Nurturing Diversity and Inclusion
A poignant moment in Sharat’s talk was his call for greater gender diversity in the Java community. He acknowledged the industry’s shortcomings in achieving balanced representation and urged collective action to expand the community’s mindshare. Programs like JDuchess aim to create inclusive spaces, ensuring Java’s evolution benefits from diverse perspectives.
Links:
[DevoxxBE2023] Making Your @Beans Intelligent: Spring AI Innovations
At DevoxxBE2023, Dr. Mark Pollack delivered an insightful presentation on integrating artificial intelligence into Java applications using Spring AI, a project inspired by advancements in AI frameworks like LangChain and LlamaIndex. Mark, a seasoned Spring developer since 2003 and leader of the Spring Data project, explored how Java developers can harness pre-trained AI models to create intelligent applications that address real-world challenges. His talk introduced the audience to Spring AI’s capabilities, from simple “Hello World” examples to sophisticated use cases like question-and-answer systems over custom documents.
The Genesis of Spring AI
Mark began by sharing his journey into AI, sparked by the transformative impact of ChatGPT. Unlike traditional AI development, which often required extensive data cleaning and model training, pre-trained models like those from OpenAI offer accessible APIs and vast knowledge bases, enabling developers to focus on application engineering rather than data science. Mark highlighted how Spring AI emerged from his exploration of code generation, leveraging the structured nature of code within these models to create a framework tailored for Java developers. This framework abstracts the complexity of AI model interactions, making it easier to integrate AI into Spring-based applications.
Spring AI draws inspiration from Python’s AI ecosystem but adapts these concepts to Java’s idioms, emphasizing component abstractions and pluggability. Mark emphasized that this is not a direct port but a reimagination, aligning with the Spring ecosystem’s strengths in enterprise integration and batch processing. This approach positions Spring AI as a bridge between Java’s robust software engineering practices and the dynamic world of AI.
Core Components of AI Applications
A significant portion of Mark’s presentation focused on the architecture of AI applications, which extends beyond merely calling a model. He introduced a conceptual framework involving contextual data, AI frameworks, and models. Contextual data, akin to ETL (Extract, Transform, Load) processes, involves parsing and transforming data—such as PDFs—into embeddings stored in vector databases. These embeddings enable efficient similarity searches, crucial for use cases like question-and-answer systems.
Mark demonstrated a simple AI client in Spring AI, which abstracts interactions with various AI models, including OpenAI, Hugging Face, Amazon Bedrock, and Google Vertex. This portability allows developers to switch models without significant code changes. He also showcased the Spring CLI, a tool inspired by JavaScript’s Create React App, which simplifies project setup by generating starter code from existing repositories.
Prompt Engineering and Its Importance
Prompt engineering emerged as a critical theme in Mark’s talk. He explained that crafting effective prompts is essential for directing AI models to produce desired outputs, such as JSON-formatted responses or specific styles of answers. Spring AI’s PromptTemplate class facilitates this by allowing developers to create reusable, stateful templates with placeholders for dynamic content. Mark illustrated this with a demo where a prompt template generated a joke about a raccoon, highlighting the importance of roles (system and user) in defining the context and tone of AI responses.
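As a rough sketch of that template idea, a reusable prompt with a placeholder might be built roughly as follows. Spring AI’s packages and method signatures have shifted between early milestones, so the imports and the create(Map) call below are assumptions rather than the exact code shown on stage:

```java
import java.util.Map;

import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.chat.prompt.PromptTemplate;

public class JokePrompt {
    public static Prompt build(String animal) {
        // Reusable template with a placeholder; the model fills in the completion later.
        PromptTemplate template = new PromptTemplate(
                "Tell me a short, family-friendly joke about a {animal}.");
        // Render the template into a concrete prompt for the chat model.
        return template.create(Map.of("animal", animal));
    }
}
```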
He also touched on the concept of “dogfooding,” where AI models are used to refine prompts, creating a feedback loop that enhances their effectiveness. This iterative process, combined with evaluation techniques, ensures that applications deliver accurate and relevant responses, addressing challenges like model hallucinations—where AI generates plausible but incorrect information.
Retrieval Augmented Generation (RAG)
Mark introduced Retrieval Augmented Generation (RAG), a technique to overcome the limitations of AI models’ context windows, which restrict the amount of data they can process. RAG involves pre-processing data into smaller fragments, converting them into embeddings, and storing them in vector databases for similarity searches. This approach allows developers to provide only relevant data to the model, improving efficiency and accuracy.
In a demo, Mark showcased RAG with a bicycle shop dataset, where a question about city-commuting bikes retrieved relevant product descriptions from a vector store. This process mirrors traditional search engines but leverages AI to synthesize answers, demonstrating how Spring AI integrates with vector databases like Milvus and PostgreSQL to handle complex queries.
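The retrieve-then-generate flow itself is simple to sketch. The interfaces below are hypothetical stand-ins for an embedding model, a vector store, and a chat model, used only to illustrate the steps described above:

```java
import java.util.List;

// Hypothetical interfaces standing in for an embedding model, a vector store, and a chat model.
interface Embedder { float[] embed(String text); }
interface VectorIndex { List<String> nearest(float[] queryVector, int k); }
interface ChatModel { String complete(String prompt); }

public class RagSketch {
    private final Embedder embedder;
    private final VectorIndex index;
    private final ChatModel chat;

    public RagSketch(Embedder embedder, VectorIndex index, ChatModel chat) {
        this.embedder = embedder;
        this.index = index;
        this.chat = chat;
    }

    public String answer(String question) {
        // 1. Embed the question and fetch the most similar document fragments.
        List<String> fragments = index.nearest(embedder.embed(question), 4);

        // 2. Stuff only the relevant fragments into the prompt, staying inside the context window.
        String prompt = "Answer using only the documents below.\n\n"
                + String.join("\n---\n", fragments)
                + "\n\nQuestion: " + question;

        // 3. Let the model synthesize the final answer.
        return chat.complete(prompt);
    }
}
```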
Real-World Applications and Future Directions
Mark highlighted practical applications of Spring AI, such as enabling question-and-answer systems for financial documents, medical records, or government programs like Medicaid. These use cases illustrate AI’s potential to make complex information more accessible, particularly for non-technical users. He also discussed the importance of evaluation in AI development, advocating for automated scoring mechanisms to assess response quality beyond simple test passing.
Looking forward, Mark outlined Spring AI’s roadmap, emphasizing robust core abstractions and support for a growing number of models and vector databases. He encouraged developers to explore the project’s GitHub repository and participate in its evolution, underscoring the rapid pace of AI advancements and the need for community involvement.
Links:
[DevoxxBE2023] Securing the Supply Chain for Your Java Applications by Thomas Vitale
At Devoxx Belgium 2023, Thomas Vitale, a software engineer and architect at Systematic, delivered an authoritative session on securing the software supply chain for Java applications. As the author of Cloud Native Spring in Action and a passionate advocate for cloud-native technologies, Thomas provided a comprehensive exploration of securing every stage of the software lifecycle, from source code to deployment. Drawing on the SLSA framework and CNCF research, he demonstrated practical techniques for ensuring integrity, authenticity, and resilience using open-source tools like Gradle, Sigstore, and Kyverno. Through a blend of theoretical insights and live demonstrations, Thomas illuminated the critical importance of supply chain security in today’s threat landscape.
Safeguarding Source Code with Git Signatures
Thomas began by defining the software supply chain as the end-to-end process of delivering software, encompassing code, dependencies, tools, practices, and people. He emphasized the risks at each stage, starting with source code. Using Git as an example, Thomas highlighted its audit trail capabilities but cautioned that commit authorship can be manipulated. In a live demo, he showed how he could impersonate a colleague by altering Git’s username and email, underscoring the need for signed commits. By enforcing signed commits with GPG or SSH keys—or preferably a keyless approach via GitHub’s single sign-on—developers can ensure commit authenticity, establishing a verifiable provenance trail critical for supply chain security.
Managing Dependencies with Software Bills of Materials (SBOMs)
Moving to dependencies, Thomas stressed the importance of knowing exactly what libraries are included in a project, especially given vulnerabilities like Log4j. He introduced Software Bills of Materials (SBOMs) as a standardized inventory of software components, akin to a list of ingredients. Using the CycloneDX plugin for Gradle, Thomas demonstrated generating an SBOM during the build process, which provides precise dependency details, including versions, licenses, and hashes for integrity verification. This approach, integrated into Maven or Gradle, ensures accuracy over post-build scanning tools like Snyk, enabling developers to identify vulnerabilities, check license compliance, and verify component integrity before production.
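As a toy illustration of the integrity check those SBOM hashes enable (not part of Thomas’s demo; the path and expected value are placeholders), a verification step could recompute a dependency’s SHA-256 and compare it against the recorded entry:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

public class VerifyArtifactHash {
    public static void main(String[] args) throws Exception {
        // Hypothetical jar and the hash recorded for it in the SBOM component entry.
        Path artifact = Path.of("libs/snakeyaml-1.33.jar");
        String expectedSha256 = "<sha-256 value from the SBOM component entry>";

        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        String actual = HexFormat.of().formatHex(digest.digest(Files.readAllBytes(artifact)));

        if (!actual.equalsIgnoreCase(expectedSha256)) {
            throw new IllegalStateException("Integrity check failed for " + artifact);
        }
        System.out.println("Hash matches SBOM entry: " + actual);
    }
}
```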
Thomas further showcased Dependency-Track, an OWASP project, to analyze SBOMs and flag vulnerabilities, such as a critical issue in SnakeYAML. He introduced the Vulnerability Exploitability Exchange (VEX) standard, which complements SBOMs by documenting whether vulnerabilities affect an application. In his demo, Thomas marked a SnakeYAML vulnerability as a false positive due to Spring Boot’s safe deserialization, demonstrating how VEX communicates security decisions to stakeholders, reducing unnecessary alerts and ensuring compliance with emerging regulations.
Building Secure Artifacts with Reproducible Builds
The build phase, Thomas explained, is another critical juncture for security. Using Spring Boot as an example, he outlined three packaging methods: JAR files, native executables, and container images. He critiqued Dockerfiles for introducing non-determinism and maintenance overhead, advocating for Cloud Native Buildpacks as a reproducible, secure alternative. In a demo, Thomas built a container image with Buildpacks, highlighting its fixed creation timestamp (January 1, 1980) to ensure identical outputs for unchanged inputs, enhancing security by eliminating variability. This reproducibility, coupled with SBOM generation during the build, ensures artifacts are both secure and traceable.
Signing and Verifying Artifacts with SLSA
To ensure artifact integrity, Thomas introduced the SLSA framework, which provides guidelines for securing software artifacts across the supply chain. He demonstrated signing container images with Sigstore’s Cosign tool, using a keyless approach to avoid managing private keys. This process, integrated into a GitHub Actions pipeline, ensures that artifacts are authentically linked to their creator. Thomas further showcased SLSA’s provenance generation, which documents the artifact’s origin, including the Git commit hash and build steps. By achieving SLSA Level 3, his pipeline provided non-falsifiable provenance, ensuring traceability from source code to deployment.
Securing Deployments with Policy Enforcement
The final stage, deployment, requires validating artifacts to ensure they meet security standards. Thomas demonstrated using Cosign and the SLSA Verifier to validate signatures and provenance, ensuring only trusted artifacts are deployed. On Kubernetes, he introduced Kyverno, a policy engine that enforces signature and provenance checks, automatically rejecting non-compliant deployments. This approach ensures that production environments remain secure, aligning with the principle of validating metadata to prevent unauthorized or tampered artifacts from running.
Conclusion: A Holistic Approach to Supply Chain Security
Thomas’s session at Devoxx Belgium 2023 provided a robust framework for securing Java application supply chains. By addressing source code integrity, dependency management, build reproducibility, artifact signing, and deployment validation, he offered a comprehensive strategy to mitigate risks. His practical demonstrations, grounded in open-source tools and standards like SLSA and VEX, empowered developers to adopt these practices without overwhelming complexity. Thomas’s emphasis on asking “why” at each step encouraged attendees to tailor security measures to their context, ensuring both compliance and resilience in an increasingly regulated landscape.
Links:
[DevoxxBE2023] REST Next Level: Crafting Domain-Driven Web APIs by Julien Topçu
At Devoxx Belgium 2023, Julien Topçu, a technical coach at Shodo, delivered a compelling session on elevating REST APIs by embedding domain-driven design principles. With a rich background in crafting software using Domain-Driven Design (DDD), Extreme Programming, and Kanban, Julien illuminated the pitfalls of traditional REST implementations and proposed a transformative approach to encapsulate business intent within APIs. His talk, centered around a fictional space travel booking system, demonstrated how to align APIs with user actions, preserve business workflows, and enhance consumer experience through hypermedia controls. Through a blend of theoretical insights and practical demonstrations, Julien showcased a methodology to create APIs that are not only functional but also semantically rich and workflow-driven.
The Pitfalls of Traditional REST APIs
Julien began by highlighting a pervasive issue in software architecture: the loss of business intent when translating domain logic into REST APIs. Typically, business logic resides in the backend to avoid duplication across consumers like web or mobile applications. However, REST’s uniform interface, with its limited vocabulary of CRUD operations (Create, Read, Update, Delete), often distorts this logic. For instance, in a train reservation system, a user’s intent to “search for trains” is reduced to “create a search resource,” stripping away domain-specific semantics like destinations or schedules. This mismatch, Julien argued, stems from REST’s standardized approach, formalized by Roy Fielding in his PhD thesis, which prioritizes simplicity over application-specific needs. As a result, APIs lose expressiveness, forcing consumers to reconstruct business workflows, leading to what Julien termed “accidental complexity of adaptation.”
To illustrate, Julien presented a scenario where a user performs a search for space trains from Earth to the Moon. The traditional REST API translates this into a POST request to create a search resource, devoid of domain context. This not only obscures the user’s intent but also couples consumers to the backend’s implementation, making changes—like switching from “bound” to “journey index” for multi-destination trips—disruptive. Julien’s live demo underscored this fragility: altering a request parameter broke the API, highlighting the risks of tight coupling between consumers and backend models.
Encapsulating Business Intent with Semantic Endpoints
To address these shortcomings, Julien proposed aligning REST endpoints with user actions rather than backend models. Instead of exposing implementation details, such as updating a sub-resource like “selection” within a search, APIs should reflect behaviors like “select a space train with a fare.” This approach involves using classifiers in URLs, such as POST /searches/{id}/spacetrains/{number}/fares/{code}/select, which clearly convey the intent of selecting a fare for a specific train. Julien emphasized that this does not violate REST principles, debunking the myth that verbs in URLs are forbidden. As long as verbs align with HTTP methods (e.g., POST for creating a resource), they enhance semantic clarity without breaking the uniform interface.
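A minimal Java sketch of such a behavior-oriented endpoint in a Spring controller (Julien’s demo used Kotlin; the controller and service names here are purely illustrative):

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SelectFareController {

    // The URL names the user's action ("select this fare"), not a backend sub-resource.
    @PostMapping("/searches/{searchId}/spacetrains/{number}/fares/{code}/select")
    public ResponseEntity<Void> selectFare(@PathVariable String searchId,
                                           @PathVariable String number,
                                           @PathVariable String code) {
        // Delegate to the domain; the consumer never learns how the selection is stored.
        // selectionService.select(searchId, number, code);  // hypothetical domain service
        return ResponseEntity.noContent().build();
    }
}
```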
This shift decouples consumers from the backend’s internal structure. For example, changing the backend’s data model (e.g., using booleans instead of a selection object) no longer impacts consumers, as the API exposes behaviors rather than state. Julien’s demo further showcased this by demonstrating how a frontend could adapt to backend changes (e.g., from “bound” to “journey index”) without modification, thanks to semantic endpoints. This approach not only preserves business intent but also simplifies consumer logic, reducing the cognitive load of interpreting CRUD-based APIs.
Encapsulating Workflows with Hypermedia Controls
A critical challenge Julien addressed is the lack of workflow definition in traditional REST APIs. Typically, consumers must hardcode business workflows, such as the sequence of selecting outbound and inbound trains before booking. This leads to duplicated logic and potential errors, like displaying a booking button prematurely. Julien introduced hypermedia controls, specifically HATEOAS (Hypermedia As The Engine Of Application State), as a solution. By embedding links in API responses, the backend can guide consumers through the workflow dynamically.
In his demo, Julien showed how a search response includes links like select-outbound and all-inbounds, which guide the consumer to the next valid actions. For instance, after selecting an outbound train, the response provides a link to select an inbound train, ensuring only compatible options are available. This encapsulation of workflow logic in the backend eliminates the need for consumers to understand the sequence of actions, reducing errors and enhancing maintainability. Julien highlighted that this approach, part of the Richardson Maturity Model’s Level 3, makes APIs discoverable and resilient to backend changes, as consumers rely on links rather than hardcoded URLs.
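A rough Java sketch of such state-dependent link assembly with Spring HATEOAS (the demo was in Kotlin; the resource shape, link relations, and URLs below are invented for illustration):

```java
import org.springframework.hateoas.EntityModel;
import org.springframework.hateoas.Link;

public class SearchModelAssembler {

    // Hypothetical representation of the demo's search resource.
    public record SearchResult(String id, boolean outboundSelected, boolean inboundSelected) {}

    // Wrap the search result and expose only the transitions that are valid right now.
    public EntityModel<SearchResult> toModel(SearchResult search) {
        EntityModel<SearchResult> model = EntityModel.of(search);
        if (!search.outboundSelected()) {
            model.add(Link.of("/searches/" + search.id() + "/outbounds", "all-outbounds"));
        } else if (!search.inboundSelected()) {
            model.add(Link.of("/searches/" + search.id() + "/inbounds", "all-inbounds"));
        } else {
            // Booking only becomes reachable once the selection is complete.
            model.add(Link.of("/searches/" + search.id() + "/bookings", "create-booking"));
        }
        return model;
    }
}
```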
Practical Implementation and Limitations
Julien’s live coding demo brought these concepts to life, showcasing a Spring Boot backend in Kotlin that dynamically generates links based on the application state. For example, the create-booking link only appears when the selection is complete, ensuring consumers cannot book prematurely. This dynamic guidance, facilitated by Spring HATEOAS, allows the frontend to display UI elements like the booking button based solely on available links, streamlining development and enhancing user experience.
However, Julien acknowledged limitations. For complex forms requiring extensive user input, the hypermedia approach may need supplementation with predefined payloads, as consumers must know what data to send. Additionally, long URLs, while not a practical issue in Julien’s experience at Expedia, could pose challenges in some contexts. Despite these constraints, the approach excels in domains with well-defined workflows, offering a robust framework for building expressive, maintainable APIs.
Conclusion: A New Paradigm for REST APIs
Julien’s session at Devoxx Belgium 2023 offered a transformative vision for REST APIs, emphasizing the power of domain-driven design and hypermedia controls. By aligning endpoints with user actions, encapsulating behaviors, and guiding workflows through links, developers can create APIs that are both semantically rich and resilient to change. This approach not only enhances consumer experience but also aligns with the principles of DDD, ensuring that business intent remains at the forefront of API design. Julien’s practical insights and engaging demo left attendees inspired to rethink their API strategies, fostering a deeper appreciation for REST’s potential when infused with domain-driven principles.
Links:
[DevoxxBE2023] The Panama Dojo: Black Belt Programming with Java 21 and the FFM API by Per Minborg
In an engaging session at Devoxx Belgium 2023, Per Minborg, a Java Core Library team member at Oracle and an OpenJDK contributor, guided attendees through the intricacies of the Foreign Function and Memory (FFM) API, a pivotal component of Project Panama. With a blend of theoretical insights and live coding, Per demonstrated how this API, in its third preview in Java 21, enables seamless interaction with native memory and functions using pure Java code. His talk, dubbed the “Panama Dojo,” showcased the API’s potential to enhance performance and safety, culminating in a hands-on demo of a lightweight microservice framework built with memory segments, arenas, and memory layouts.
Unveiling the FFM API’s Capabilities
Per introduced the FFM API as a solution to the limitations of Java Native Interface (JNI) and direct buffers. Unlike JNI, which requires cumbersome C stubs and inefficient data passing, the FFM API allows direct native memory access and function calls. Per illustrated this with a Point struct example, where a memory segment models a contiguous memory region with 64-bit addressing, supporting both heap and native segments. This eliminates the 2GB limit of direct buffers, offering greater flexibility and efficiency.
The API introduces memory segments with constraints like size, lifetime, and thread confinement, preventing out-of-bounds access and use-after-free errors. Per highlighted the importance of deterministic deallocation, contrasting Java’s automatic memory management with C’s manual approach. The FFM API’s arenas, such as confined and shared arenas, manage segment lifecycles, ensuring resources are freed explicitly, as demonstrated in a try-with-resources block that deterministically deallocates a segment.
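A minimal sketch of that pattern against the Java 21 preview API (run with --enable-preview); the two-double layout mirrors the talk’s Point example:

```java
import java.lang.foreign.Arena;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;

public class ArenaSketch {
    public static void main(String[] args) {
        // Confined arena: the segment is only accessible from this thread and is freed
        // deterministically when the try-with-resources block exits.
        try (Arena arena = Arena.ofConfined()) {
            MemorySegment point = arena.allocate(2 * ValueLayout.JAVA_DOUBLE.byteSize());
            point.set(ValueLayout.JAVA_DOUBLE, 0, 3.0);                                  // x
            point.set(ValueLayout.JAVA_DOUBLE, ValueLayout.JAVA_DOUBLE.byteSize(), 4.0); // y
            System.out.println(point.get(ValueLayout.JAVA_DOUBLE, 0));
        } // native memory released here
    }
}
```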
Structuring Memory with Layouts and Arenas
Memory layouts, a key FFM API feature, provide a declarative way to define memory structures, reducing manual offset computations. Per showed how a Point layout with x and y doubles uses var handles to access fields safely, leveraging JIT optimizations for atomic operations. This approach minimizes bugs in complex structs, as var handles inherently account for offsets, unlike manual calculations.
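A hedged sketch of that declarative style against the Java 21 preview API (the finalized API in later JDKs adds an explicit base-offset coordinate to layout var handles, so this shape is specific to the preview):

```java
import java.lang.foreign.Arena;
import java.lang.foreign.MemoryLayout;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;
import java.lang.invoke.VarHandle;

public class PointLayoutSketch {
    // Declarative layout: offsets for x and y are derived, never computed by hand.
    private static final MemoryLayout POINT = MemoryLayout.structLayout(
            ValueLayout.JAVA_DOUBLE.withName("x"),
            ValueLayout.JAVA_DOUBLE.withName("y"));

    private static final VarHandle X = POINT.varHandle(MemoryLayout.PathElement.groupElement("x"));
    private static final VarHandle Y = POINT.varHandle(MemoryLayout.PathElement.groupElement("y"));

    public static void main(String[] args) {
        try (Arena arena = Arena.ofConfined()) {
            MemorySegment point = arena.allocate(POINT);
            X.set(point, 1.0);
            Y.set(point, 2.0);
            System.out.println(X.get(point) + ", " + Y.get(point));
        }
    }
}
```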
Arenas further enhance safety by grouping segments with shared lifetimes. Per demonstrated a confined arena, restricting access to a single thread, and a shared arena, allowing multi-threaded access with thread-local handshakes for safe closure. These constructs bridge the gap between C’s flexibility and Rust’s safety, offering a balanced model for Java developers. In his live demo, Per used an arena to allocate a MarketInfo segment, showcasing deterministic deallocation and thread safety.
Building a Persistent Queue with Memory Mapping
The heart of Per’s session was a live coding demo constructing a persistent queue using memory mapping and atomic operations. He defined a MarketInfo record for stock exchange data, including timestamp, symbol, and price fields. Using a record mapper, Per serialized and deserialized records to and from memory segments, demonstrating immutability and thread safety. The mapper, a potential future JDK feature, simplifies data transfer between Java objects and native memory.
Per then implemented a memory-mapped queue, where a file-backed segment stores headers and payloads. Headers use atomic operations to manage mutual exclusion across threads and JVMs, ensuring safe concurrent access. In the demo, a producer appended MarketInfo records to the queue, while two consumers read them asynchronously, showcasing low-latency, high-performance data sharing. Per’s use of sparse files allowed a 1MB queue to scale virtually, highlighting the API’s efficiency.
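A deliberately simplified sketch of the mapping step, assuming the Java 21 preview FileChannel.map overload that returns a MemorySegment (run with --enable-preview). It is not Per’s queue: the real implementation uses atomic var-handle operations on the header for cross-JVM mutual exclusion, which this toy omits in favour of plain reads and writes:

```java
import java.io.IOException;
import java.lang.foreign.Arena;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedQueueSketch {
    public static void main(String[] args) throws IOException {
        Path file = Path.of("queue.dat");
        try (FileChannel channel = FileChannel.open(file,
                StandardOpenOption.CREATE, StandardOpenOption.READ, StandardOpenOption.WRITE);
             Arena arena = Arena.ofShared()) {

            // Map 1 MB of the file as a shared segment; the OS backs it lazily (sparse file).
            MemorySegment queue = channel.map(FileChannel.MapMode.READ_WRITE, 0, 1 << 20, arena);

            // A long header at offset 0 tracks the next write position after the header.
            long writeIndex = queue.get(ValueLayout.JAVA_LONG, 0);
            queue.set(ValueLayout.JAVA_LONG, Long.BYTES + writeIndex, System.nanoTime()); // payload
            queue.set(ValueLayout.JAVA_LONG, 0, writeIndex + Long.BYTES);                 // advance header
        }
    }
}
```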
Crafting a Microservice Framework
The session culminated in assembling these components into a microservice framework. Per’s queue, inspired by Chronicle Queue, supports persistent, high-performance data exchange across JVMs. The framework leverages memory mapping for durability, atomic operations for concurrency, and record mappers for clean data modeling. Per demonstrated its practical application by persisting a queue to a file and reading it in a separate JVM, underscoring its robustness for distributed systems.
He emphasized the reusability of these patterns across domains like machine learning and graphics processing, where native libraries are prevalent. Tools like jextract, briefly mentioned, further unlock native libraries like TensorFlow, enabling Java developers to integrate them effortlessly. Per’s framework, though minimal, illustrates how the FFM API can transform Java’s interaction with native code, offering a safer, faster alternative to JNI.
Performance and Safety in Harmony
Throughout, Per stressed the FFM API’s dual focus on performance and safety. Native function calls, faster than JNI, and memory segments with strict constraints outperform direct buffers while preventing common errors. The API’s integration with existing JDK features, like var handles, ensures compatibility and optimization. Per’s live coding, despite its complexity, flowed seamlessly, reinforcing the API’s practicality for real-world applications.
Conclusion: Embracing the Panama Dojo
Per’s session was a masterclass in leveraging the FFM API to push Java’s boundaries. By combining memory segments, layouts, arenas, and atomic operations, he crafted a framework that exemplifies the API’s potential. His call to action—experiment with the FFM API in Java 21—invites developers to explore this transformative tool, promising enhanced performance and safety for native interactions. The Panama Dojo left attendees inspired to break new ground in Java development.
Links:
[DevoxxBE2023] Java Language Update by Brian Goetz
At Devoxx Belgium 2023, Brian Goetz, Oracle’s Java Language Architect, delivered an insightful session on the evolution of Java, weaving together a narrative of recent advancements, current features in preview, and a vision for the language’s future. With his deep expertise, Brian illuminated how Java balances innovation with compatibility, ensuring it remains a cornerstone of modern software development. His talk explored the introduction of records, sealed classes, pattern matching, and emerging features like string templates and simplified program structures, all designed to enhance Java’s expressiveness and accessibility. Through a blend of technical depth and practical examples, Brian showcased Java’s commitment to readable, maintainable code while addressing contemporary programming challenges.
Reflecting on Java’s Recent Evolution
Brian began by recapping Java’s significant strides since his last Devoxx appearance, highlighting features like records, sealed classes, and pattern matching. Records, introduced as nominal tuples, provide a concise way to model data with named components, enhancing readability over structural tuples. For instance, a Point record with x and y coordinates is more intuitive than an anonymous tuple of integers. By deriving constructors, accessors, and equality methods from a state declaration, records eliminate boilerplate while making a clear semantic statement about data immutability. Brian emphasized that this semantic focus, rather than mere syntax reduction, distinguishes Java’s approach from alternatives like Lombok.
Sealed classes, another recent addition, allow developers to restrict class hierarchies, specifying permitted subtypes explicitly. This enables libraries to expose abstract types while controlling implementations, as seen in the JDK’s use of method handles. Sealed classes also enhance exhaustiveness checking in switch statements, reducing runtime errors by ensuring all cases are covered. Brian illustrated this with a Shape hierarchy, where a sealed interface permits only Circle and Rectangle, allowing the compiler to verify switch completeness without a default clause.
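A small sketch along the lines of that Shape example, valid on Java 21 (the exact record components are illustrative, not Brian’s slides):

```java
// Records and a sealed hierarchy, in the spirit of the Shape example from the talk.
sealed interface Shape permits Circle, Rectangle {}

record Point(double x, double y) {}
record Circle(Point center, double radius) implements Shape {}
record Rectangle(Point topLeft, Point bottomRight) implements Shape {}

class Areas {
    // Because Shape is sealed, the switch is exhaustive without a default clause;
    // adding a new permitted subtype turns this into a compile-time error.
    static double area(Shape shape) {
        return switch (shape) {
            case Circle c -> Math.PI * c.radius() * c.radius();
            case Rectangle r -> Math.abs((r.bottomRight().x() - r.topLeft().x())
                    * (r.bottomRight().y() - r.topLeft().y()));
        };
    }
}
```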
Advancing Data Modeling with Pattern Matching
Pattern matching, a cornerstone of Java’s recent enhancements, fuses type testing, casting, and binding into a single operation, reducing errors from manual casts. Brian demonstrated how type patterns, like if (obj instanceof String s), streamline code by eliminating redundant casts. Record patterns extend this by deconstructing objects into components, enabling recursive matching for nested structures. For example, a Circle record with a Point center can be matched to extract x and y coordinates in one expression, enhancing both concision and safety.
The revamped switch construct, now an expression supporting patterns and guards, further leverages these capabilities. Brian highlighted its exhaustiveness checking, which uses sealing information to ensure all cases are handled, as in a Color interface sealed to Red, Yellow, and Green. This eliminates the need for default clauses, catching errors at compile time if the hierarchy evolves. By combining records, sealed classes, and pattern matching, Java now supports algebraic data types, offering a powerful framework for modeling complex domains like expressions, where a sealed Expression type can be traversed elegantly with pattern-based recursion.
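A compact sketch of that algebraic style, using record patterns and a sealed Expression type (illustrative shapes, valid on Java 21):

```java
sealed interface Expression permits Constant, Sum {}
record Constant(double value) implements Expression {}
record Sum(Expression left, Expression right) implements Expression {}

class Evaluator {
    // Record patterns deconstruct nested structures directly in the switch,
    // and sealing keeps the recursive traversal exhaustive.
    static double eval(Expression e) {
        return switch (e) {
            case Constant(double value) -> value;
            case Sum(Expression left, Expression right) -> eval(left) + eval(right);
        };
    }
}
```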
Introducing String Templates for Safe Aggregation
Looking to the future, Brian introduced string templates, a preview feature addressing the perils of string interpolation. Unlike traditional concatenation or formatting methods, string templates use a template processor to safely combine text fragments and expressions. A syntax like STR."Hello, \{name}!" invokes a processor to validate inputs, preventing issues like SQL injection. Brian envisioned a SQL template processor that balances quotes and produces a result set directly, bypassing string intermediaries for efficiency and security. Similarly, a JSON processor could streamline API development by constructing objects from raw fragments, enhancing performance.
This approach reframes interpolation as a broader aggregation problem, allowing developers to define custom processors for domain-specific needs. Brian’s emphasis on safety and flexibility underscores Java’s commitment to robust APIs, drawing inspiration from JavaScript’s tagged functions and Scala’s string interpolators, but tailored to Java’s ecosystem.
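A minimal example of the preview syntax as it appeared in Java 21 (compiled and run with --enable-preview):

```java
// String templates were a preview feature in Java 21; the STR processor is implicitly imported.
class Greeting {
    static String greet(String name) {
        // The processor validates and interpolates the embedded expression.
        return STR."Hello, \{name}!";
    }
}
```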
Simplifying Java’s On-Ramp and Beyond
To make Java more approachable for new developers, Brian discussed preview features like unnamed classes and instance main methods, which reduce boilerplate for simple programs. A minimal program might omit public static void main, allowing beginners to focus on core logic rather than complex object-oriented constructs. This aligns Java with languages like Python, where incremental learning is prioritized, easing the educational burden on instructors and students alike.
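Under that preview (JEP 445 in Java 21, run with --enable-preview), an entire beginner program can be just this, with no enclosing class declaration or String[] args:

```java
// An instance main method in an implicitly declared class.
void main() {
    System.out.println("Hello, Devoxx!");
}
```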
Future enhancements include reconstruction patterns for immutable objects, enabling concise updates like p.with(x: 0) to derive new records from existing ones. Brian also proposed deconstructor patterns for regular classes, mirroring constructors to enable pattern decomposition, enhancing API symmetry. These features aim to make aggregation and decomposition reversible, reducing error-prone asymmetries in object manipulation. For instance, a Person class could declare a deconstructor to extract first and last names, mirroring its constructor, streamlining data handling across Java’s object model.
Conclusion: Java’s Balanced Path Forward
Brian’s session underscored Java’s deliberate evolution, balancing innovation with compatibility. By prioritizing readable, maintainable code, Java addresses modern challenges like loosely coupled services and untyped data, positioning itself as a versatile language for data modeling. Features like string templates and simplified program structures promise greater accessibility, while pattern matching and deconstruction patterns enhance expressiveness. As Java continues to refine its features, it remains a testament to thoughtful design, ensuring developers can build robust, future-ready applications.
Links:
[DevoxxBE2023] A Deep Dive into Advanced TypeScript: A Live Coding Expedition by Christian Wörz
Christian Wörz, a seasoned full-stack engineer and freelancer, captivated the Devoxx Belgium 2023 audience with a hands-on exploration of advanced TypeScript features. Through live coding, Christian illuminated powerful yet underutilized constructs like mapped types, template literals, conditional types, and recursion, demonstrating their practical applications in real-world scenarios. His session, blending technical depth with accessibility, empowered developers to leverage TypeScript’s full potential for creating robust, type-safe codebases.
Mastering Mapped Types and Template Literals
Christian kicked off with mapped types, showcasing their ability to dynamically generate type-safe structures. He defined an Events type with add and remove properties and created an OnEvent type to prefix event keys with “on” (e.g., onAdd, onRemove). Using the keyof operator and template literal syntax, he ensured that OnEvent mirrored Events, enforcing consistency. For instance, adding a move event to Events required updating OnEvent to onMove, providing compile-time safety. He enhanced this with TypeScript’s intrinsic Capitalize function to uppercase property names, ensuring precise naming conventions.
Template literals were explored through a chessboard example, where Christian generated all possible positions (e.g., A1, B2) by combining letter and number types. He extended this to a CSS validator, defining a GapCSS type for properties like margin-left and padding-top, paired with valid CSS sizes (e.g., rem, px). This approach narrowed string types to enforce specific formats, preventing errors like invalid CSS values at compile time.
Leveraging Conditional Types and Never for Safety
Christian delved into conditional types and the never type to enhance compile-time safety. He introduced a NoEmptyString type that prevents empty strings from being assigned, using a conditional check to return never for invalid inputs. Applying this to a failOnEmptyString function ensured that only non-empty strings were accepted, catching errors before runtime. He also demonstrated exhaustive switch cases, using never to enforce complete coverage. For a getCountryForLocation function, assigning an unhandled London case to a never-typed variable triggered a compile-time error, ensuring all cases were addressed.
Unraveling Types with Infer and Recursion
The infer keyword was a highlight, enabling Christian to extract type information dynamically. He created a custom MyReturnType to mimic TypeScript’s ReturnType, inferring a function’s return type (e.g., number for an addition function). This was particularly useful for complex type manipulations. Recursion was showcased through an UnnestArray type, unwrapping deeply nested arrays to access their inner types (e.g., extracting string from string[][][]). He also built a recursive Tuple type, generating fixed-length arrays with specified element types, such as a three-element RGB tuple, with an accumulator to collect elements during recursion.
Branded Types for Enhanced Type Safety
Christian concluded with branded types, a technique to distinguish specific string formats, like emails, without runtime overhead. By defining an Email type as a string intersected with an object containing a _brand property, he ensured that only validated strings could be assigned. A type guard function, isValidEmail, checked for an @ symbol, allowing safe usage in functions like sendEmail. This approach maintained the simplicity of primitive types while enforcing strict validation, applicable to formats like UUIDs or custom date strings.
Links:
[DevoxxBE2023] Build a Generative AI App in Project IDX and Firebase by Prakhar Srivastav
At Devoxx Belgium 2023, Prakhar Srivastav, a software engineer at Google, unveiled the power of Project IDX and Firebase in crafting a generative AI mobile application. His session illuminated how developers can harness these tools to streamline full-stack, multiplatform app development directly from the browser, eliminating cumbersome local setups. Through a live demonstration, Prakhar showcased the creation of “Listed,” a Flutter-based app that leverages Google’s PaLM API to break down user-defined goals into actionable subtasks, offering a practical tool for task management. His engaging presentation, enriched with real-time coding, highlighted the synergy of cloud-based development environments and AI-driven solutions.
Introducing Project IDX: A Cloud-Based Development Revolution
Prakhar introduced Project IDX as a transformative cloud-based development environment designed to simplify the creation of multiplatform applications. Unlike traditional setups requiring hefty binaries like Xcode or Android Studio, Project IDX enables developers to work entirely in the browser. Prakhar demonstrated this by running Android and iOS emulators side-by-side within the browser, showcasing a Flutter app that compiles to multiple platforms—Android, iOS, web, Linux, and macOS—from a single codebase. This eliminates the need for platform-specific configurations, making development accessible even on lightweight devices like Chromebooks.
The live demo featured “Listed,” a mobile app where users input a goal, such as preparing for a tech talk, and receive AI-generated subtasks and tips. For instance, entering “give a tech talk at a conference” yielded steps like choosing a relevant topic and practicing the presentation, with a tip to have a backup plan for technical issues. Prakhar’s real-time tweak—changing the app’s color scheme from green to red—illustrated the iterative development flow, where changes are instantly reflected in the emulator, enhancing productivity and experimentation.
Harnessing the PaLM API for Generative AI
Central to the app’s functionality is Google’s PaLM API, which Prakhar utilized to integrate generative AI capabilities. He explained that large language models (LLMs), like those powering the PaLM API, act as sophisticated autocomplete systems, predicting likely text outputs based on extensive training data. For “Listed,” the text API was chosen for its suitability in single-turn interactions, such as generating subtasks from a user’s query. Prakhar emphasized the importance of crafting effective prompts, comparing a vague prompt like “the sky is” to a precise one like “complete the sentence: the sky is,” which yields more relevant results.
To enhance the AI’s output, Prakhar employed few-shot prompting, providing the model with examples of desired responses. For instance, for the query “go camping,” the prompt included sample subtasks like choosing a campsite and packing meals, along with a tip about wildlife safety. This structured approach ensured the model generated contextually accurate and actionable suggestions, making the app intuitive for users tackling complex tasks.
Securing AI Integration with Firebase Extensions
Integrating the PaLM API into a mobile app poses security challenges, particularly around API key exposure. Prakhar addressed this by leveraging Firebase Extensions, which provide pre-packaged solutions to streamline backend integration. Specifically, he used a Firebase Extension to securely call the PaLM API via Cloud Functions, avoiding the need to embed sensitive API keys in the client-side Flutter app. This setup not only enhances security but also simplifies infrastructure management, as the extension handles logging, monitoring, and optional AppCheck for client verification.
In the live demo, Prakhar navigated the Firebase Extensions Marketplace, selecting the “Call PaLM API Securely” extension. With a few clicks, he deployed Cloud Functions that exposed a POST API for sending prompts and receiving AI-generated responses. The code walkthrough revealed a straightforward implementation in Dart, where the app constructs a JSON payload with the prompt, model name (text-bison-001), and temperature (0.25 for deterministic outputs), ensuring seamless and secure communication with the backend.
Building the Flutter App: Simplicity and Collaboration
The Flutter app’s architecture, built within Project IDX, was designed for simplicity and collaboration. Prakhar walked through the main.dart file, which scaffolds the app’s UI with a material-themed interface, an input field for user queries, and a list to display AI-generated tasks. The app uses anonymous Firebase authentication to secure backend calls without requiring user logins, enhancing accessibility. A PromptBuilder class dynamically constructs prompts by combining predefined prefixes and examples, ensuring flexibility in handling varied user inputs.
Project IDX’s integration with Visual Studio Code’s open-source framework added collaborative features. Prakhar demonstrated how developers can invite colleagues to a shared workspace, enabling real-time collaboration. Additionally, the IDE’s AI capabilities allow users to explain selected code or generate new snippets, streamlining development. For instance, selecting the PromptBuilder class and requesting an explanation provided detailed insights into its parameters, showcasing how Project IDX enhances developer productivity.