[DevoxxBE2023] Moving Java Forward Together: Community Power
Sharat Chander, Oracle’s Senior Director of Java Developer Engagement, delivered a compelling session at DevoxxBE2023, emphasizing the Java community’s pivotal role in driving the language’s evolution. With over 25 years in the IT industry, Sharat’s passion for Java and community engagement shone through as he outlined how developers can contribute to Java’s future, ensuring its relevance for decades to come.
The Legacy and Longevity of Java
Sharat began by reflecting on Java’s 28-year journey, a testament to its enduring impact on software development. He engaged the audience with a poll, revealing the diverse experience levels among attendees, from those using Java for five years to veterans with over 25 years of expertise. This diversity underscores Java’s broad adoption across industries, from small startups to large enterprises.
Java’s success, Sharat argued, stems from its thoughtful innovation strategy. Unlike the “move fast and break things” mantra, the Java team prioritizes stability and backward compatibility, ensuring that applications built on older versions remain functional. Projects like Amber and Panama, along with the recent introduction of virtual threads in Java 21, exemplify this incremental yet impactful approach to innovation.
Balancing Stability and Progress
Sharat addressed the tension between rapid innovation and maintaining stability, a challenge given Java’s extensive history. He highlighted the six-month release cadence, introduced to reduce the latency of innovation by letting developers adopt new features without waiting for a major release. This approach, likened to a train arriving every three minutes, minimizes disruption and enhances accessibility.
The Java team’s commitment to trust, innovation, and predictability guides its development process. Sharat emphasized that Java’s design principles—established 28 years ago—continue to shape its evolution, ensuring it meets the needs of diverse applications, from AI and big data to emerging fields like quantum computing.
Community as the Heart of Java
The core of Sharat’s message was the community’s role in Java’s vitality. He debunked the “build it and they will come” myth, stressing that Java’s success relies on active community participation. Programs like the OpenJDK project invite developers to engage with mailing lists, review code check-ins, and contribute to technical decisions, fostering transparency and collaboration.
Sharat also highlighted foundational programs like the Java Community Process (JCP) and Java Champions, who advocate for Java independently, providing critical feedback to the Java team. He encouraged attendees to join Java User Groups (JUGs), noting the nearly 400 groups worldwide as vital hubs for knowledge sharing and networking.
Digital Engagement and Future Initiatives
Recognizing the digital era’s impact, Sharat discussed Oracle’s efforts to reach Java’s 10 million developers through platforms like dev.java. This portal aggregates learning resources, community content, and programs like the JEP Café and Sip of Java, which offer digestible insights into Java’s features. The recently launched Java Playground provides a browser-based environment for experimenting with code snippets, accelerating feature adoption.
Sharat also announced the community contributions initiative on dev.java, featuring content from Java Champions like Venkat Subramaniam and Hannes Kutz. This platform aims to showcase community expertise, encouraging developers to submit their best practices via GitHub pull requests.
Nurturing Diversity and Inclusion
A poignant moment in Sharat’s talk was his call for greater gender diversity in the Java community. He acknowledged the industry’s shortcomings in achieving balanced representation and urged collective action to expand the community’s mindshare. Programs like JDuchess aim to create inclusive spaces, ensuring Java’s evolution benefits from diverse perspectives.
[NodeCongress2023] The Road to Async Context: Standardizing Contextual Data Tracking in Asynchronous JavaScript
Lecturer
James M Snell
James M Snell is a Systems Engineer on the Cloudflare Workers team. He is a highly influential figure in the JavaScript runtime space, serving as a core contributor to the Node.js project and a member of the Node.js Technical Steering Committee (TSC). His work focuses on driving the adoption of web-compatible standard APIs across diverse JavaScript runtime environments, including Node.js, Deno, and Cloudflare Workers. Before his current role, he spent 16 years working on open technologies and standards at IBM. He is actively involved in the Web-interoperable Runtimes Community Group (WinterCG).
- Institutional Profile/Professional Page: jasnell.me
- X (Twitter): @jasnell
- Organization: Cloudflare Workers
Abstract
This article examines the evolution and standardization efforts surrounding Async Context in JavaScript runtimes, transitioning from Node.js’s AsyncLocalStorage to the proposed AsyncContext API within the TC39 standards body. The analysis defines the core problem of tracking contextual data across asynchronous boundaries and explains the mechanism by which AsyncContext provides a deterministic, reliable way to manage this state, which is vital for modern diagnostic, security, and feature management tools. The article highlights the methodology of the Web-interoperable Runtimes Community Group (WinterCG) in establishing a portable subset of this API for immediate cross-runtime compatibility.
Context: The Asynchronous State Problem
In a synchronous programming environment, state—such as user identity, transaction ID, or locale settings—is managed within a thread’s local memory (thread-local storage). However, modern JavaScript runtimes operate on a single thread with a shared event loop, where a single incoming request often forks into multiple asynchronous operations (I/O, network calls, timers) that execute non-sequentially. The fundamental challenge is maintaining this contextual information reliably across these asynchronous function boundaries. Traditional solutions, like passing context through function arguments, are impractical and violate encapsulation.
Methodology and Mechanisms
Async Context Tracking
The core concept of Async Context (first implemented as AsyncLocalStorage in Node.js) involves a model that links contextual information to the asynchronous flow of execution.
- Asynchronous Resource Stack: Context tracking is achieved by building a stack of “asynchronous resources”. When an asynchronous operation (e.g., a promise, a timer, or an HTTP request) begins, a new entry is added to this conceptual stack.
- The run Method: The primary public API for setting context is the run method, which executes a function within a new, dedicated context frame. Any asynchronous work initiated within this function will inherit that context.
The Move to AsyncContext
The standardization effort in TC39 aims to introduce AsyncContext as a native language feature, replacing runtime-specific APIs like AsyncLocalStorage. The key difference in the future AsyncContext model is a move towards immutability within the context frame, specifically deprecating mutable methods like enter and exit that Node.js historically experimented with in AsyncResource. The consensus is to maintain the determinism and integrity of the context by requiring a new frame to be created for any changes, thus making the context “immutable across asynchronous boundaries”.
Consequences and Interoperability
The implications of standardized Async Context are significant, primarily for Observability and Cross-Runtime Compatibility:
- Observability (Diagnostics): The context mechanism is critical for application performance monitoring (APM) and diagnostics. It allows instrumentation to reliably attach a request ID, correlation ID, or span data to every operation performed during the lifecycle of a single incoming request, which is essential for distributed tracing.
- Runtime Interoperability: The Web-interoperable Runtimes Community Group (WinterCG) is actively defining a “portable subset” of the AsyncLocalStorage API. This subset is designed to be compatible with the forthcoming AsyncContext standard and is being implemented across multiple runtimes (Node.js, Cloudflare Workers, Deno, Bun) in advance. This collective effort is paving the way for truly portable JavaScript code, where contextual state management is a reliable, universal primitive.
Conclusion
The standardization of Async Context represents a pivotal development in the maturity of server-side JavaScript. By integrating a reliable mechanism for tracking contextual state across asynchronous flows, the community is solving a long-standing architectural complexity. The collaboration within WinterCG ensures that this critical feature is implemented uniformly, fostering a more robust, standards-compliant, and portable ecosystem for all major JavaScript runtimes.
Relevant links and hashtags
- Lecture Video: The Road to Async Context – James M Snell, Node Congress 2023
- Lecturer Professional Links:
- Professional Page: jasnell.me
- X (Twitter): @jasnell
- Organization: Cloudflare Workers
Hashtags: #AsyncContext #AsyncLocalStorage #TC39 #WinterCG #NodeJS #CloudflareWorkers #JavaScriptStandards #Observability #NodeCongress
[PHPForumParis2022] Internet and Geopolitics – Stéphane Bortzmeyer
Stéphane Bortzmeyer, a seasoned expert in Internet infrastructure, delivered a thought-provoking presentation at PHP Forum Paris 2022, exploring the intricate relationship between the Internet and geopolitics. Dedicated to Mahsa Amini, whose tragic death sparked a revolt in Iran, Stéphane’s talk illuminated how global politics shape the Internet’s architecture and accessibility. With a focus on real-world examples, he challenged developers to consider the geopolitical implications of their work, particularly in the context of PHP-driven applications that rely on global connectivity.
The Myth of the Cloud
Stéphane began by debunking the term “cloud,” criticizing it as a marketing ploy that obscures the Internet’s physical and political realities. Far from being a borderless entity, the Internet is deeply tied to national jurisdictions, data centers, and political decisions. He highlighted how regimes, such as Iran’s, manipulate Internet access to suppress dissent, using censorship and throttling to control information flow. For PHP developers, this underscores the importance of designing applications that account for such restrictions, ensuring accessibility and resilience in diverse geopolitical contexts.
Global Connectivity Challenges
Delving into connectivity, Stéphane explained how historical and economic factors create disparities in Internet access. He cited Colombia, where local content is often hosted abroad due to limited domestic infrastructure, creating a self-reinforcing cycle of dependency on foreign connections. This phenomenon, driven by business interests rather than technical necessity, affects latency and accessibility for PHP applications. Stéphane urged developers to advocate for local interconnection and consider hosting strategies that prioritize user experience in underserved regions.
The Role of Regulation and Advocacy
Addressing regulatory responses, Stéphane contrasted Bolivia’s legal mandates for local interconnection with Europe’s reliance on user and government pressure. He emphasized that developers can influence these dynamics by supporting open Internet standards and advocating for equitable access. For PHP developers, this means building applications that are adaptable to varying network conditions and contributing to discussions on infrastructure policies. Stéphane’s insights highlighted the developer’s role in fostering a more inclusive and resilient Internet.
Empowering Developers to Act
Concluding, Stéphane called on developers to engage with the geopolitical aspects of their work, from choosing hosting providers to supporting open standards. By understanding the Internet’s physical and political constraints, PHP developers can create applications that better serve global audiences. His dedication to Mahsa Amini served as a poignant reminder of the stakes involved, inspiring attendees to approach their craft with a broader perspective on its societal impact.
[PHPForumParis2023] PhpStorm = <3 – Charles Desneuf
At Forum PHP 2023, Charles Desneuf, a freelance developer, architect, and tech coach, delivered an enthusiastic exploration of PhpStorm, an integrated development environment (IDE) that he credits with transforming his coding experience. With a blend of technical insight and genuine passion, Charles shared how PhpStorm enhances productivity and brings joy to development workflows. His talk, rich with practical demonstrations, highlighted the IDE’s features, from automated testing to live templates, inspiring PHP developers to optimize their tools for efficiency and creativity.
Enhancing Productivity with PhpStorm
Charles opened by emphasizing PhpStorm’s role in boosting developer productivity. He described how the IDE’s robust features, such as intelligent code completion and real-time error detection, allow developers to write and modify code with confidence. Charles demonstrated how PhpStorm’s integration with testing frameworks enables seamless execution of unit tests directly within the IDE, streamlining workflows. By sharing his personal journey, he underscored how these tools eliminate friction, allowing developers to focus on crafting high-quality PHP applications.
The Joy of Playful Coding
Beyond efficiency, Charles highlighted the playful aspect of using PhpStorm, likening coding to a game. He showcased live templates, which enable developers to create reusable code snippets, significantly speeding up repetitive tasks. For example, Charles illustrated how a custom template could generate a test method with a few keystrokes, transforming mundane tasks into engaging challenges. His infectious enthusiasm encouraged attendees to explore PhpStorm’s features creatively, fostering a sense of enjoyment in their daily work.
Practical Applications and Community Sharing
Charles concluded by encouraging developers to experiment with PhpStorm’s features, such as automated refactoring and test integration, to enhance their workflows. He shared a personal website where he distributes his live templates, inviting the community to contribute their own. This collaborative spirit, combined with his call to action for developers to reflect on their tool usage, inspired attendees to adopt PhpStorm or optimize their existing setups, ensuring both productivity and satisfaction in their coding endeavors.
[DevoxxBE2023] Making Your @Beans Intelligent: Spring AI Innovations
At DevoxxBE2023, Dr. Mark Pollack delivered an insightful presentation on integrating artificial intelligence into Java applications using Spring AI, a project inspired by advancements in AI frameworks like LangChain and LlamaIndex. Mark, a seasoned Spring developer since 2003 and leader of the Spring Data project, explored how Java developers can harness pre-trained AI models to create intelligent applications that address real-world challenges. His talk introduced the audience to Spring AI’s capabilities, from simple “Hello World” examples to sophisticated use cases like question-and-answer systems over custom documents.
The Genesis of Spring AI
Mark began by sharing his journey into AI, sparked by the transformative impact of ChatGPT. Unlike traditional AI development, which often required extensive data cleaning and model training, pre-trained models like those from OpenAI offer accessible APIs and vast knowledge bases, enabling developers to focus on application engineering rather than data science. Mark highlighted how Spring AI emerged from his exploration of code generation, leveraging the structured nature of code within these models to create a framework tailored for Java developers. This framework abstracts the complexity of AI model interactions, making it easier to integrate AI into Spring-based applications.
Spring AI draws inspiration from Python’s AI ecosystem but adapts these concepts to Java’s idioms, emphasizing component abstractions and pluggability. Mark emphasized that this is not a direct port but a reimagination, aligning with the Spring ecosystem’s strengths in enterprise integration and batch processing. This approach positions Spring AI as a bridge between Java’s robust software engineering practices and the dynamic world of AI.
Core Components of AI Applications
A significant portion of Mark’s presentation focused on the architecture of AI applications, which extends beyond merely calling a model. He introduced a conceptual framework involving contextual data, AI frameworks, and models. Contextual data, akin to ETL (Extract, Transform, Load) processes, involves parsing and transforming data—such as PDFs—into embeddings stored in vector databases. These embeddings enable efficient similarity searches, crucial for use cases like question-and-answer systems.
Mark demonstrated a simple AI client in Spring AI, which abstracts interactions with various AI models, including OpenAI, Hugging Face, Amazon Bedrock, and Google Vertex. This portability allows developers to switch models without significant code changes. He also showcased the Spring CLI, a tool inspired by JavaScript’s Create React App, which simplifies project setup by generating starter code from existing repositories.
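To make this concrete, here is a minimal sketch of such a portable client, assuming the Spring AI ChatClient fluent API with an auto-configured model starter on the classpath; exact class names have shifted across Spring AI milestones, and the endpoint path and prompt text are purely illustrative.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
class HelloAiController {

    private final ChatClient chatClient;

    // ChatClient.Builder is auto-configured by whichever model starter is on
    // the classpath (OpenAI, Hugging Face, Bedrock, Vertex), so swapping the
    // underlying model is a dependency change, not a code change.
    HelloAiController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @GetMapping("/ai/hello")
    String hello() {
        return chatClient.prompt()
                .user("Tell me a fun fact about the JVM")
                .call()
                .content();
    }
}
```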
Prompt Engineering and Its Importance
Prompt engineering emerged as a critical theme in Mark’s talk. He explained that crafting effective prompts is essential for directing AI models to produce desired outputs, such as JSON-formatted responses or specific styles of answers. Spring AI’s PromptTemplate class facilitates this by allowing developers to create reusable, stateful templates with placeholders for dynamic content. Mark illustrated this with a demo where a prompt template generated a joke about a raccoon, highlighting the importance of roles (system and user) in defining the context and tone of AI responses.
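A minimal sketch of that idea, assuming Spring AI’s PromptTemplate with named placeholders (the template text mirrors the raccoon-joke demo; method names may differ slightly between milestones):

```java
import java.util.Map;

import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.chat.prompt.PromptTemplate;

class JokePromptExample {
    public static void main(String[] args) {
        // A reusable template with named placeholders for dynamic content.
        PromptTemplate template =
                new PromptTemplate("Tell me a {style} joke about a {topic}.");

        // render() fills the placeholders and returns the final prompt text;
        // create() wraps it in a Prompt object ready to send to a model.
        String text = template.render(Map.of("style", "dad", "topic", "raccoon"));
        Prompt prompt = template.create(Map.of("style", "dad", "topic", "raccoon"));

        System.out.println(text);
    }
}
```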
He also touched on the concept of “dogfooding,” where AI models are used to refine prompts, creating a feedback loop that enhances their effectiveness. This iterative process, combined with evaluation techniques, ensures that applications deliver accurate and relevant responses, addressing challenges like model hallucinations—where AI generates plausible but incorrect information.
Retrieval Augmented Generation (RAG)
Mark introduced Retrieval Augmented Generation (RAG), a technique to overcome the limitations of AI models’ context windows, which restrict the amount of data they can process. RAG involves pre-processing data into smaller fragments, converting them into embeddings, and storing them in vector databases for similarity searches. This approach allows developers to provide only relevant data to the model, improving efficiency and accuracy.
In a demo, Mark showcased RAG with a bicycle shop dataset, where a question about city-commuting bikes retrieved relevant product descriptions from a vector store. This process mirrors traditional search engines but leverages AI to synthesize answers, demonstrating how Spring AI integrates with vector databases like Milvus and PostgreSQL to handle complex queries.
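A sketch of that retrieve-then-generate loop, assuming Spring AI’s VectorStore and ChatClient abstractions (the question mirrors the bicycle demo; accessor names such as getText() have varied between versions):

```java
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.document.Document;
import org.springframework.ai.vectorstore.VectorStore;

class RagExample {

    private final VectorStore vectorStore;
    private final ChatClient chatClient;

    RagExample(VectorStore vectorStore, ChatClient chatClient) {
        this.vectorStore = vectorStore;
        this.chatClient = chatClient;
    }

    String answer(String question) {
        // 1. Retrieve the most similar pre-embedded document fragments.
        List<Document> fragments = vectorStore.similaritySearch(question);

        // 2. Stuff only those fragments into the prompt, keeping the model's
        //    context window small and the answer grounded in our own data.
        String context = fragments.stream()
                .map(Document::getText) // getContent() in earlier milestones
                .collect(Collectors.joining("\n---\n"));

        return chatClient.prompt()
                .system("Answer using only this product catalog:\n" + context)
                .user(question)
                .call()
                .content();
    }
}
```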
Real-World Applications and Future Directions
Mark highlighted practical applications of Spring AI, such as enabling question-and-answer systems for financial documents, medical records, or government programs like Medicaid. These use cases illustrate AI’s potential to make complex information more accessible, particularly for non-technical users. He also discussed the importance of evaluation in AI development, advocating for automated scoring mechanisms to assess response quality beyond simple test passing.
Looking forward, Mark outlined Spring AI’s roadmap, emphasizing robust core abstractions and support for a growing number of models and vector databases. He encouraged developers to explore the project’s GitHub repository and participate in its evolution, underscoring the rapid pace of AI advancements and the need for community involvement.
[DevoxxUK2024] Processing XML with Kafka Connect by Dale Lane
Dale Lane, a seasoned developer at IBM with a deep focus on event-driven architectures, delivered a compelling session at DevoxxUK2024, unveiling a powerful Kafka Connect plugin designed to streamline XML data processing. With extensive experience in Apache Kafka and Flink, Dale addressed the challenges of integrating XML data into Kafka pipelines, a task often fraught with complexity due to the format’s incompatibility with Kafka’s native data structures like Avro or JSON. His presentation offers practical solutions for developers seeking to bridge external systems with Kafka, transforming XML into more manageable formats or generating XML outputs for legacy systems. Through clear examples, Dale illustrates how this open-source plugin enhances flexibility and efficiency in Kafka Connect pipelines, empowering developers to handle diverse data integration scenarios with ease.
Understanding Kafka Connect Pipelines
Dale begins by demystifying Kafka Connect, a robust framework for moving data between Kafka and external systems. He outlines two primary pipeline types: source pipelines, which import data from external systems into Kafka, and sink pipelines, which export Kafka data to external destinations. A source pipeline typically involves a connector to fetch data, optional transformations to modify or filter it, and a converter to serialize the data into formats like Avro or JSON for Kafka topics. Conversely, a sink pipeline starts with a converter to deserialize Kafka data, followed by transformations and a connector to deliver it to an external system. This foundational explanation sets the stage for understanding where and how XML processing fits into these workflows, ensuring developers grasp the pipeline’s modular structure before diving into specific use cases.
Converting XML for Kafka Integration
A common challenge Dale addresses is integrating XML data from external systems, such as IBM MQ or XML-based web services, into Kafka’s ecosystem, which favors structured formats. He introduces the Kafka Connect plugin, available on GitHub under an Apache license, as a solution to parse XML into structured records early in the pipeline. For instance, using an IBM MQ source connector, the plugin can transform XML documents from a message queue into a generic structured format, allowing subsequent transformations and serialization into JSON or Avro. Dale demonstrates this with a weather API that returns XML strings, showing how the plugin converts these into structured objects for further processing, making them compatible with Kafka tools that struggle with raw XML. This approach significantly enhances the usability of external data within Kafka’s ecosystem.
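A source pipeline of this shape might be configured roughly as follows. This is a sketch trimmed to the XML-relevant settings: the connector class is the public IBM MQ source connector, while the transformation class, queue, and topic names are assumptions for illustration and should be checked against the plugin’s README.

```properties
# Source pipeline sketch: IBM MQ -> XML parsed into a structured record -> JSON in Kafka
name=weather-xml-source
connector.class=com.ibm.eventstreams.connect.mqsource.MQSourceConnector
# Queue and topic names are illustrative
mq.queue=WEATHER.XML
topic=weather.readings

# Single Message Transform from the XML plugin: parses the XML payload early
# in the pipeline into Kafka Connect's generic structured representation
# (class name assumed; verify against the plugin repository)
transforms=xmlToStruct
transforms.xmlToStruct.type=com.ibm.eventstreams.kafkaconnect.plugins.xml.XmlTransformation

# With the record now structured, a stock converter can serialize it as JSON
value.converter=org.apache.kafka.connect.json.JsonConverter
```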
Generating XML Outputs from Kafka
For scenarios where external systems require XML, Dale showcases the plugin’s ability to convert Kafka’s JSON or Avro messages into XML strings within a sink pipeline. He provides an example using a Kafka topic with JSON messages destined for an IBM MQ system, where the plugin, integrated as part of the sink connector, transforms structured data into XML before delivery. Another case involves an HTTP sink connector posting to an XML-based web service, such as an XML-RPC API. Here, the pipeline deserializes JSON, applies transformations to align with the API’s payload requirements, and uses the plugin to produce an XML string. This flexibility ensures seamless communication with legacy systems, bridging modern Kafka workflows with traditional XML-based infrastructure.
Enhancing Pipelines with Schema Support
Dale emphasizes the plugin’s schema handling capabilities, which add robustness to XML processing. In source pipelines, the plugin can reference an external XSD schema to validate and structure XML data, which is then paired with an Avro converter to submit schemas to a registry, ensuring compatibility with Kafka’s schema-driven ecosystem. In sink pipelines, enabling schema inclusion generates an XSD alongside the XML output, providing a clear description of the data’s structure. Dale illustrates this with a stock price connector, where enabling schema support produces XML events with accompanying XSDs, enhancing interoperability. This feature is particularly valuable for maintaining data integrity across systems, making the plugin a versatile tool for complex integration tasks.
[SpringIO2023] Beyond Routing: Spring Cloud Gateway with Style by Abel Salgado & Marta Medio
At Spring I/O 2023, Abel Salgado and Marta Medio, seasoned engineers from VMware, delivered an engaging session on elevating Spring Cloud Gateway beyond basic routing. Through practical, real-world examples and live demos, they showcased how to customize Spring Cloud Gateway with filters to address complex enterprise needs, such as security integrations and response modifications. Their talk demystified the gateway’s reactive stack, offering tips to avoid pitfalls and encouraging developers to harness its flexibility for performant, modular solutions.
Understanding Spring Cloud Gateway’s Power
Abel and Marta began by positioning Spring Cloud Gateway as a reactive, high-performance component that sits between clients and upstream services, enabling request and response manipulation. Rather than a gateway 101 tutorial, their session assumed basic familiarity and focused on advanced customizations. They highlighted the gateway’s filter-based architecture, where filters operate in a chain, processing requests and responses in a defined order. This modularity allows developers to inject cross-cutting concerns like security or logging without altering upstream services. The duo emphasized the gateway’s potential to modernize legacy systems or split monoliths, making it a strategic tool for enterprise architectures.
Custom Filters for Security Integration
A key demo showcased how to implement OAuth security using Spring Cloud Gateway filters, integrating with Okta for token validation. Abel demonstrated two filters: one validating OAuth tokens using Spring Security’s resource server configuration, and another extracting claims (e.g., username) from the token to forward as HTTP headers to an upstream service unaware of OAuth. This approach isolates security logic in the gateway, sparing upstream services from modification. The demo, scripted with HTTPbin as the upstream service, illustrated a real-world scenario where a client authenticates with Okta, sends a token to the gateway, and receives a response with enriched headers. Abel stressed avoiding manual token validation, leveraging Spring Security’s reactive components to ensure performance and maintainability.
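A condensed Java sketch of the claim-forwarding filter, assuming Spring Security’s reactive resource-server support has already validated the token earlier in the chain (the header and claim names are illustrative; Okta setups often expose different claims):

```java
import org.springframework.cloud.gateway.filter.GatewayFilter;
import org.springframework.cloud.gateway.filter.factory.AbstractGatewayFilterFactory;
import org.springframework.security.oauth2.server.resource.authentication.JwtAuthenticationToken;
import org.springframework.stereotype.Component;

@Component
class ClaimToHeaderGatewayFilterFactory extends AbstractGatewayFilterFactory<Object> {

    @Override
    public GatewayFilter apply(Object config) {
        return (exchange, chain) -> exchange.getPrincipal()
                // Spring Security's resource-server support has already
                // validated the token; here we only read the parsed claims.
                .cast(JwtAuthenticationToken.class)
                .map(auth -> exchange.mutate()
                        .request(request -> request.headers(headers -> headers.set(
                                "X-Username",
                                auth.getToken().getClaimAsString("preferred_username"))))
                        .build())
                // No authenticated principal: pass the exchange through unchanged.
                .defaultIfEmpty(exchange)
                .flatMap(chain::filter);
    }
}
```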
Dynamic Configuration and Response Modification
Marta explored the gateway’s configuration flexibility, demonstrating how to parametrize filters dynamically via YAML. A simple filter adding request headers was enhanced to accept multiple header configurations, reducing the need for multiple filter instances and improving performance. In a more complex example, Marta tackled modifying JSON responses from an upstream service, a common requirement when legacy APIs cannot be altered. Using a custom converter and the ModifyResponseBodyFilter, she transformed JSON fields based on YAML-defined key-value pairs, processing the response once to optimize performance. Marta cautioned about the performance risks of response modification, urging developers to carefully design configurations for scalability.
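Marta’s demo drove the key-value remappings from YAML through a custom converter; the following condensed sketch instead hardcodes one remapping via the gateway’s Java routing DSL, which backs the same ModifyResponseBody mechanism (route id, path, and field names are illustrative):

```java
import java.util.Map;

import org.springframework.cloud.gateway.route.RouteLocator;
import org.springframework.cloud.gateway.route.builder.RouteLocatorBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import reactor.core.publisher.Mono;

@Configuration
class ResponseRewriteConfig {

    @Bean
    RouteLocator legacyRoutes(RouteLocatorBuilder builder) {
        return builder.routes()
                .route("legacy-api", r -> r.path("/legacy/**")
                        // Deserialize the upstream JSON once, rewrite it, re-serialize.
                        .filters(f -> f.modifyResponseBody(Map.class, Map.class,
                                (exchange, body) -> {
                                    if ("OK".equals(body.get("status"))) {
                                        body.put("status", "SUCCESS");
                                    }
                                    return Mono.just(body);
                                }))
                        .uri("http://legacy-service:8080"))
                .build();
    }
}
```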
Best Practices and Community Insights
Abel and Marta concluded with practical advice for Spring Cloud Gateway adoption. They advocated for modular filter design to encapsulate features, enabling easy composition via YAML or actuator endpoints for runtime updates. They warned against reinventing solutions, citing cases where developers manually parsed JWTs instead of using Spring Security. The duo recommended reactive Spring libraries to avoid blocking issues and highlighted the gateway’s testing challenges, pointing to their GitHub repository for code examples. A nod to Spencer Gibb’s announcement about exploring MVC support for Spring Cloud Gateway underscored the project’s evolving potential. Their session inspired attendees to experiment with the gateway’s capabilities, leveraging its documentation and community resources.
[DevoxxBE2023] Securing the Supply Chain for Your Java Applications by Thomas Vitale
At Devoxx Belgium 2023, Thomas Vitale, a software engineer and architect at Systematic, delivered an authoritative session on securing the software supply chain for Java applications. As the author of Cloud Native Spring in Action and a passionate advocate for cloud-native technologies, Thomas provided a comprehensive exploration of securing every stage of the software lifecycle, from source code to deployment. Drawing on the SLSA framework and CNCF research, he demonstrated practical techniques for ensuring integrity, authenticity, and resilience using open-source tools like Gradle, Sigstore, and Kyverno. Through a blend of theoretical insights and live demonstrations, Thomas illuminated the critical importance of supply chain security in today’s threat landscape.
Safeguarding Source Code with Git Signatures
Thomas began by defining the software supply chain as the end-to-end process of delivering software, encompassing code, dependencies, tools, practices, and people. He emphasized the risks at each stage, starting with source code. Using Git as an example, Thomas highlighted its audit trail capabilities but cautioned that commit authorship can be manipulated. In a live demo, he showed how he could impersonate a colleague by altering Git’s username and email, underscoring the need for signed commits. By enforcing signed commits with GPG or SSH keys—or preferably a keyless approach via GitHub’s single sign-on—developers can ensure commit authenticity, establishing a verifiable provenance trail critical for supply chain security.
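For the keyed variant, signing can be switched on with a few git settings; a minimal sketch using an SSH key (the key path is illustrative):

```bash
# Sign every commit with an SSH key instead of GPG (requires Git >= 2.34)
git config --global gpg.format ssh
git config --global user.signingkey ~/.ssh/id_ed25519.pub
git config --global commit.gpgsign true

# Signatures can then be inspected in the history
git log --show-signature -1
```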
Managing Dependencies with Software Bills of Materials (SBOMs)
Moving to dependencies, Thomas stressed the importance of knowing exactly what libraries are included in a project, especially given vulnerabilities like Log4j. He introduced Software Bills of Materials (SBOMs) as a standardized inventory of software components, akin to a list of ingredients. Using the CycloneDX plugin for Gradle, Thomas demonstrated generating an SBOM during the build process, which provides precise dependency details, including versions, licenses, and hashes for integrity verification. This approach, integrated into Maven or Gradle, ensures accuracy over post-build scanning tools like Snyk, enabling developers to identify vulnerabilities, check license compliance, and verify component integrity before production.
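A minimal sketch of that build integration, assuming the CycloneDX Gradle plugin (the version number is a placeholder to check against the Gradle plugin portal):

```groovy
// build.gradle -- generate a CycloneDX SBOM as part of the build
plugins {
    id 'org.cyclonedx.bom' version '1.8.2' // placeholder version
}

// Running `./gradlew cyclonedxBom` then emits the SBOM (bom.json / bom.xml)
// under build/reports, listing each resolved dependency with its version,
// license, and hashes for integrity verification.
```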
Thomas further showcased Dependency-Track, an OWASP project, to analyze SBOMs and flag vulnerabilities, such as a critical issue in SnakeYAML. He introduced the Vulnerability Exploitability Exchange (VEX) standard, which complements SBOMs by documenting whether vulnerabilities affect an application. In his demo, Thomas marked a SnakeYAML vulnerability as a false positive due to Spring Boot’s safe deserialization, demonstrating how VEX communicates security decisions to stakeholders, reducing unnecessary alerts and ensuring compliance with emerging regulations.
Building Secure Artifacts with Reproducible Builds
The build phase, Thomas explained, is another critical juncture for security. Using Spring Boot as an example, he outlined three packaging methods: JAR files, native executables, and container images. He critiqued Dockerfiles for introducing non-determinism and maintenance overhead, advocating for Cloud Native Buildpacks as a reproducible, secure alternative. In a demo, Thomas built a container image with Buildpacks, highlighting its fixed creation timestamp (January 1, 1980) to ensure identical outputs for unchanged inputs, enhancing security by eliminating variability. This reproducibility, coupled with SBOM generation during the build, ensures artifacts are both secure and traceable.
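With the Spring Boot Gradle plugin, the Buildpacks route needs no Dockerfile at all; a one-line sketch (the image name is illustrative):

```bash
# Build a reproducible OCI image with Cloud Native Buildpacks
./gradlew bootBuildImage --imageName=ghcr.io/example/app:1.0.0
```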
Signing and Verifying Artifacts with SLSA
To ensure artifact integrity, Thomas introduced the SLSA framework, which provides guidelines for securing software artifacts across the supply chain. He demonstrated signing container images with Sigstore’s Cosign tool, using a keyless approach to avoid managing private keys. This process, integrated into a GitHub Actions pipeline, ensures that artifacts are authentically linked to their creator. Thomas further showcased SLSA’s provenance generation, which documents the artifact’s origin, including the Git commit hash and build steps. By achieving SLSA Level 3, his pipeline provided non-falsifiable provenance, ensuring traceability from source code to deployment.
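In a pipeline this typically reduces to two commands; a sketch assuming Cosign 2.x keyless defaults (the image name and identity pattern are illustrative):

```bash
# Sign: the keyless flow obtains a short-lived certificate from Fulcio via
# OIDC and records the signature in the Rekor transparency log.
cosign sign ghcr.io/example/app:1.0.0

# Verify: pin who was allowed to sign and which OIDC issuer vouched for them.
cosign verify \
  --certificate-identity-regexp 'https://github.com/example/.+' \
  --certificate-oidc-issuer https://token.actions.githubusercontent.com \
  ghcr.io/example/app:1.0.0
```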
Securing Deployments with Policy Enforcement
The final stage, deployment, requires validating artifacts to ensure they meet security standards. Thomas demonstrated using Cosign and the SLSA Verifier to validate signatures and provenance, ensuring only trusted artifacts are deployed. On Kubernetes, he introduced Kyverno, a policy engine that enforces signature and provenance checks, automatically rejecting non-compliant deployments. This approach ensures that production environments remain secure, aligning with the principle of validating metadata to prevent unauthorized or tampered artifacts from running.
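A trimmed sketch of such a Kyverno policy (the policy name, image pattern, and identity are illustrative; the schema should be checked against current Kyverno documentation):

```yaml
apiVersion: kyverno.io/v1
kind: ClusterPolicy
metadata:
  name: require-signed-images
spec:
  validationFailureAction: Enforce   # reject non-compliant Pods, don't just audit
  rules:
    - name: check-image-signature
      match:
        any:
          - resources:
              kinds:
                - Pod
      verifyImages:
        - imageReferences:
            - "ghcr.io/example/*"
          attestors:
            - entries:
                - keyless:
                    subject: "https://github.com/example/*"
                    issuer: "https://token.actions.githubusercontent.com"
```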
Conclusion: A Holistic Approach to Supply Chain Security
Thomas’s session at Devoxx Belgium 2023 provided a robust framework for securing Java application supply chains. By addressing source code integrity, dependency management, build reproducibility, artifact signing, and deployment validation, he offered a comprehensive strategy to mitigate risks. His practical demonstrations, grounded in open-source tools and standards like SLSA and VEX, empowered developers to adopt these practices without overwhelming complexity. Thomas’s emphasis on asking “why” at each step encouraged attendees to tailor security measures to their context, ensuring both compliance and resilience in an increasingly regulated landscape.
[KotlinConf2023] Java and Kotlin: A Mutual Evolution
At KotlinConf2023, John Pampuch, Google’s production languages lead, delivered a history lesson on Java and Kotlin’s intertwined journeys. Battling jet lag with humor, John traced nearly three decades of Java and twelve years of Kotlin, emphasizing their complementary strengths. From Java’s robust ecosystem to Kotlin’s pragmatic innovation, the languages have shaped each other, accelerating progress. John’s talk, rooted in his experience since Java’s 1996 debut, explored design goals, feature cross-pollination, and future implications, urging developers to leverage Kotlin’s developer-friendly features while appreciating Java’s stability.
Design Philosophies: Pragmatism Meets Robustness
John opened by contrasting the languages’ origins. Java, launched in 1995, aimed for simplicity, security, and portability, aligning tightly with the JVM and JDK. Its ecosystem, bolstered by libraries and tooling, set a standard for enterprise development. Kotlin, announced in 2011 by JetBrains, prioritized pragmatism: concise syntax, interoperability with Java, and multiplatform flexibility. Unlike Java’s JVM dependency, Kotlin targets iOS, web, and beyond, enabling faster feature rollouts. John noted Kotlin’s design avoids Java’s rigidity, embracing object-oriented principles with practical tweaks like semicolon-free lines. Yet Java’s self-consistency, seen in its holistic lambda integration, complements Kotlin’s adaptability, creating a synergy where both thrive.
Feature Evolution: From Lambdas to Coroutines
The talk highlighted key milestones. Java’s 2014 release of JDK 8 introduced lambdas, default methods, and type inference, transforming APIs to support functional programming. Kotlin, with 1.0 in 2016, brought smart casts, string templates, and named arguments, prioritizing developer ease. By 2018, Kotlin’s coroutines revolutionized JVM asynchronous programming, offering a simpler mental model than Java’s threads. John praised coroutines as a potential game-changer, though Java’s 2023 virtual threads and structured concurrency aim to close the gap. Kotlin’s multiplatform support, cemented by Google’s 2017 Android endorsement, outpaces Java’s JVM-centric approach, but Java’s predictable six-month release cycle since 2017 ensures steady progress. These advancements reflect a race where each language pushes the other forward.
Mutual Influences: Sealed Classes and Beyond
John emphasized cross-pollination. Java’s 2021 records, inspired by frameworks like Lombok, mirror Kotlin’s data classes, though Kotlin’s named parameters reduce boilerplate further. Sealed classes, introduced in Java 17 and Kotlin 1.5 around 2021, emerged concurrently, suggesting shared inspiration. Kotlin’s string templates, a staple since its early days, influenced Java’s 2024 preview of flexible string templates, which John hopes Kotlin might adopt for localization. Java’s exploration of nullability annotations, potentially aligning with Kotlin’s robust null safety, shows ongoing convergence. John speculated that community demand could push Java toward features like named arguments, though JVM changes remain a hurdle. This mutual learning, fueled by competition with languages like Go and Rust, drives excitement and innovation.
Looking Ahead: Pragmatism and Compatibility
John concluded with a call to action: embrace Kotlin’s compact, readable features while valuing Java’s compile-time speed and ecosystem. Kotlin’s faster feature delivery and multiplatform prowess contrast with Java’s backwards compatibility and predictability. Yet both share a commitment to pragmatic evolution, avoiding breaks in millions of applications. Questions from the audience probed Java’s nullability and virtual threads, with John optimistic about eventual alignment but cautious about timelines. His talk underscored that Java and Kotlin’s competition isn’t zero-sum—it’s a catalyst for better tools, ideas, and developer experiences, ensuring both languages remain vital.
Hashtags: #Java #Kotlin
[DevoxxBE2023] REST Next Level: Crafting Domain-Driven Web APIs by Julien Topçu
At Devoxx Belgium 2023, Julien Topçu, a technical coach at SHODO, delivered a compelling session on elevating REST APIs by embedding domain-driven design principles. With a rich background in crafting software using Domain-Driven Design (DDD), Extreme Programming, and Kanban, Julien illuminated the pitfalls of traditional REST implementations and proposed a transformative approach to encapsulate business intent within APIs. His talk, centered around a fictional space travel booking system, demonstrated how to align APIs with user actions, preserve business workflows, and enhance consumer experience through hypermedia controls. Through a blend of theoretical insights and practical demonstrations, Julien showcased a methodology to create APIs that are not only functional but also semantically rich and workflow-driven.
The Pitfalls of Traditional REST APIs
Julien began by highlighting a pervasive issue in software architecture: the loss of business intent when translating domain logic into REST APIs. Typically, business logic resides in the backend to avoid duplication across consumers like web or mobile applications. However, REST’s uniform interface, with its limited vocabulary of CRUD operations (Create, Read, Update, Delete), often distorts this logic. For instance, in a train reservation system, a user’s intent to “search for trains” is reduced to “create a search resource,” stripping away domain-specific semantics like destinations or schedules. This mismatch, Julien argued, stems from REST’s standardized approach, formalized by Roy Fielding in his PhD thesis, which prioritizes simplicity over application-specific needs. As a result, APIs lose expressiveness, forcing consumers to reconstruct business workflows, leading to what Julien termed “accidental complexity of adaptation.”
To illustrate, Julien presented a scenario where a user performs a search for space trains from Earth to the Moon. The traditional REST API translates this into a POST request to create a search resource, devoid of domain context. This not only obscures the user’s intent but also couples consumers to the backend’s implementation, making changes—like switching from “bound” to “journey index” for multi-destination trips—disruptive. Julien’s live demo underscored this fragility: altering a request parameter broke the API, highlighting the risks of tight coupling between consumers and backend models.
Encapsulating Business Intent with Semantic Endpoints
To address these shortcomings, Julien proposed aligning REST endpoints with user actions rather than backend models. Instead of exposing implementation details, such as updating a sub-resource like “selection” within a search, APIs should reflect behaviors like “select a space train with a fare.” This approach involves using classifiers in URLs, such as POST /searches/{id}/spacetrains/{number}/fares/{code}/select, which clearly convey the intent of selecting a fare for a specific train. Julien emphasized that this does not violate REST principles, debunking the myth that verbs in URLs are forbidden. As long as verbs align with HTTP methods (e.g., POST for creating a resource), they enhance semantic clarity without breaking the uniform interface.
This shift decouples consumers from the backend’s internal structure. For example, changing the backend’s data model (e.g., using booleans instead of a selection object) no longer impacts consumers, as the API exposes behaviors rather than state. Julien’s demo further showcased this by demonstrating how a frontend could adapt to backend changes (e.g., from “bound” to “journey index”) without modification, thanks to semantic endpoints. This approach not only preserves business intent but also simplifies consumer logic, reducing the cognitive load of interpreting CRUD-based APIs.
Encapsulating Workflows with Hypermedia Controls
A critical challenge Julien addressed is the lack of workflow definition in traditional REST APIs. Typically, consumers must hardcode business workflows, such as the sequence of selecting outbound and inbound trains before booking. This leads to duplicated logic and potential errors, like displaying a booking button prematurely. Julien introduced hypermedia controls, specifically HATEOAS (Hypermedia As The Engine Of Application State), as a solution. By embedding links in API responses, the backend can guide consumers through the workflow dynamically.
In his demo, Julien showed how a search response includes links like select-outbound and all-inbounds, which guide the consumer to the next valid actions. For instance, after selecting an outbound train, the response provides a link to select an inbound train, ensuring only compatible options are available. This encapsulation of workflow logic in the backend eliminates the need for consumers to understand the sequence of actions, reducing errors and enhancing maintainability. Julien highlighted that this approach, part of the Richardson Maturity Model’s Level 3, makes APIs discoverable and resilient to backend changes, as consumers rely on links rather than hardcoded URLs.
Practical Implementation and Limitations
Julien’s live coding demo brought these concepts to life, showcasing a Spring Boot backend in Kotlin that dynamically generates links based on the application state. For example, the create-booking link only appears when the selection is complete, ensuring consumers cannot book prematurely. This dynamic guidance, facilitated by Spring HATEOAS, allows the frontend to display UI elements like the booking button based solely on available links, streamlining development and enhancing user experience.
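Julien’s demo backend was written in Kotlin; the following is a minimal Java sketch of the same conditional-link idea with Spring HATEOAS, where all type, path, and accessor names are hypothetical stand-ins:

```java
import static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.linkTo;
import static org.springframework.hateoas.server.mvc.WebMvcLinkBuilder.methodOn;

import java.net.URI;

import org.springframework.hateoas.EntityModel;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

// Minimal stand-ins so the sketch compiles; the real demo used richer types.
record Search(String id, boolean selectionComplete) {}

@RestController
class BookingController {
    @PostMapping("/searches/{id}/bookings")
    ResponseEntity<Void> createBooking(@PathVariable String id) {
        return ResponseEntity.created(URI.create("/bookings/" + id)).build();
    }
}

class SearchModelAssembler {

    EntityModel<Search> toModel(Search search) {
        EntityModel<Search> model = EntityModel.of(search);

        // The workflow rule lives in the backend, not the consumer: the
        // create-booking link only appears once the selection is complete,
        // so the frontend simply shows a button whenever the link exists.
        if (search.selectionComplete()) {
            model.add(linkTo(methodOn(BookingController.class)
                    .createBooking(search.id()))
                    .withRel("create-booking"));
        }
        return model;
    }
}
```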
However, Julien acknowledged limitations. For complex forms requiring extensive user input, the hypermedia approach may need supplementation with predefined payloads, as consumers must know what data to send. Additionally, long URLs, while not a practical issue in Julien’s experience at Expedia, could pose challenges in some contexts. Despite these constraints, the approach excels in domains with well-defined workflows, offering a robust framework for building expressive, maintainable APIs.
Conclusion: A New Paradigm for REST APIs
Julien’s session at Devoxx Belgium 2023 offered a transformative vision for REST APIs, emphasizing the power of domain-driven design and hypermedia controls. By aligning endpoints with user actions, encapsulating behaviors, and guiding workflows through links, developers can create APIs that are both semantically rich and resilient to change. This approach not only enhances consumer experience but also aligns with the principles of DDD, ensuring that business intent remains at the forefront of API design. Julien’s practical insights and engaging demo left attendees inspired to rethink their API strategies, fostering a deeper appreciation for REST’s potential when infused with domain-driven principles.