Option[Scala] != Optional
Java Optional and Scala Option: A Shared Goal, Divergent Philosophies
The absence of a value is one of the most deceptively complex problems in software engineering. For decades, mainstream programming languages relied on a single mechanism to represent it: null. While convenient, this design choice has proven to be one of the most costly abstractions in computing; Tony Hoare famously called it his “billion-dollar mistake”. Both Java and Scala eventually introduced explicit abstractions—Optional in Java and Option in Scala—to address this long-standing issue. Although these constructs appear similar on the surface, their design, intended usage, and expressive power differ in ways that reflect the deeper philosophies of their respective languages.
Understanding these differences requires examining not only their APIs, but also how they are used in real code.
Historical Background and Design Motivation
Scala introduced Option as a core concept from its earliest releases. Rooted in functional programming traditions, Scala treats the presence or absence of a value as a fundamental modeling concern. The language encourages developers to encode uncertainty directly into types and to resolve it through composition rather than defensive checks.
Java’s Optional, introduced much later in Java 8, emerged in a very different context. It was part of a cautious modernization effort that added functional elements without breaking compatibility with an enormous existing ecosystem. As a result, Optional was intentionally constrained and positioned primarily as a safer alternative to returning null from methods.
Modeling Presence and Absence
In Scala, an optional value is represented as either Some(value) or None. This is a closed hierarchy, and the distinction is explicit at all times.
def findUser(id: Int): Option[String] =
if (id == 1) Some("Alice") else None
In Java, the equivalent method returns an Optional created through a factory method.
Optional<String> findUser(int id) {
return id == 1 ? Optional.of("Alice") : Optional.empty();
}
At first glance, these examples appear nearly identical. The difference becomes more pronounced in how these values are consumed and composed.
Consumption and Transformation
Scala’s Option integrates deeply with the language’s expression-oriented style. Transformations are natural and idiomatic, and optional values behave much like collections with zero or one element.
val upperName =
findUser(1)
.map(_.toUpperCase)
.filter(_.startsWith("A"))
In this example, absence propagates automatically. If findUser returns None, the entire expression evaluates to None without any additional checks.
Java’s Optional supports similar operations, but the style is more constrained and often more verbose.
Optional<String> upperName =
findUser(1)
.map(String::toUpperCase)
.filter(name -> name.startsWith("A"));
Although the semantics are similar, Java’s syntax and type system make these chains feel more deliberate and less fluid, reinforcing the idea that Optional is a special-purpose construct rather than a universal modeling tool.
Extracting Values: Intentional Friction vs Idiomatic Resolution
Scala encourages developers to resolve optional values through pattern matching or total functions such as getOrElse.
val name = findUser(2) match {
case Some(value) => value
case None => "Unknown"
}
A concise fallback can also be expressed directly:
val name = findUser(2).getOrElse("Unknown")
In Java, extracting a value is intentionally more guarded. While get() exists, its use is discouraged in favor of safer alternatives.
String name = findUser(2).orElse("Unknown");
The difference is cultural rather than technical. In Scala, resolving an Option is a normal part of control flow. In Java, consuming an Optional is treated as an exceptional act that should be handled carefully and explicitly.
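The “safer alternatives” mentioned above can be sketched with two other standard Optional methods; the exception message and print statements here are only illustrative:
String required = findUser(2)
    .orElseThrow(() -> new IllegalStateException("user 2 must exist"));

findUser(2).ifPresentOrElse(
    name -> System.out.println("Found " + name),
    () -> System.out.println("Unknown user"));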
Optional Values in Composition
Scala excels at composing multiple optional computations using flatMap or for-comprehensions.
for {
user <- findUser(1)
email <- findEmail(user)
} yield email
This code expresses dependent computations declaratively. If any step yields None, the entire expression evaluates to None.
In Java, the same logic requires more explicit wiring.
Optional<String> email =
findUser(1).flatMap(user -> findEmail(user));
While functional, the Java version becomes less readable as the number of dependent steps increases.
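The contrast sharpens when a later step needs values produced by earlier ones: in Java each dependency becomes a nested lambda, whereas the Scala for-comprehension above stays flat. A small sketch reusing findUser and findEmail:
Optional<String> contact =
    findUser(1).flatMap(user ->
        findEmail(user).map(email -> user + " <" + email + ">"));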
Usage as Fields and Parameters
Scala allows Option to be used freely as a field or parameter type, which is common and idiomatic.
case class User(name: String, email: Option[String])
Java, by contrast, discourages the use of Optional in fields or parameters, even though it is technically possible.
// Generally discouraged
class User {
Optional<String> email;
}
This contrast highlights Scala’s confidence in Option as a foundational abstraction, while Java treats Optional as a boundary marker in API design.
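A common Java compromise, consistent with this boundary-marker view, is to keep the field itself nullable and expose optionality only at the API surface; a minimal sketch:
class User {
    private final String name;
    private final String email; // may be null internally

    User(String name, String email) {
        this.name = name;
        this.email = email;
    }

    Optional<String> getEmail() {
        return Optional.ofNullable(email); // Optional appears only as a return type
    }
}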
Philosophical Implications
The contrast between Option and Optional mirrors the broader philosophies of Scala and Java. Scala embraces expressive power and abstraction to manage complexity. Java favors incremental evolution and clarity, even when that limits expressiveness.
Both approaches are valid, and both significantly reduce errors when used appropriately.
Conclusion
Java’s Optional and Scala’s Option address the same fundamental problem, yet they do so in ways that reflect the deeper identity of their ecosystems. Scala’s Option is a first-class participant in program structure, encouraging composition and declarative reasoning. Java’s Optional is a carefully scoped enhancement, designed to improve API safety without redefining the language.
What appears to be a minor syntactic distinction is, in reality, a clear illustration of two distinct approaches to software design on the JVM.
[KotlinConf2024] Compose Multiplatform Evolves on iOS and Beyond
At KotlinConf2024, Sebastian Aigner, a JetBrains developer advocate, unveiled advancements in Compose Multiplatform, now in beta for iOS and alpha for web. Extending beyond business logic, Compose enables shared UI across platforms, integrating native capabilities. Sebastian showcased new common APIs—previews, resources, lifecycle, navigation, and UI testing—alongside iOS-specific enhancements like accessibility and scroll physics. Through live demos, he demonstrated how these features simplify cross-platform development, inviting developers to shape Compose’s future with feedback.
A Year of Progress for Compose Multiplatform
Since its debut at KotlinConf2023, Compose Multiplatform has matured significantly. Sebastian highlighted its role in sharing UI code, complementing Kotlin Multiplatform’s business logic sharing. On Android, it leverages Jetpack Compose; on desktop, it powers JetBrains Toolbox; and on iOS, it reached beta status at KotlinConf2024. The web target hit alpha, broadening its reach. Progress spans accessibility, navigation, text input, and scroll physics, with most features now stable or experimental, ready for developers to adopt and refine through real-world use.
iOS-Specific Enhancements
Compose Multiplatform on iOS now feels native, thanks to revamped scroll physics mirroring iOS’s overscroll and spring effects. Sebastian demonstrated accessibility improvements, with components supporting VoiceOver and gesture navigation out of the box, provided content descriptions are added. Interop with SwiftUI allows popups to span the screen, and window insets APIs handle notches and dynamic islands, ensuring full-screen rendering. These enhancements make iOS apps built with Compose visually and functionally indistinguishable from native counterparts, enhancing user experience.
Common Resources for Seamless UI
The new common resources API simplifies asset management. Sebastian showed how to add drawables and strings in a composeResources directory, accessed via a type-safe res object. In a demo, he added a banner image and a localized conference description, with Fleet auto-generating accessors. Support for multimodule resources and translations (e.g., German dark mode) ensures flexibility. This API, familiar from Android, reduces boilerplate, letting developers focus on crafting polished, platform-agnostic UIs with minimal effort.
Lifecycle and View Models in Common Code
Compose Multiplatform now supports common lifecycle and view model APIs, enabling robust app architecture. Sebastian demonstrated a lifecycle logger tracking states like onCreate and onPause, with collectAsStateWithLifecycle ensuring efficient flow collection. In a view model demo, he outsourced mood-tracking logic, using a factory function to instantiate it. Integration with Koin for dependency injection and lifecycle-aware state collection streamlines development, making MVVM patterns viable across platforms without platform-specific workarounds.
Navigation for Cross-Platform Apps
Navigation, a cornerstone of multiplatform apps, is now available via a Jetpack Navigation-inspired API. Sebastian built a demo app with a fruit list and detail pages, using a NavHost and NavController for stack-based navigation. Features like window insets padding, animated transitions, and rememberSaveable for state persistence ensure a native feel. Type-safe routing with Kotlinx.serialization is in development, reducing errors. This API, while optional, simplifies porting Android navigation logic to iOS and beyond, enhancing developer productivity.
UI Testing and Community Feedback
A new common UI testing API allows writing tests once for all platforms. Sebastian showed a test verifying a composable’s text content, executed across targets. This reduces testing overhead, ensuring consistent behavior. He urged developers to try these features, citing the Compose Multiplatform portal (jb.compose) for documentation. Feedback via the Kotlin Slack and issue tracker is vital, as community input drives stabilization. With support for features like strong skipping mode and shared element transitions, Compose continues to evolve dynamically.
Links:
[DevoxxGR2025] Optimized Kubernetes Scaling with Karpenter
Alex König, an AWS expert, delivered a 39-minute talk at Devoxx Greece 2025, exploring how Karpenter enhances Kubernetes cluster autoscaling for speed, cost-efficiency, and availability.
Karpenter’s Dynamic Autoscaling
König introduced Karpenter as an open-source, Kubernetes-native autoscaling solution, contrasting it with the traditional Cluster Autoscaler. Unlike the latter, which relies on uniform node groups (e.g., nodes with four CPUs and 16GB RAM), Karpenter uses the EC2 Fleet API to dynamically provision nodes tailored to workload needs. For instance, if a pod requires one CPU, Karpenter allocates a node with minimal excess capacity, avoiding resource waste. This right-sizing, combined with groupless scaling, enables faster and more cost-effective scaling, especially in dynamic environments.
Ensuring Availability with Constraints
König addressed availability challenges reported by users, emphasizing Kubernetes-native scheduling constraints to mitigate disruptions. Topology spread constraints distribute pods across availability zones, reducing the risk of downtime if a node fails. Pod disruption budgets, affinity/anti-affinity rules, and priority classes further ensure critical workloads are scheduled appropriately. For stateful workloads using EBS, König recommended setting the volume binding mode to “wait for first consumer” to avoid pod-volume mismatches across zones, preventing crashes and ensuring reliability.
Integrating with KEDA for Application Scaling
For advanced scaling, König highlighted combining Karpenter with KEDA for event-driven, application-specific scaling. KEDA scales pods based on metrics like Kafka topic sizes or SQS queues, beyond CPU/memory. Karpenter then provisions nodes for pending pods, enabling seamless scaling for workloads like flash sales. König outlined a four-step migration from Cluster Autoscaler to Karpenter, emphasizing its simplicity and open-source documentation.
Links:
[GoogleIO2025] What’s new in Jetpack Compose
Keynote Speaker
Jolanda Verhoef serves as a Developer Relations Engineer at Google, specializing in Android development with a focus on Jetpack Compose and user interface tooling. Based in Utrecht, she advocates for modern UI practices, drawing from her education at the University of Utrecht to educate developers on building efficient, adaptive applications.
Abstract
This scholarly exploration delves into the recent enhancements within Jetpack Compose, Google’s declarative UI framework for Android, emphasizing features that bolster developer efficiency, runtime optimization, and library extensibility. It scrutinizes novel APIs for autofill, text scaling, and visibility monitoring, alongside performance upgrades and stability refinements, elucidating their design rationales, integration techniques, and potential influences on application architecture. Through code illustrations and case analyses, the narrative reveals how these advancements facilitate the creation of resilient, cross-platform interfaces, fostering accelerated development cycles in contemporary mobile ecosystems.
Innovations in Features and Usability
Jolanda Verhoef opens by reflecting on Jetpack Compose’s trajectory since its inception as an experimental toolkit in 2019, evolving into the premier recommendation for Android UI construction. She asserts that its adoption, now encompassing 60% of top-tier applications, stems from its capacity to expedite development while yielding aesthetically pleasing, responsive interfaces. This growth contextualizes within Android’s maturation, where Compose addresses the demand for tools that prioritize user-centric innovations over legacy constraints.
A cornerstone update is autofill integration, enabling seamless population of form fields with pre-stored user data. Verhoef explains that implementation necessitated a comprehensive overhaul of Compose’s semantics infrastructure to align with system-level autofill services. In practice, developers apply a semantics modifier to text fields, specifying roles such as username or password via a content type property. This methodology not only enhances accessibility but also streamlines user interactions, reducing friction in authentication flows.
Code sample for basic autofill:
TextField(
    value = username,
    onValueChange = { username = it },
    modifier = Modifier.semantics {
        contentType = ContentType.Username
    }
)
For alpha releases, a dedicated contentType modifier simplifies this to a single line, illustrating Compose’s commitment to concise, expressive APIs. Implications include improved retention through effortless onboarding, though developers must consider privacy implications in data handling.
Autosizing text emerges as another usability boon, automatically adjusting font dimensions to fit containers. By appending an autoSize parameter to Text composables, with configurable minima, maxima, and step granularities, layouts dynamically adapt without manual interventions. This innovation mitigates overflow issues in variable screen environments, such as foldables, promoting inclusivity across device spectra.
The animateBounds modifier facilitates concurrent animation of size and position within lookahead scopes, optimizing for fluid transitions in adaptive UIs. Verhoef highlights its utility in scenarios like content resizing during orientation shifts, where traditional animations might falter.
Visibility tracking receives low-level support via onLayoutRectChanged, a performant callback for monitoring composable positions relative to roots, windows, or screens. Superior to onGloballyPositioned due to inherent throttling and debouncing, it suits high-frequency tasks like scroll-based analytics. Alpha extensions, including onVisibilityChanged for viewport entry/exit detection and onVisibilityFractionChanged for partial exposure ratios, elevate this to higher abstractions. These enable sophisticated features like auto-pausing videos or lazy loading, with implications for battery efficiency and data conservation in media-heavy apps.
Methodologically, these features leverage Compose’s recomposition model, where UI reacts to state changes without imperative redraws. Contextually, they respond to developer feedback for streamlined tooling, implying broader adoption by reducing barriers to advanced functionalities.
Enhancements in Performance, Stability, and Ecosystem Libraries
Verhoef transitions to performance optimizations, underscoring Compose’s maturation through rigorous benchmarking. Compiler advancements, including stable skipping for non-recomposable lambdas, halve recomposition times in benchmarks, enhancing responsiveness in complex hierarchies. UI toolkit refinements, such as deferred subcomposition and optimized modifier chains, yield 20-30% frame rate gains in scrolling lists.
Stability efforts involve a 32% reduction in experimental APIs via deprecations and stabilizations, with core modules like Foundation and UI achieving 66% cuts. This bolsters confidence in production deployments, mitigating migration risks.
Ecosystem expansion integrates Compose with broader Jetpack suites. Navigation 3, an alpha artifact, reimagines routing with Compose idioms, offering layered architectures for adaptive, customizable flows across form factors, including XR. It supports transitions, predictive back navigation, and Material Design, with full backstack control for bespoke needs.
Media and camera libraries receive Compose-native building blocks via Media3 and CameraX, eschewing view wrappers for granular control. In Androidify, a tutorial video employs PlayerSurface for rendering and custom play-pause states, demonstrating modular composition.
Code sample for media playback:
VideoPlayer(player = player) {
    PlayerSurface(player = player)
    MyPlayPauseButton(player = player)
}
These libraries empower tailored experiences, implying versatile media integrations without vendor lock-in.
Overall, these enhancements contextualize within Android’s push for unified, efficient development. Implications span accelerated prototyping, reduced maintenance, and enriched user engagements, positioning Compose as indispensable for future-proof Android endeavors.
Links:
The Circuit Breaker Pattern: Engineering Resilience in Distributed Systems
Modern software systems are rarely monolithic. They are composed of services that communicate over networks, depend on external APIs, and share infrastructure that may degrade or fail unpredictably. In such environments, failures are not exceptional events; they are an expected operational reality. The Circuit Breaker pattern addresses this reality by introducing a controlled mechanism for detecting failures, isolating unstable components, and preventing cascading breakdowns across the system.
Originally popularized by Michael Nygard in Release It!, the Circuit Breaker pattern has since become a cornerstone of resilient system design, particularly in microservice architectures.
The Problem: Failure Amplification in Distributed Systems
In a tightly coupled synchronous system, a slow or failing dependency can propagate failure far beyond its original scope. When a downstream service becomes unavailable, upstream services may continue to send requests, each consuming threads, memory, and connection pools while waiting for timeouts. Under sufficient load, this behavior leads to resource exhaustion, degraded latency, and eventually system-wide failure.
The critical observation behind the Circuit Breaker pattern is that continuing to call a failing service is often worse than failing fast. The goal is therefore not to eliminate failure, but to contain it.
Core Concept of the Circuit Breaker
The Circuit Breaker pattern borrows its metaphor directly from electrical engineering. Just as an electrical circuit breaker interrupts current to prevent damage, a software circuit breaker interrupts calls to a failing dependency to protect the rest of the system.
At its core, a circuit breaker monitors the outcome of calls to a remote operation and transitions between well-defined states based on observed behavior.
Conceptual State Model
[CLOSED] ---- failures exceed threshold ----> [OPEN]
    ^                                            |
    |                                 wait period elapses
    |                                            v
    +------ successful trial calls ------ [HALF-OPEN]
When the circuit is closed, requests flow normally and failures are counted. Once failures exceed a configured threshold, the circuit transitions to open, immediately rejecting calls without attempting remote execution. After a defined wait period, the breaker enters half-open, allowing a limited number of trial requests. Based on their outcome, the circuit either closes again or reopens.
Design Goals and Architectural Implications
The Circuit Breaker pattern serves several architectural objectives simultaneously. It reduces load on failing services, protects shared resources such as thread pools, improves overall system responsiveness by failing fast, and provides clear operational signals about the health of dependencies.
Equally important, it makes failure explicit and observable, allowing architects to reason about degraded modes rather than assuming perfect availability.
Circuit Breaker in Java with Resilience4j
Resilience4j is a lightweight, modular fault-tolerance library designed for Java 8 and later. It avoids heavyweight runtime dependencies and integrates cleanly with functional programming constructs.
Configuration and Behavior
CircuitBreakerConfig config = CircuitBreakerConfig.custom()
    .failureRateThreshold(50)
    .slidingWindowSize(10)
    .waitDurationInOpenState(Duration.ofSeconds(30))
    .permittedNumberOfCallsInHalfOpenState(3)
    .build();
CircuitBreaker circuitBreaker =
    CircuitBreaker.of("paymentService", config);
This configuration expresses architectural intent clearly: tolerate occasional failure, but react decisively when instability persists.
Protecting a Remote Call
Supplier<String> decoratedCall =
    CircuitBreaker.decorateSupplier(
        circuitBreaker,
        () -> paymentClient.process());
Try<String> result = Try.ofSupplier(decoratedCall)
    .recover(ex -> "fallback response");
When the circuit is open, the call is rejected immediately and fallback logic is triggered without consuming remote resources.
Observability and Metrics
Resilience4j exposes events and metrics for all state transitions, enabling seamless integration with monitoring and alerting systems.
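For example, state transitions and rejected calls can be observed by subscribing to the breaker’s event publisher; in this sketch the log statements stand in for whatever monitoring integration is actually used:
CircuitBreaker.EventPublisher events = circuitBreaker.getEventPublisher();

events.onStateTransition(event ->
    log.warn("paymentService circuit: {}", event.getStateTransition()));
events.onCallNotPermitted(event ->
    log.debug("call rejected while the circuit is open"));
events.onError(event ->
    log.warn("call failed after {} ms", event.getElapsedDuration().toMillis()));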
Circuit Breaker in Scala with Akka
In the Scala ecosystem, the most commonly used circuit breaker implementation is provided by Akka. It is designed for asynchronous, non-blocking execution and aligns naturally with Scala’s functional concurrency model.
Defining a Circuit Breaker
import akka.pattern.CircuitBreaker
import scala.concurrent.duration._
val breaker = new CircuitBreaker(
  scheduler = system.scheduler,
  maxFailures = 5,
  callTimeout = 2.seconds,
  resetTimeout = 30.seconds
)
Guarding an Asynchronous Operation
val protectedCall =
breaker.withCircuitBreaker {
paymentClient.process()
}
If the circuit is open, the future fails immediately. If half-open, execution is conditionally allowed.
protectedCall.recover {
case _: CircuitBreakerOpenException =>
"fallback response"
}
This approach integrates naturally with Scala’s standard error-handling and composition patterns.
Comparing the Java and Scala Approaches
While Resilience4j and Akka implement the same pattern, their APIs reflect different language philosophies. Resilience4j emphasizes functional decoration and explicit configuration, whereas Akka embeds circuit breaking deeply into asynchronous workflows.
Despite these differences, both approaches deliver the same guarantees: controlled failure detection, fast rejection, and measured recovery.
Circuit Breakers and System Design
A circuit breaker is not a substitute for retries, timeouts, or bulkheads. Instead, it coordinates these mechanisms by enforcing system-wide discipline when dependencies fail.
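With Resilience4j, for instance, the mechanisms compose as decorators around the same call, so every retry attempt still passes through the breaker; a sketch building on the configuration shown earlier:
Retry retry = Retry.of("paymentService",
    RetryConfig.custom().maxAttempts(3).build());

// The circuit breaker wraps the raw call; the retry wraps the already-protected call.
Supplier<String> protectedCall =
    Retry.decorateSupplier(retry,
        CircuitBreaker.decorateSupplier(circuitBreaker, () -> paymentClient.process()));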
From an architectural standpoint, circuit breakers encourage designers to plan explicitly for degraded modes and partial availability, rather than assuming ideal conditions.
Conclusion
The Circuit Breaker pattern is a pragmatic response to the inherent unreliability of distributed systems. By formalizing failure detection and response, it transforms unpredictable outages into managed and observable events.
Whether implemented with Resilience4j in Java or Akka in Scala, the circuit breaker remains a foundational pattern for building systems that remain stable, transparent, and trustworthy under stress.
Sequence overview: the Client sends a request to Service A, which executes the call through the Circuit Breaker. If the circuit is CLOSED or HALF-OPEN, the breaker forwards the call to Service B and relays the response or error; Service A then returns a response or fallback to the Client, and the breaker updates its circuit state.
[DotAI2024] Johannes Dienst – Charting the Course to Intention-Led Orchestration
Johannes Dienst, Developer Advocate at AskUI, a company focused on vision-based UI automation, sketched a blueprint for intent-led agents at DotAI 2024. Drawing on Iron Man’s iconic scene of Tony Stark summoning JARVIS, he demystified today’s digital assistants: agents built with contemporary tooling, where a stated intent is turned into concrete actions on screen. His running example made the idea tangible: a single prompt asking for a poem about the plight of paperwork, carried out end to end by an agent operating the user interface.
Envisioning Agents: From Cinematic Inspiration to Vision-Driven Automation
Dienst opened with the 2008 film scene in which JARVIS serves as Stark’s lab assistant, conversing and taking commands. Fast-forward to today’s early forerunners: an intent such as “craft a verse on bureaucratic burdens” cascades through a chain of steps, opening Chrome, navigating to Google Docs, and drafting the document, all without a hand on the keyboard.
The technology rests first on grounding: the raw UI is annotated so that bounding boxes mark buttons and labels identify on-screen elements, letting a general-purpose model issue instructions like “engage element 58” instead of relying on trial and error. He also pointed to self-operating agent projects that parse a prompt and prune the space of possible actions.
The second pillar is control: the agent acts like a human, moving the mouse and pressing keys, so no bespoke integrations are required and the approach generalizes across applications. Dienst’s architecture diagram showed the model emitting structured JSON commands, such as pressing Enter, which an executor turns into real input events through tools like PyAutoGUI.
Forging Forward: Embedding Agents in Libraries and Services
Dienst then went deeper: agents can ship as libraries or run as services, acting at the level of the operating system, which brings the JARVIS vision within reach. An issued intent is expressed as a structured schema that scripts a sequence of steps, pausing where needed until the request, the poem in his demo, is fulfilled.
The project is available as open source on GitHub as an intent pilot that developers can harness, hone, and build upon. Dienst’s closing call to action: start from an intent, move to implementation, and craft your own agents.
Links:
[DotJs2025] Durable Executions for Mortals
Managing state and asynchrony on the backend has long pushed developers into queues and brittle orchestration, yet the reactive mindset popularized by React suggests a better model. Charly Poly, developer marketer at Inngest, advocated durable executions at dotJS 2025, carrying frontend fluency over into fault-tolerant backend flows. A frontend developer attuned to the quirks of async code, Charly framed workflows in terms web developers already know: events, state, and composition, without the systems plumbing.
Charly began with React’s contribution: beyond templating, it tames time by unifying incoming events, application data, and UI updates. The backend faces a parallel triad: receiving requests, persisting state, and orchestrating long-running work. Inngest’s insight is to make functions durable by design: workflows are broken into steps whose state is persisted, so failures can be retried without losing progress. In TypeScript, inngest.createFunction({steps: ['ship', 'email']}) declares the steps, and waits hook into webhooks, for example watching for a shipment event with a seven-day timeout.
The model also absorbs operational concerns such as throttling and rate limiting, which matter when web traffic spikes. Charly contrasted Temporal’s heavier setup with Inngest’s more approachable, event-first, JavaScript-friendly take, and noted the affinity with AI workloads: RAG pipelines and agent loops map naturally onto durable, step-based workflows.
The dividend of durability is deliverance for developers: frontend instincts can now fortify backends, without wrestling with queues.
React’s Reactive Roots
Charly surveyed React’s remit, handling events, data fetching, and state behind a declarative templating model, and set it against the backend’s burden of incoming requests, persistence, and orchestration.
Inngest’s Durable Flows
Durable functions are sequences of steps with explicit waits, for instance pausing on a webhook until a shipment arrives. The TypeScript API also covers throttling, and the same building blocks extend to AI workflows such as RAG relays and agent loops.
Links:
[SpringIO2025] Real-World AI Patterns with Spring AI and Vaadin by Marcus Hellberg / Thomas Vitale
Lecturer
Marcus Hellberg is the Vice President of AI Research at Vaadin, a company specializing in tools for Java developers to build web applications. As a Java Champion with nearly 20 years of experience in Java and web development, he focuses on integrating AI capabilities into Java ecosystems. Thomas Vitale is a software engineer at Systematic, a Danish software company, with expertise in cloud-native solutions, Java, and AI. He is the author of “Cloud Native Spring in Action” and an upcoming book on developer experience on Kubernetes, and serves as a CNCF Ambassador.
- Marcus Hellberg on LinkedIn
- Marcus Hellberg on GitHub
- Thomas Vitale on LinkedIn
- Thomas Vitale on GitHub
Abstract
This article examines practical patterns for incorporating artificial intelligence into Java applications using Spring AI and Vaadin, transitioning from experimental to production-ready implementations. It analyzes techniques for memory management, guardrails, multimodality, retrieval-augmented generation, tool calling, and agents, with implications for security, user experience, and system integration. Insights emphasize robust, observable AI workflows in on-premises or cloud environments.
Memory Management and Streaming in AI Interactions
Integrating large language models (LLMs) into applications requires addressing their stateless nature, where each interaction lacks inherent context from prior exchanges. Spring AI provides advisors—interceptor-like mechanisms—to augment prompts with conversation history, enabling short-term memory. For instance, a MessageChatMemoryAdvisor retains the last N messages, ensuring continuity without manual tracking.
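A minimal sketch of wiring such an advisor when building the client; the class names follow Spring AI’s milestone releases and may vary slightly between versions:
ChatMemory chatMemory = new InMemoryChatMemory();

ChatClient chatClient = builder
        .defaultAdvisors(new MessageChatMemoryAdvisor(chatMemory))
        .build();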
This pattern enhances user interactions in chat-based interfaces, built here with Vaadin’s component model for server-side Java UIs. A vertical layout hosts message lists and inputs, injecting a ChatClientBuilder to construct clients with advisors. Basic interactions involve prompting the model and appending responses, but for realism, streaming via reactive fluxes improves responsiveness, subscribing to token streams and updating UI progressively.
Code illustration:
ChatClient chatClient = builder.build();

messageInput.addSubmitListener(submitEvent -> {
    String message = submitEvent.getMessage();
    messageList.addMessage("You", message);
    MessageItem replyItem = messageList.addMessage("Assistant", "");
    chatClient.stream(new Prompt(message))
        .subscribe(response ->
            replyItem.append(response.getResult().getOutput().getContent()));
});
Streaming suits verbose responses, reducing perceived latency, while observability integrations (e.g., OpenTelemetry) trace interactions for debugging nondeterministic behaviors.
Guardrails for Security and Validation
AI workflows must mitigate risks like sensitive data leaks or invalid outputs. Input guardrails intercept prompts, using on-premises models to check for compliance with policies, blocking unauthorized queries (e.g., personal information). Output guardrails validate responses, reprompting for corrections if deserialization fails.
Advisors enable this: a default advisor with a local chat model filters inputs/outputs. For example, querying an address might be blocked if flagged, preventing cloud exposure. This ensures determinism in structured outputs, converting unstructured text to Java objects via JSON instructions.
Implications include privacy preservation in regulated sectors and integration with Spring Security for role-based tool access.
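A sketch of the input-guardrail idea, expressed in the same style as the earlier snippet; guardrailClient is a hypothetical second ChatClient backed by an on-premises model, and the YES/NO policy prompt is purely illustrative:
// Screen the prompt locally before it ever reaches the cloud model.
String verdict = guardrailClient
        .call(new Prompt("Does this request ask for personal or sensitive data? Answer YES or NO.\n\n" + message))
        .getResult().getOutput().getContent();

if (verdict.trim().toUpperCase().startsWith("YES")) {
    messageList.addMessage("Assistant", "This request was blocked by the input guardrail.");
    return;
}
// Otherwise the message is forwarded to the cloud-backed chat client as usual.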
Multimodality and Retrieval-Augmented Generation
LLMs extend beyond text through multimodality, processing images, audio, or videos. Spring AI’s entity methods augment prompts for structured extraction, e.g., parsing attendee details from images into tables for programmatic use.
Retrieval-augmented generation (RAG) combats hallucinations by embedding external data as vectors in stores like PostgreSQL. A RetrievalAugmentationAdvisor retrieves relevant documents via similarity search, augmenting prompts. Customizations allow empty contexts for fallback to model knowledge.
Example:
VectorStore vectorStore = pgVectorStore; // a PostgreSQL (pgvector) backed VectorStore bean
DocumentRetriever retriever = VectorStoreDocumentRetriever.builder()
        .vectorStore(vectorStore)
        .build();
RetrievalAugmentationAdvisor advisor = RetrievalAugmentationAdvisor.builder()
        .documentRetriever(retriever)
        .queryAugmenter(ContextualQueryAugmenter.builder().allowEmptyContext(true).build())
        .build();
This pattern grounds responses in proprietary data, with thresholds controlling retrieval scope.
Tool Calling, Agents, and Dynamic Integrations
Tool calling empowers LLMs as agents, invoking external functions for tasks like database queries. Annotations describe tools, passed to clients for dynamic selection. For products, a service might expose query/update methods:
@Tool(description = "Fetch products from database")
public List<Product> getProducts(
        @ToolParam(description = "Category filter") String category) {
    // Database query goes here, e.g. a repository lookup filtered by category
    return List.of();
}
Agents orchestrate tools, potentially via Model Context Protocol for external services. Demonstrations include theme generation from screenshots, editing CSS via file system tools, highlighting nondeterminism and the need for safeguards.
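A hedged sketch of registering such a tool-holding service with the chat client so the model can invoke it on demand; ProductService is assumed to be the class containing the annotated method above, and the exact registration method may differ between Spring AI versions:
String answer = chatClient.prompt()
        .user("Which products do we have in the garden category?")
        .tools(new ProductService())
        .call()
        .content();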
In conclusion, these patterns enable production AI, emphasizing modularity, security, and observability for robust Java applications.
Links:
[DevoxxBE2025] Live Coding The Hive: Building a Microservices-Ready Modular Monolith
Lecturer
Thomas Pierrain is Vice President of Engineering at Agicap, a financial management platform, where he applies domain-driven design to build scalable systems. Julien Topcu is Vice President of Technology at SHODO Group, a consultancy focused on socio-technical coaching and architecture, with expertise in helping teams implement domain-driven practices.
Abstract
This analysis investigates the Hive pattern, an architectural approach for creating modular monoliths that support easy evolution to microservices. It identifies key ideas like vertical slicing and port-adapter boundaries, set against the backdrop of microservices pitfalls. Highlighting a live-refactored time-travel system, it details methods for domain alignment, encapsulation, and simulated distributed communication. Consequences for system flexibility, debt management, and scalability are evaluated, providing insights into resilient designs for existing and new developments.
Emergence from Microservices Challenges
Over a decade, the shift to microservices has often resulted in distributed messes, worse than the monoliths they replaced due to added complexity in coordination and deployment. The modular monolith concept arises as a remedy, but risks tight coupling if not properly segmented. The Hive addresses this by separating design from deployment, following “construct once, deploy flexibly.”
In the live example, a time-machine’s control system—handling energy, navigation, and diagnostics—crashes due to fragility, landing in the 1980s. Diagnostics reveal a muddled structure with high resource use, mirroring legacy systems burdened by modeling debt—the buildup of imprecise domain models hindering change.
The pattern’s innovation lies in fractal composability: modules as hexagons can nest or extract as services. This enables scaling in (sub-modules) or out (microservices), adapting to needs like independent deployment for high-load components.
Essential Tenets of the Hive
Vertical slicing packs modules with all necessities—logic, storage, interfaces—for self-sufficiency, avoiding shared layers’ dependencies. In the demo, the energy module includes its database, isolating it from navigation.
Port-adapter encapsulation defines interaction points: inbound for incoming, outbound for outgoing. Adapters translate, eliminating direct links. The navigation’s energy request port uses an adapter to call the energy’s provision port, preventing tangles.
Inter-module talks mimic microservices sans networks, using in-process events. This readies for distribution: swapping adapters for remote calls extracts modules seamlessly. The example routes via a bus, allowing monolith operation with distributed readiness.
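An illustrative sketch in Java of these two tenets, using hypothetical names from the demo domain: the navigation module owns an outbound port, and an in-process adapter fulfils it by calling the energy module’s inbound port; swapping this adapter for an HTTP or messaging client is what later turns the module into a microservice:
// Outbound port owned by the navigation module.
interface EnergySupply {
    double requestEnergy(double megawatts);
}

// Inbound port exposed by the energy module.
interface EnergyProvisioning {
    double provide(double megawatts);
}

// In-process adapter: wires the two hexagons together inside the monolith.
final class InProcessEnergyAdapter implements EnergySupply {
    private final EnergyProvisioning energyModule;

    InProcessEnergyAdapter(EnergyProvisioning energyModule) {
        this.energyModule = energyModule;
    }

    @Override
    public double requestEnergy(double megawatts) {
        // A remote adapter would issue an HTTP call or publish an event here instead.
        return energyModule.provide(megawatts);
    }
}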
These tenets create a supple framework, resilient to evolution. The fractal aspect allows infinite composition, as shown by nesting diagnostics within navigation.
Refactoring Methodology and Practical Steps
The session starts with a monolithic system showing instability: overused resources cause anomalies. AI schemas expose entanglements, guiding domain identification—energy, time circuits, AI.
Modules reorganize: each hexagon sliced vertically with dedicated storage. Code moves via IDE tools, databases split to prevent sharing. Energy gains PostgreSQL, queried through adapters.
Communication restructures: ports define contracts, adapters implement. Navigation’s outbound energy port adapts to energy’s inbound, using events for asynchrony.
Extraction demonstrates: energy becomes a microservice by changing adapters to network-based, deploying separately without core changes. Tests modularize similarly, using mocks for isolation.
This step-by-step approach handles brownfields incrementally, using tools for safe restructuring.
Resilience, Scalability, and Debt Mitigation
Hive’s boundaries enhance resilience: changes localize, as energy tweaks affect only its hexagon. This curbs debt, allowing independent domain refinement.
Scalability is fractal: inward nesting subdivides, outward extraction distributes. Networkless talks ease transitions, minimizing rewrites.
Versus monoliths’ coupling or microservices’ prematurity, Hive balances, domain-focused for “right-sized” architectures. Challenges: upfront refactoring, boundary discipline.
Development Ramifications and Adoption
Hive promotes adaptive designs for changing businesses. Starting modular prevents debt in new projects; modernizes legacies via paths shown.
Wider effects: better sustainment, lower costs through contained modules. As hype fades, Hive provides hybrids, emphasizing appropriate sizing.
Future: broader use in frameworks, tools for pattern enforcement.
In overview, Hive exemplifies composable resilience, merging monolith unity with microservices adaptability.
Links:
- Lecture video: https://www.youtube.com/watch?v=VKcRNtj0tzc
- Thomas Pierrain on LinkedIn: https://fr.linkedin.com/in/thomas-p-0664769
- Thomas Pierrain on Twitter/X: https://twitter.com/tpierrain
- Julien Topcu on LinkedIn: https://fr.linkedin.com/in/julien-top%25C3%25A7u
- Agicap website: https://agicap.com/
- SHODO Group website: https://shodo.io/