Posts Tagged ‘AOT’
[DevoxxBE2024] Project Leyden: Improving Java’s Startup Time by Per Minborg, Sébastien Deleuze
Per Minborg and Sébastien Deleuze delivered an insightful joint presentation at Devoxx Belgium 2024, unveiling the transformative potential of Project Leyden to enhance Java application startup time, warmup, and footprint. Per, from Oracle’s Java Core Library team, and Sébastien, a Spring Framework core committer at Broadcom, explored how Leyden shifts computation across time to optimize performance. Despite minor demo hiccups, such as Wi-Fi-related delays, their talk combined technical depth with practical demonstrations, showcasing how Spring Boot 3.3 leverages Leyden’s advancements, cutting startup times significantly and paving the way for future Java optimizations.
Understanding Project Leyden’s Mission
Project Leyden, an open-source initiative under OpenJDK, aims to address long-standing Java performance challenges: startup time, warmup time, and memory footprint. Per explained startup as the duration from launching a program to its first useful operation, like displaying “Hello World” or serving a Spring app’s initial request. Warmup, conversely, is the time to reach peak performance via JIT compilation. Leyden’s approach involves shifting computations earlier (e.g., at build time) or later (e.g., via lazy initialization) while preserving Java’s dynamic nature. Unlike GraalVM Native Image or Project CRaC, which sacrifice dynamism for speed, Leyden maintains compatibility, allowing developers to balance performance and flexibility.
Class Data Sharing (CDS) and AOT Cache: Today’s Solutions
Per introduced Class Data Sharing (CDS), a feature available since JDK 5, and its evolution into the Ahead-of-Time (AOT) Cache, a cornerstone of Leyden’s strategy. CDS preloads JDK classes, while AppCDS, introduced in JDK 10, extends this to application classes. The AOT Cache, an upcoming enhancement, stores class objects, resolved linkages, and method profiles, enabling near-instant startup. Sébastien demonstrated this with a Spring Boot Pet Clinic application, reducing startup from 3.2 seconds to 800 milliseconds using CDS and AOT Cache. The process involves a training run to generate the cache, which is then reused for faster deployments, though it requires consistent JVM and classpath configurations.
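The training-run workflow described above can be sketched with the standard AppCDS flags; the Leyden early-access builds layer the AOT cache on top via a single flag. This is a hedged sketch: `app.jar` and the archive file names are placeholders, and `-XX:CacheDataStore` is specific to the Leyden early-access builds, not mainline JDK releases.

```shell
# Training run: record which classes the application loads and dump
# them into a CDS archive when the JVM exits.
java -XX:ArchiveClassesAtExit=app.jsa -jar app.jar

# Production run: map the archive instead of re-loading and re-verifying
# classes, shortening startup. JVM version and classpath must match
# the training run.
java -XX:SharedArchiveFile=app.jsa -jar app.jar

# Leyden early-access builds extend this with an AOT cache that also
# stores resolved linkages and method profiles (EA-only flag):
java -XX:CacheDataStore=app.cds -jar app.jar
```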
Spring Boot’s Synergy with Leyden
Sébastien highlighted the collaboration between the Spring and Leyden teams, initiated after a 2023 JVM Language Summit case study. Spring Boot 3.3 introduces features to simplify CDS and AOT Cache usage, such as extracting executable JARs into a CDS-friendly layout. A demo showed how a single command extracts the JAR, runs a training phase, and generates a cache, which is then embedded in a container image. This reduced startup times by up to 4x and memory usage by 20% when combined with Spring’s AOT optimizations. Sébastien also demonstrated how AOT Cache retains JIT “warmness,” enabling near-peak performance from startup, though a minor performance plateau gap is being addressed.
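The extraction-and-training flow Sébastien demonstrated can be sketched with Spring Boot 3.3's tools jar mode and the AppCDS flags; `my-app.jar` is a placeholder, and the exact property names follow the Spring Boot documentation for efficient deployments.

```shell
# 1. Extract the executable JAR into a CDS-friendly layout
#    (an exploded directory with the application and its libraries).
java -Djarmode=tools -jar my-app.jar extract

# 2. Training run: start the application context, exit as soon as it
#    is refreshed, and dump the loaded classes into an archive.
java -Dspring.context.exit=onRefresh \
     -XX:ArchiveClassesAtExit=application.jsa \
     -jar my-app/my-app.jar

# 3. Normal runs reuse the archive for faster startup.
java -XX:SharedArchiveFile=application.jsa -jar my-app/my-app.jar
```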
Future Horizons and Trade-offs
Looking ahead, Leyden plans to introduce stable values, a hybrid between mutable and immutable fields, offering final-like performance with flexible initialization. Per emphasized that Leyden avoids the heavy constraints of GraalVM (e.g., limited reflection) or CRaC (e.g., Linux-only, security concerns with serialized secrets). While CRaC achieves millisecond startups, its lifecycle complexities and security risks limit adoption. Leyden’s AOT Cache, conversely, offers significant gains (2–4x faster startups) with minimal constraints, making it ideal for most use cases. Developers can experiment with Leyden’s early access builds to optimize their applications, with further enhancements like code cache storage on the horizon.
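To make the stable-values idea concrete, here is a plain-Java sketch of the lazy-initialization pattern they aim to subsume. The proposed API (names not yet final) looks roughly like `StableValue<T> v = StableValue.of(); v.orElseSet(supplier)`; today developers must choose between an eager `final` field or the hand-rolled pattern below, which pays a volatile read on every access and which the JIT cannot constant-fold the way it can a true `final` field. The class names here are illustrative, not from the talk.

```java
import java.util.function.Supplier;

// Hand-rolled "initialize at most once, read like a constant" holder.
final class Lazy<T> {
    private final Supplier<T> supplier;
    private volatile T value; // mutable, so the JIT cannot treat it as a constant

    Lazy(Supplier<T> supplier) { this.supplier = supplier; }

    T get() {
        T v = value;
        if (v == null) {                    // first check without locking
            synchronized (this) {
                v = value;
                if (v == null) {
                    v = supplier.get();     // runs at most once
                    value = v;
                }
            }
        }
        return v;
    }
}

class StableValueSketch {
    // A stable value would give this lazy initialization *plus*
    // final-field constant-folding, which this pattern cannot.
    private static final Lazy<String> CONFIG = new Lazy<>(() -> "loaded");

    public static void main(String[] args) {
        System.out.println(CONFIG.get()); // prints "loaded"
    }
}
```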
Links:
Hashtags: #ProjectLeyden #Java #SpringBoot #AOTCache #CDS #StartupTime #JVM #DevoxxBE2024 #PerMinborg #SébastienDeleuze
[DevoxxFR 2024] Going AOT: Mastering GraalVM for Java Applications
Alina Yurenko, a developer advocate at Oracle Labs, captivated audiences at Devoxx France 2024 with her deep dive into GraalVM’s ahead-of-time (AOT) compilation for Java applications. With a passion for open-source and community engagement, Alina explored how GraalVM’s Native Image transforms Java applications into compact, high-performance native executables, ideal for cloud environments. Through demos and practical guidance, she addressed building, testing, and optimizing GraalVM applications, debunking myths and showcasing its potential. This post unpacks Alina’s insights, offering a roadmap for adopting GraalVM in production.
GraalVM and Native Image Fundamentals
Alina introduced GraalVM as both a high-performance JDK and a platform for AOT compilation via Native Image. Unlike traditional JVMs, GraalVM allows developers to run Java applications conventionally or compile them into standalone native executables that don’t require a JVM at runtime. This dual capability, built on over a decade of research at Oracle Labs, offers Java’s developer productivity alongside native performance benefits like faster startup and lower resource usage. Native Image, GA since 2019, analyzes an application’s bytecode at build time, identifying reachable code and dependencies to produce a compact executable, eliminating unused code and pre-populating the heap for instant startup.
The closed-world assumption underpins this process: all application behavior must be known at build time, unlike the JVM’s dynamic runtime optimizations. This enables aggressive optimizations but requires careful handling of dynamic features like reflection. Alina demonstrated this with a Spring Boot application, which started in 1.3 seconds on GraalVM’s JVM but just 47 milliseconds as a native executable, highlighting its suitability for serverless and microservices where startup speed is critical.
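The build step behind those numbers can be sketched as follows; `app.jar` and the output name are placeholders, and the Maven invocation assumes a Spring Boot project configured with the GraalVM Native Build Tools plugin.

```shell
# Compile a JAR into a standalone native executable. The build-time
# analysis walks all reachable bytecode (the closed-world assumption)
# and drops everything unreachable.
native-image -jar app.jar -o app

# Equivalent for a Spring Boot project using the Native Build Tools
# Maven plugin and the conventional "native" profile:
mvn -Pnative native:compile

# Run it: no JVM starts, so startup is near-instant.
./app
```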
Benefits Beyond Startup Speed
While fast startup is a hallmark of Native Image, Alina emphasized its broader advantages, especially for long-running applications. By shifting compilation, class loading, and optimization to build time, Native Image reduces runtime CPU and memory usage, offering predictable performance without the JVM’s warm-up phase. A Spring Pet Clinic benchmark showed Native Image matching or slightly surpassing the JVM’s C2 compiler in peak throughput, a testament to two years of optimization efforts. For memory-constrained environments, Native Image excels, delivering up to 2–3x higher throughput per memory unit at heap sizes of 512MB to 1GB, as seen in throughput density charts.
Security is another benefit. By excluding unused code, Native Image reduces the attack surface, and dynamic features like reflection require explicit allow-lists, enhancing control. Alina also noted compatibility with modern Java frameworks like Spring Boot, Micronaut, and Quarkus, which integrate Native Image support, and a community-maintained list of compatible libraries on the GraalVM website, ensuring broad ecosystem support.
Building and Testing GraalVM Applications
Alina provided a practical guide for building and testing GraalVM applications. Using a Spring Boot demo, she showcased the GraalVM Native Build Tools Maven plugin, which streamlines native compilation. The build process, while resource-intensive for large applications, typically stays within 2GB of memory for smaller apps, making it viable on CI/CD systems such as GitHub Actions. She recommended developing and testing on the JVM day to day, compiling to a native image only when dependencies change or in CI/CD pipelines, to balance iteration speed and validation.
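Wiring the plugin into a build is a small `pom.xml` addition; this is a minimal sketch using the plugin's published coordinates, with an illustrative version number:

```xml
<plugin>
  <groupId>org.graalvm.buildtools</groupId>
  <artifactId>native-maven-plugin</artifactId>
  <version>0.10.3</version>
  <extensions>true</extensions>
</plugin>
```

With the plugin in place, `mvn -Pnative native:compile` produces the executable and `native:test` runs the test suite in native mode.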
Dynamic features like reflection pose challenges, but Alina outlined solutions: predictable reflection works out-of-the-box, while complex cases may require JSON configuration files, often provided by frameworks or libraries like H2. A centralized GitHub repository hosts configs for popular libraries, and a tracing agent can generate configs automatically by running the app on the JVM. Testing support is robust, with JUnit and framework-specific tools like Micronaut’s test resources enabling integration tests in Native mode, often leveraging Testcontainers.
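The tracing-agent workflow can be sketched like this; `app.jar` is a placeholder and the class name in the sample output is hypothetical, but the agent option and the `META-INF/native-image` convention are the documented ones.

```shell
# Run the application on the JVM with the tracing agent attached;
# every reflective access, resource load, or proxy it observes is
# written out as JSON metadata.
java -agentlib:native-image-agent=config-output-dir=src/main/resources/META-INF/native-image \
     -jar app.jar

# The agent emits files such as reflect-config.json, e.g.:
#   [ { "name": "com.example.PetRepository", "allDeclaredMethods": true } ]
# native-image picks these up automatically from META-INF/native-image.
```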
Optimizing and Future Directions
To achieve peak performance, Alina recommended profile-guided optimizations (PGO), where an instrumented executable collects runtime profiles to inform a final build, combining AOT’s predictability with JVM-like insights. A built-in ML model predicts profiles for simpler scenarios, offering 6–8% performance gains. Other optimizations include using the G1 garbage collector, enabling machine-specific flags, or building static images for minimal container sizes with distroless images.
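The PGO loop can be sketched in three commands; `app.jar` and the output names are placeholders, and PGO is a feature of Oracle GraalVM rather than the community edition.

```shell
# 1. Build an instrumented executable that records runtime profiles.
native-image --pgo-instrument -jar app.jar -o app-instrumented

# 2. Exercise it with a representative workload; on exit it writes
#    the collected profiles to default.iprof.
./app-instrumented

# 3. Rebuild, feeding the profiles into the AOT optimizer.
native-image --pgo=default.iprof -jar app.jar -o app
```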
Looking ahead, Alina highlighted two ambitious GraalVM projects: Layered Native Images, which pre-compile base images (e.g., JDK or Spring) to reduce build times and resource usage, and GraalOS, a platform for deploying native images without containers, eliminating container overhead. Demos of a LangChain for Java app and a GitHub crawler using Java 22 features showcased GraalVM’s versatility, running seamlessly as native executables. Alina’s session underscored GraalVM’s transformative potential, urging developers to explore its capabilities for modern Java applications.
Links:
Hashtags: #GraalVM #NativeImage #Java #AOT #AlinaYurenko #DevoxxFR2024