Java’s Evolution: From Language Specs to Runtime Performance 🚀

Hey everyone, and welcome back to the blog! Today, we’re diving deep into the ever-evolving world of Java, a language that continues to surprise and delight us with its constant innovation. We had a fantastic chat with Simon Ritter, Deputy CTO at Azul and a long-time member of the JDK Expert Group (since JDK 9!), to unpack the latest trends and what’s on the horizon.

Get ready to explore everything from the intricacies of Java development to the cutting-edge advancements in runtime performance and language features! 💡

The Evolving Landscape of Java Development 🛠️

Remember the old days of Java development? Things have certainly changed! Simon shared some fascinating insights into how the Java Development Kit (JDK) is shaped today.

  • From JCP to JEPs: The development process has shifted significantly. While the Java Community Process (JCP) and Java Specification Requests (JSRs) were once the primary drivers, the focus has moved to the OpenJDK project. This open-source environment is where the magic of individual features, now known as JDK Enhancement Proposals (JEPs), truly happens.
  • Expert Group’s Role: The Expert Group’s role has evolved too. Now, it’s less about creating the core specifications from scratch and more about evaluating incoming JEPs to ensure they align with the Java SE specification. This makes their time commitment surprisingly manageable!
  • The Six-Month Release Cadence: The move to a six-month release cycle, starting with JDK 10, has been a game-changer. Releases no longer wait for features that aren’t quite ready. If something needs more polish, like the Generational Shenandoah garbage collector originally targeted at JDK 21, it can simply be pulled and slotted into a later release. This predictable cadence prevents the multi-year waits of the past.
  • Preview and Incubator Features: The faster release cycle also enables the powerful use of preview features and incubator modules. New functionality can be introduced, gather crucial developer feedback, and be refined or even removed before being permanently cemented in the specification. A prime example is String Templates, which was previewed, extensively discussed, and eventually withdrawn to allow for a complete rethink. This flexibility is a massive win for keeping Java relevant. (A minimal sketch of how preview features are opted into follows this list.)
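
To make the preview mechanism concrete, here is a minimal sketch of how a preview feature is opted into at compile time and run time. The class body is only a placeholder; the point is the `--enable-preview` flag, which gates preview language features and APIs to the exact JDK feature release they were built against.

```java
// Main.java -- placeholder program; imagine it uses a preview language feature or API.
public class Main {
    public static void main(String[] args) {
        System.out.println("Running with preview features enabled");
    }
}

// Compile and run (the release number must match the JDK you are using):
//   javac --release 25 --enable-preview Main.java
//   java  --enable-preview Main
//
// Without --enable-preview, using a preview feature is a compile-time error, and class
// files compiled with it refuse to run on a different feature release. This is what lets
// features evolve or disappear before they are cemented into the specification.
```
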

Supercharging Observability with Java Flight Recorder (JFR) 📡

Observability is key to understanding and optimizing our applications, and Java Flight Recorder (JFR) is at the forefront of this effort.

  • Beyond the Specification: Interestingly, JFR isn’t part of the core Java SE specification. It’s a feature developed and refined within the JDK itself.
  • The Observability Drive: The primary goal is to enhance application observability, giving developers deeper insights into what’s happening under the hood. This is crucial for identifying performance bottlenecks and unexpected behaviors.
  • Bridging JVM Data and Developer Insight: JFR works by exposing the wealth of information the JVM already collects (such as class loading and heap activity) in a usable format. Viewed through tools like JDK Mission Control, this data provides a graphical representation that helps pinpoint issues, such as excessive object allocation leading to garbage collection pressure. (A small programmatic sketch of recording and emitting JFR events follows this list.)
  • Heisenberg’s Uncertainty Principle in Code: A fascinating parallel was drawn between Heisenberg’s Uncertainty Principle in physics and method tracing in software. The act of observing can alter the behavior you’re trying to measure. Excessive method tracing can skew application performance, making it difficult to get accurate insights.
  • Selective Tracing with JEPs: This is where new JEPs like JFR method timing and tracing come into play. They allow developers to be selective about what they trace, focusing only on specific methods of interest. This targeted approach minimizes the performance overhead and provides more accurate, focused data.
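
As a hedged illustration of how JFR data can be produced and captured from code (not specific to the new method-timing-and-tracing JEP), here is a minimal sketch using the `jdk.jfr` API that has shipped with OpenJDK for years: a custom application event plus a programmatic recording dumped to a file for later inspection in JDK Mission Control. The event name and "order" scenario are invented for illustration.

```java
import jdk.jfr.Event;
import jdk.jfr.Label;
import jdk.jfr.Name;
import jdk.jfr.Recording;

import java.nio.file.Path;

public class JfrSketch {

    // A custom application event; JFR records it alongside the JVM's built-in events.
    @Name("demo.OrderProcessed")
    @Label("Order Processed")
    static class OrderProcessed extends Event {
        @Label("Order Id")
        long orderId;
    }

    public static void main(String[] args) throws Exception {
        try (Recording recording = new Recording()) {
            recording.start();

            // Application work: emit a custom event with begin/end timing around it.
            OrderProcessed event = new OrderProcessed();
            event.begin();
            doWork();
            event.orderId = 42;
            event.commit();

            // Dump the recording for offline analysis (e.g. in JDK Mission Control).
            recording.dump(Path.of("demo.jfr"));
        }
    }

    private static void doWork() throws InterruptedException {
        Thread.sleep(100); // stand-in for real work
    }
}
```
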

Project Leyden: Accelerating Java’s Startup and Performance ⚡

For those in the microservices world or dealing with frequent application restarts, the “warm-up time” of Java applications can be a significant challenge. Project Leyden is tackling this head-on.

  • The “Write Once, Run Anywhere” Trade-off: Java’s iconic “write once, run anywhere” philosophy relies on bytecode, which is then interpreted or compiled just-in-time (JIT). While JIT compilation offers excellent performance over time, it requires an initial “warm-up” period to identify and optimize frequently used methods.
  • GraalVM vs. Project Leyden: GraalVM Native Image addresses warm-up by compiling Java ahead of time to native code, giving near-instant startup. However, this approach entails a “closed world” that limits dynamic class loading and reflection. Project Leyden aims for a more balanced trade-off.
  • Leyden’s Smart Approach: Instead of compiling everything ahead of time, Leyden focuses on minimizing the effort needed to reach peak performance. This involves:
    • Ahead-of-Time (AOT) Class Loading: Knowing up front which classes need to be loaded, linked, and initialized, eliminating that discovery phase at startup.
    • Caching Compiled Code: After an application has warmed up, a profile can be generated and reused on subsequent runs, so compiled code can be reused (especially on the same machine) and performance is significantly better from the get-go (a hedged sketch of this workflow follows this list).
    • Retaining Java’s Dynamism: Crucially, Leyden aims to preserve Java’s dynamic capabilities like reflection and dynamic class loading, avoiding the “closed world” limitations of some AOT solutions.
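
As a concrete (and hedged) sketch of what this looks like today: the first Leyden-derived feature to land in mainline OpenJDK is the AOT cache from JEP 483 (Ahead-of-Time Class Loading & Linking, JDK 24). The workflow below, shown as comments around a placeholder application class, assumes those JDK 24 flag spellings; later releases streamline the steps, and exact options may vary by distribution.

```java
// App.java -- placeholder application whose startup we want to speed up.
public class App {
    public static void main(String[] args) {
        System.out.println("Application started");
    }
}

// 1. Training run: record which classes are loaded, linked, and initialized.
//    java -XX:AOTMode=record -XX:AOTConfiguration=app.aotconf -cp app.jar App
//
// 2. Create the AOT cache from that recorded configuration.
//    java -XX:AOTMode=create -XX:AOTConfiguration=app.aotconf -XX:AOTCache=app.aot -cp app.jar
//
// 3. Production runs reuse the cache and skip the class-loading discovery phase.
//    java -XX:AOTCache=app.aot -cp app.jar App
```
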

CRaC: The Power of Coordinated Restore at Checkpoint 💾

Complementing Project Leiden’s focus on startup optimization is CRaC (Coordinated Restore at Checkpoint), a technology that allows for taking and restoring snapshots of a running application.

  • Beyond Initial Startup: While Leyden speeds up the initial warm-up, CRaC addresses the application’s state after it’s running. It’s like suspending your laptop and reopening it exactly where you left off, but for your Java applications.
  • Controlled Restarts: CRaC enables a controlled restart of an application from a saved checkpoint. This is invaluable for scenarios requiring instant availability.
  • The First “C” in CRaC Is for “Coordinated”: The coordination is vital. CRaC ensures that external resources like database connections and open files are handled gracefully: before a checkpoint they can be shut down cleanly, and upon restore they can be re-established. This prevents stale connections and makes restarts more robust. (A minimal sketch of the resource callbacks follows this list.)
  • Community Adoption: CRaC is already seeing adoption, with frameworks like Spring Boot and cloud providers like AWS (with Lambda SnapStart) leveraging its capabilities. Azul and BellSoft have integrated CRaC into their OpenJDK distributions, and there’s ongoing work to make it a formal part of OpenJDK and, potentially, of the Java SE specification, though this involves significant effort, especially around cross-platform support.
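
To show what “coordinated” means in code, here is a minimal sketch using the org.crac API (available as the `org.crac` compatibility library and in CRaC-enabled JDKs such as Azul’s): a resource closes its connection before the checkpoint and reopens it after restore. The `openConnection`/`closeConnection` helpers are hypothetical placeholders, and the command-line flags shown are Azul’s.

```java
import org.crac.Context;
import org.crac.Core;
import org.crac.Resource;

public class DatabaseResource implements Resource {

    @Override
    public void beforeCheckpoint(Context<? extends Resource> context) throws Exception {
        // Close sockets, files, and pooled connections so no live state is frozen in the snapshot.
        closeConnection();
    }

    @Override
    public void afterRestore(Context<? extends Resource> context) throws Exception {
        // Re-establish external resources when the snapshot is restored.
        openConnection();
    }

    public void register() {
        // Register with the global context so the JVM invokes the callbacks around checkpoints;
        // keep a strong reference to this resource for as long as it should receive them.
        Core.getGlobalContext().register(this);
    }

    private void closeConnection() { /* hypothetical: release the DB connection */ }
    private void openConnection()  { /* hypothetical: reconnect to the database */ }
}

// Checkpoint and restore are then driven from the command line:
//   java -XX:CRaCCheckpointTo=/tmp/cr -jar app.jar      (then trigger: jcmd <pid> JDK.checkpoint)
//   java -XX:CRaCRestoreFrom=/tmp/cr                     (restore from the saved image)
```
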

What’s Brewing in JDK 26 and Beyond? 📈

The pace of innovation in Java is astounding! Here’s a glimpse of what’s on the horizon:

  • JDK 26 Sneak Peek: Expected in March 2026, JDK 26 is already shaping up with exciting features:
    • Applets Finally Removed: A long-awaited cleanup, as applets are largely a relic of the past.
    • HTTP/3 Support: Modernizing network protocols for improved performance and efficiency.
    • G1GC Enhancements: Further improvements to the G1 garbage collector for better throughput.
  • Vector API: This API, focused on accelerating vector computations, is on its 10th iteration and is poised for broader adoption.
  • Project Amber and Pattern Matching: The ongoing work in Project Amber continues to impress. Pattern matching in particular has grown into a powerful set of features that makes code more expressive and concise. It is being extended across the language, including to primitive as well as reference types, which has required careful refinement through preview features (a small example follows this list).
  • Stable Values and Finality: The introduction of stable values (a preview API in JDK 25), alongside the push to make final fields truly final even against deep reflection, is a significant step towards better JVM performance and security.
  • Compact Object Headers: This feature, which reduces heap memory usage, is a testament to the team’s focus on performance, even for internal JVM details that benefit developers directly.
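
To ground the pattern-matching point, here is a small example of the style Project Amber has already delivered: record patterns and pattern matching for switch, final since JDK 21 (primitive-type patterns remain in preview). The shapes domain is invented purely for illustration.

```java
sealed interface Shape permits Circle, Rectangle {}
record Circle(double radius) implements Shape {}
record Rectangle(double width, double height) implements Shape {}

public class Areas {
    // Record patterns deconstruct the value, and the sealed hierarchy makes the switch exhaustive,
    // so no default branch is needed and adding a new Shape becomes a compile-time error here.
    static double area(Shape shape) {
        return switch (shape) {
            case Circle(double r)              -> Math.PI * r * r;
            case Rectangle(double w, double h) -> w * h;
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Circle(2.0)));          // ~12.566
        System.out.println(area(new Rectangle(3.0, 4.0)));  // 12.0
    }
}
```
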

The Future is Bright for Java! ✨

The transition to a six-month release cadence has clearly been a resounding success, enabling a controlled yet rapid evolution of the Java platform. Projects like Amber, Leyden, and Panama are pushing the boundaries of language syntax, runtime performance, and interoperability.

The Java community’s dedication to innovation, coupled with a pragmatic approach to introducing new features through preview and incubator mechanisms, ensures that Java remains a vibrant, powerful, and developer-friendly language for years to come.

What are your thoughts on these exciting Java developments? Let us know in the comments below! 👇
