Native and Performance
GraalVM, startup optimization, memory footprint, AOT compilation
Spring performance work is a series of deliberate trade-offs between startup time, memory footprint, throughput, operational simplicity, and framework dynamism.
1. Definíció / Definition
Mi ez? / What is it?
Native and performance topics in Spring cover how applications start faster, consume less memory, and behave more predictably under load. GraalVM Native Image and Spring AOT are important parts of that story, but so are classic JVM tuning, auto-configuration hygiene, and modern concurrency features such as virtual threads.
Miért létezik? / Why does it exist?
Because not every production workload optimizes for the same thing. A serverless function may care deeply about cold start. A high-density Kubernetes deployment may care about memory footprint. A long-running backend may care more about stable latency and peak throughput than startup speed.
Hol helyezkedik el? / Where does it fit?
This topic spans build-time decisions, runtime behavior, and architectural style. It affects how Spring creates beans, how much reflection the application relies on, how garbage collection behaves, and how concurrency is modeled.
2. Alapfogalmak / Core Concepts
2.1 JIT versus AOT
The traditional JVM model uses JIT compilation. Applications start from bytecode, and the runtime optimizes hot paths dynamically.
Native Image uses AOT compilation to produce a native executable ahead of runtime.
| Model | Strength | Weakness |
|---|---|---|
| JIT | strong peak optimization, flexible runtime | slower startup, larger memory |
| AOT / Native | fast startup, smaller footprint | reduced dynamism, harder build constraints |
This is the central performance trade-off: startup and footprint versus runtime flexibility and sometimes peak throughput.
2.2 Spring AOT in Boot 3+
Spring Boot 3 has built-in AOT support that analyzes application structure at build time, generates code, and reduces work otherwise performed reflectively at runtime. This makes Spring much more compatible with native compilation and often improves startup behavior even before going fully native.
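As a concrete illustration (assuming a Maven project that inherits from spring-boot-starter-parent, whose `native` profile triggers AOT processing during `package`), the AOT-generated code can even be used on a regular JVM:

```shell
# Build with AOT processing enabled via the parent's native profile
mvn -Pnative package

# Run the AOT-optimized application on a normal JVM (jar name is illustrative)
java -Dspring.aot.enabled=true -jar target/app.jar
```

This is why AOT is worth evaluating on its own: the startup benefit of the pre-computed bean definitions does not require going all the way to a native executable.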
2.3 Reflection, proxies, and serialization limits
Native compilation struggles with “discover everything later at runtime” patterns. Reflection, dynamic proxies, serialization, and resource loading all need to be known or hinted appropriately.
Historically that often meant reflect-config.json. In modern Spring applications, RuntimeHints is usually the preferred mechanism because it integrates better with framework AOT processing.
2.4 Startup time and memory footprint
Native images are attractive because they tend to provide:
- very fast startup;
- lower resident memory usage;
- improved density for many short-lived or burst-scaled instances.
This makes them compelling for autoscaling APIs, command-line tools, and serverless-style execution environments.
2.5 JVM tuning still matters
Native is not the only performance story. Mature JVM tuning remains valuable:
- right-size heap boundaries;
- choose a GC aligned with latency goals, such as G1GC or ZGC;
- reduce unnecessary auto-configurations;
- consider lazy initialization selectively.
Often the biggest win is not switching runtime model, but removing waste.
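A hedged sketch of what these flags look like in practice (heap sizes and pause targets are placeholders, not recommendations; measure the actual workload first):

```shell
# Fixed heap bounds avoid resize churn; sizes here are illustrative only.
# ZGC targets very low pause times on large heaps:
java -Xms512m -Xmx512m -XX:+UseZGC -jar app.jar

# G1 (the HotSpot default in modern JDKs) with an explicit pause goal:
java -Xms512m -Xmx512m -XX:+UseG1GC -XX:MaxGCPauseMillis=100 -jar app.jar
```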
2.6 Virtual threads
Virtual threads, delivered by Project Loom in the JDK and given first-class support in Spring Boot 3.2+, are now practical for everyday applications. They are not a replacement for native compilation; they solve a different problem. Virtual threads make blocking I/O concurrency cheaper from a programming-model perspective and can reduce the need for more complex asynchronous designs.
3. Gyakorlati használat / Practical Usage
Native images shine in workloads where cold start and memory density dominate. A small HTTP service that scales up and down aggressively can benefit significantly because instances become useful faster and consume less memory per pod. That changes operational behavior, not just benchmark numbers.
For long-running services with sustained throughput, the JVM may remain the better choice. JIT can aggressively optimize hot code paths over time, and ecosystem compatibility is broader. If the application is stable, well-sized, and not startup-sensitive, the operational cost of native builds may not justify the switch.
Many Spring applications can improve startup without going native. Excluding unused auto-configurations, removing unnecessary starters, simplifying bean graphs, and reducing reflection-heavy patterns can materially lower startup time and memory footprint while keeping the familiar JVM runtime.
Virtual threads are particularly compelling for services with high concurrency and mostly blocking I/O, such as HTTP fan-out or JDBC-heavy orchestration. They let teams retain an imperative programming model while scaling concurrency more elegantly. Still, virtual threads do not remove downstream capacity limits, timeouts, or the need for bulkheads.
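The fan-out effect can be seen even outside Spring. A minimal plain-Java sketch (class and method names are illustrative) that submits thousands of blocking tasks, one virtual thread each:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.IntStream;

public class VirtualThreadFanOut {

    // Submits n blocking tasks, one virtual thread per task, and sums the results.
    static int runBlockingTasks(int n) throws Exception {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            List<Future<Integer>> futures = IntStream.range(0, n)
                    .mapToObj(i -> executor.submit(() -> {
                        Thread.sleep(5); // blocking is cheap: the carrier thread is released
                        return 1;
                    }))
                    .toList();
            int total = 0;
            for (Future<Integer> f : futures) {
                total += f.get();
            }
            return total;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runBlockingTasks(10_000)); // prints 10000
    }
}
```

With platform threads, 10,000 concurrent sleeps would be prohibitively expensive; with virtual threads the same imperative code scales, which is exactly the point made above, while the downstream limits still apply unchanged.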
4. Kód példák / Code Examples
4.1 Runtime hints for native compatibility
```java
import org.springframework.aot.hint.MemberCategory;
import org.springframework.aot.hint.RuntimeHints;
import org.springframework.aot.hint.RuntimeHintsRegistrar;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.ImportRuntimeHints;

@Configuration
@ImportRuntimeHints(OrderRuntimeHints.class)
public class NativeConfiguration {
}

// Registers reflection metadata so OrderDto remains usable after native compilation.
class OrderRuntimeHints implements RuntimeHintsRegistrar {

    @Override
    public void registerHints(RuntimeHints hints, ClassLoader classLoader) {
        hints.reflection().registerType(OrderDto.class,
                MemberCategory.INVOKE_DECLARED_CONSTRUCTORS,
                MemberCategory.INVOKE_PUBLIC_METHODS,
                MemberCategory.DECLARED_FIELDS);
    }
}

record OrderDto(Long id, String status) {}
```
4.2 Startup-oriented configuration
```yaml
spring:
  main:
    lazy-initialization: true
  autoconfigure:
    exclude:
      - org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
```
This is useful only when those exclusions and lazy semantics match the application’s actual needs.
4.3 Virtual-thread executor integration
```java
import java.util.concurrent.Executors;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.AsyncTaskExecutor;
import org.springframework.core.task.support.TaskExecutorAdapter;

@Configuration
public class VirtualThreadExecutorConfig {

    // Replaces the application task executor with one that starts
    // a new virtual thread for every submitted task.
    @Bean
    public AsyncTaskExecutor applicationTaskExecutor() {
        return new TaskExecutorAdapter(Executors.newVirtualThreadPerTaskExecutor());
    }
}
```
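In Spring Boot 3.2+, the simpler route is usually the built-in property, which switches the embedded server and several framework-managed executors to virtual threads without custom bean definitions:

```yaml
spring:
  threads:
    virtual:
      enabled: true
```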
4.4 Maven plugin for native image builds
```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.graalvm.buildtools</groupId>
      <artifactId>native-maven-plugin</artifactId>
    </plugin>
  </plugins>
</build>
```
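Assuming the spring-boot-starter-parent's `native` profile (which wires this plugin's goals into the build), a native executable is typically produced with:

```shell
# Requires a GraalVM JDK on the PATH; the binary lands under target/
mvn -Pnative native:compile
```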
5. Trade-offok / Trade-offs
Native advantages
- excellent startup time;
- lower memory footprint;
- strong fit for container density and burst scaling.
Native disadvantages
- longer and more complex builds;
- reflection and proxy constraints;
- uneven library compatibility;
- throughput is not automatically better and may be worse for some workloads.
JVM tuning advantages
- lower migration risk;
- better library compatibility;
- mature operational model and strong long-run throughput.
Virtual-thread trade-off
- simpler concurrency for blocking I/O;
- but no magic solution for CPU saturation, connection pools, or downstream rate limits.
6. Gyakori hibák / Common Mistakes
6.1 Treating native as a universal acceleration switch
Native improves some dimensions dramatically, especially startup and memory, but it is not a free win across all performance axes.
6.2 Keeping reflection-heavy design unchanged
Applications built around heavy reflection, dynamic scanning, and runtime discovery patterns are harder to migrate. Native adoption often rewards simpler, more explicit design.
6.3 Enabling lazy initialization blindly
Lazy initialization can reduce startup time while moving cost to the first request. That may be acceptable for rarely used paths and disastrous for critical request flows.
6.4 Tuning GC without workload measurement
Choosing G1GC or ZGC should be based on heap behavior, pause goals, and allocation rate, not fashion. Performance tuning without measurement is configuration theatre.
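Measurement starts with data. Unified JVM logging can capture GC behavior before any collector change is considered (the log file name is illustrative):

```shell
# Log GC events with timestamps to a file for offline analysis
java -Xlog:gc*:file=gc.log:time,uptime -jar app.jar
```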
6.5 Over-romanticizing virtual threads
Virtual threads simplify concurrency, but they do not remove database bottlenecks, HTTP timeouts, lock contention, or poor query design. They improve one layer of the problem, not the entire system.
7. Senior szintű meglátások / Senior-level Insights
Senior performance work is workload-specific. The right answer depends on how the system is used, not on what is newest. For bursty, short-lived workloads, native can be transformational. For stable, compute-heavy, long-running services, the JVM may remain the better strategic choice.
Startup optimization is often more about application hygiene than runtime replacement. Removing unnecessary starters, constraining auto-configuration, simplifying bean creation, and reducing reflective behavior can produce meaningful gains without introducing native build complexity.
Compatibility risk deserves real attention. A library that behaves perfectly on HotSpot can fail in native mode because of serialization assumptions, dynamic proxies, or classpath resource handling. Native adoption therefore belongs in architecture and platform planning, not just in a developer experiment.
Virtual threads should be treated as a powerful concurrency tool, not an excuse to ignore systems thinking. You still need connection pool sizing, timeouts, monitoring, backpressure, and downstream protection. Easier concurrency does not remove operational responsibility.
8. Szószedet / Glossary
- GraalVM Native Image: AOT technology that produces native executables.
- AOT: ahead-of-time compilation.
- JIT: just-in-time runtime optimization.
- Spring AOT: Spring’s build-time optimization and hint-generation support.
- RuntimeHints: Spring API for native metadata registration.
- Reflection: runtime inspection or invocation of types and members.
- GC: garbage collector.
- G1GC: general-purpose HotSpot collector.
- ZGC: low-pause collector optimized for latency-sensitive workloads.
- Virtual thread: lightweight thread abstraction introduced by Loom.
9. Gyorsreferencia / Cheatsheet
| Topic | Best fit | Watch out for |
|---|---|---|
| Native Image | cold-start and memory-sensitive services | build time and compatibility |
| JIT JVM | long-running throughput-heavy services | startup and footprint |
| Spring AOT | native readiness and startup work reduction | dynamic design patterns |
| RuntimeHints | reflection metadata registration | register only what is needed |
| lazy init | startup reduction | first-request penalty |
| auto-config exclude | leaner app boot | excluding needed modules |
| G1GC | strong default in many systems | measure before tuning |
| ZGC | low-latency goals | version and memory considerations |
| virtual threads | high-concurrency blocking I/O | downstream limits still apply |