Azul’s Gil Tene on Java’s evolution, AI pressure and why efficiency will define the future

Java’s next era is shaped by cloud costs, AI-driven load and the need for deterministic performance. Gil Tene explains why efficiency will define the future of the JVM.

Shrikanth G

Gil Tene, Co-founder and Chief Technology Officer, Azul


There are technologists who follow industry shifts and there are those who have lived through their defining moments. Gil Tene belongs to the latter group. As Co-founder and Chief Technology Officer of Azul, and one of the world’s most respected Java Virtual Machine performance engineers, he has experienced Java’s evolution from its earliest days. He attended the very first JavaOne, watched Netscape announce Java in the browser, and has since seen Java transform from a bold experiment in the mid-1990s to the foundation of global enterprise computing.


Earlier this year, I interviewed Azul Chief Executive Officer Scott Sellers about the company’s Java-first mission. This follow-up conversation with Gil explores the engineering principles, community stewardship, cloud economics and AI-driven pressures shaping the future of Java. It is also a deeply India-relevant story. From banks, stock exchanges and payment systems to e-commerce giants and digital-native companies, India runs some of the world’s largest Java estates at extreme volume. As cloud spending rises and artificial intelligence amplifies backend load, efficiency is becoming a strategic imperative.

In this conversation with DATAQUEST, Gil reflects on Java’s evolution, the rise of OpenJDK, deterministic performance, the coming surge in AI-driven backend demand and what Azul is building for 2026 and beyond.

Java has been the backbone of enterprise systems for decades. From your perspective, what convinces you that Java remains pivotal as organisations move into a cloud-intensive, AI-driven era?


Java has dominated enterprise applications for almost three decades. It arrived in the mid-1990s and very quickly displaced the languages and frameworks used for business applications at the time. Before Java, companies relied on C, various fourth-generation languages and environments such as Tuxedo application servers. Java made much of that obsolete in only a few years.

Since then, whenever a business application needs to be built, Java is the most common choice. Other languages have grown in niche areas, but it is extremely rare to find an organisation with no Java footprint at all. By contrast, it is easy to find companies with no Ruby, no PHP, no Node or no Python in production. Even where those stacks were once popular, it is now difficult to find developers who can maintain them.

Java’s continuity is one of its greatest strengths. You can hire a developer today who learned Java recently and they can work productively on a system built fifteen years ago. Very few ecosystems offer that combination of stability, backward compatibility and modern evolution. Over the years, Java has added functional programming idioms, lambdas, records, virtual threads and a simpler syntax. Yet all of this has happened without breaking existing software. That balance is why Java remains the foundation of enterprise computing.
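For readers who have not written Java in a while, here is a small sketch of our own (the class and values are invented for illustration) showing two of the newer features Tene mentions, records and lambdas, in code that compiles alongside much older Java:

```java
import java.util.List;

// Illustrative sketch only: a record plus a lambda-based stream pipeline,
// two of the newer language features mentioned above.
public class ModernJavaSketch {

    // Records (Java 16+) declare an immutable data carrier in a single line.
    record Order(String id, double amount) { }

    public static void main(String[] args) {
        List<Order> orders = List.of(new Order("A-1", 120.0), new Order("A-2", 80.0));

        // Lambdas and streams (Java 8+) replace hand-written iteration code.
        double total = orders.stream()
                .mapToDouble(Order::amount)
                .sum();

        System.out.println("Total order value: " + total);
    }
}
```

The same source file can sit next to classes written for a JDK from fifteen years ago, which is the continuity described above.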

When Oracle acquired Sun Microsystems in 2010, there was debate about Java’s future. Did innovation continue at the same pace and how has Java’s stewardship evolved through OpenJDK?

There was natural scepticism when Sun Microsystems was acquired by Oracle. But Java has done very well since then, mostly because it transitioned into a truly community-driven project through Open Java Development Kit, commonly known as OpenJDK. Oracle is a strong leader in OpenJDK. It contributes heavily and helps shape the direction of the platform. However, it is not the sole steward of Java’s future.

OpenJDK is built through a community process that includes contributions from many companies such as Azul, Amazon, Red Hat, IBM and SAP. All new features and versions are developed in the open. What gives the community real confidence is the maintenance model. Oracle engineers focus on building future versions and maintain a new release for only a short window after it ships. After that period, responsibility for ongoing maintenance, including security fixes and bug patches, shifts to the wider OpenJDK community.

If you are running an OpenJDK-based build of Java today, whether that is Java 8, Java 11, Java 17, Java 21 or any other supported version, most of the long-term maintenance is performed by this broader community rather than by Oracle alone. Java’s robustness does not depend on a single vendor. And because OpenJDK is licensed under the GNU General Public License with the Classpath Exception, it cannot be taken back or restricted. Even in a hypothetical situation where Oracle created features only for its own customers, the community-maintained OpenJDK line would continue and the industry could freely follow it. That openness is at the heart of Java’s resilience.

Cloud-native architectures have transformed how organisations think about uptime, scaling and availability. How has this shift changed the performance expectations placed on the Java Virtual Machine and Java applications?

A major shift occurred just before the COVID-19 pandemic and accelerated rapidly afterward. Organisations moved dramatically towards cloud-based computing and cloud-native styles of development. This includes the use of public hyperscale cloud providers such as Amazon Web Services, Google Cloud and Microsoft Azure. It also includes the architectural mindset that comes with them.

In traditional environments, you thought in terms of single application instances and clustered failover. In modern cloud environments, for example a Kubernetes-managed cluster, the infrastructure itself handles availability. If you assume there is only one instance of your service, you are building it incorrectly. Availability is now an expectation, not an option.

Java fits naturally into this environment. The frameworks, tooling and ecosystem support microservices, distributed systems and modern DevOps practices. The remaining challenge is not capability but efficiency. You can achieve extremely high availability and performance in the cloud, but doing it in a cost-efficient manner is difficult. This is where Azul focuses: improving consistency, eliminating warm-up penalties, enabling smooth elasticity and driving higher utilisation per central processing unit so organisations can do more with the same cloud budget.

Cloud cost pressure is at an all-time high. How does optimisation at the Java Virtual Machine layer translate into real cloud savings, particularly as artificial intelligence increases backend load dramatically?

Our cost advantage comes from delivering more performance for the same cloud dollar. Artificial intelligence is increasing the pressure on systems that are not doing AI themselves. Once AI-driven experiences become mainstream, backend systems, including Java-based transaction engines and data stores, must handle much higher volumes of requests.

Consider how travel systems experienced a surge in the look-to-book ratio. Two decades ago, a manageable number of queries was required to complete one booking. As robotic price scraping (bots repeatedly checking prices at scale) became common, the number of queries grew by roughly three orders of magnitude, a thousandfold increase. With generative artificial intelligence, this will expand even more. AI can evaluate far more options than a human could, which means backend systems must process enormous amounts of additional traffic.

Java-based systems powering these backends must increase capacity without increasing cost. Runtime optimisation is no longer only about speed. It is about efficiency and sustainability. Eliminating unnecessary CPU consumption, achieving consistently high utilisation and handling elasticity without latency spikes allow organisations to invest cloud resources where they matter most, especially in artificial intelligence initiatives.

As artificial intelligence workloads expand, what changes do you expect in the Java ecosystem? How will technologies such as Project Loom, Project Panama, Project Valhalla and Coordinated Restore at Checkpoint influence this future?

Java’s involvement with artificial intelligence has many layers. There are libraries that integrate with generative artificial intelligence platforms. But the largest impact is indirect. Roughly half of global compute consumption runs on Java based systems. As artificial intelligence increases pressure on those systems, Java workloads scale with them.

Project Loom is now available in Java 25, providing lightweight virtual threads that make asynchronous programming easier. Vector Application Programming Interfaces are emerging to support workloads that use vectorisation. Project Valhalla, which has been in development for many years, will eventually bring value types and more efficient memory layouts.
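To make the virtual-thread point concrete, here is a minimal sketch of our own (the task count and sleep time are invented) showing blocking-style code fanned out across thousands of lightweight threads on Java 21 or later:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

// Illustrative sketch: each submitted task gets its own virtual thread, so
// blocking calls park the virtual thread instead of tying up an OS thread.
public class VirtualThreadsSketch {
    public static void main(String[] args) {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 10_000).forEach(i ->
                executor.submit(() -> {
                    Thread.sleep(100); // stand-in for a blocking I/O call
                    return i;
                })
            );
        } // close() waits for all submitted tasks to complete
    }
}
```

With platform threads, the same fan-out would carry far more memory and scheduling overhead, which is why asynchronous frameworks were previously needed for this kind of workload.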

Azul leads a project within OpenJDK called Coordinated Restore at Checkpoint, also known as CRaC. It allows a warm Java Virtual Machine to be snapshotted and restored almost instantly without reinitialising every component, much like closing and reopening a laptop lid rather than rebooting. Frameworks such as Spring Boot, Quarkus and Micronaut are adopting this technology, which has major implications for cloud elasticity.
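For developers wondering what that looks like in practice, here is a minimal sketch using the org.crac coordination API; it assumes a CRaC-enabled JDK build, and the connection field is simply a stand-in for state such as sockets or pools that should not be captured in a snapshot:

```java
import org.crac.Context;
import org.crac.Core;
import org.crac.Resource;

// Illustrative sketch: a service that releases external state before a
// checkpoint and re-acquires it after restore.
public class CheckpointAwareService implements Resource {

    private volatile AutoCloseable connection;

    public CheckpointAwareService() {
        // Register so the runtime notifies us around checkpoint and restore.
        Core.getGlobalContext().register(this);
        this.connection = openConnection();
    }

    @Override
    public void beforeCheckpoint(Context<? extends Resource> context) throws Exception {
        // Release resources the snapshot must not capture.
        connection.close();
    }

    @Override
    public void afterRestore(Context<? extends Resource> context) throws Exception {
        // Re-acquire them when the warm image is restored.
        this.connection = openConnection();
    }

    private AutoCloseable openConnection() {
        // Placeholder for a real socket, database pool or HTTP client.
        return () -> { };
    }
}
```

In current CRaC-enabled builds, the checkpoint is typically triggered with jcmd against a process started with -XX:CRaCCheckpointTo=&lt;dir&gt;, and the warm image is brought back with -XX:CRaCRestoreFrom=&lt;dir&gt;, though exact flags can vary by distribution.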

We are also developing fleet-level optimisation technology through our Optimizer Hub. Instead of each Java Virtual Machine learning performance behaviour in isolation, a fleet of Java Virtual Machines running the same service can share optimisation decisions. This reduces the overall cost of optimisation and improves consistency. This type of collective intelligence will be important as cloud deployments become more mature.

India operates some of the world’s highest volume digital systems, from payments and stock exchanges to e-commerce events. How do you see the Indian market in terms of performance expectations, data protection and cloud sovereignty?

Azul has had an engineering presence in India for more than twenty years through our team in Bengaluru. Historically, that team supported our global products, but in recent years we have increased our focus on the Indian market. We are now working more closely with Indian customers and partners.

India’s scale is unique. The country has several times the population of the United States yet must serve its users at significantly lower cost. Efficiency is essential. High-volume transaction systems in trading, payments and retail almost always rely on Java because it is the most efficient way to process large workloads.

We have seen strong adoption in India across stock exchanges, payment platforms and other large-scale transactional environments. With new data protection rules, localisation requirements and sovereign cloud discussions taking shape, the need for efficient, cost-conscious compute is only going to grow. Because Java is central to these workloads, optimising Java is one of the most impactful ways to meet these demands.

Looking ahead, what are your priorities for 2026 as Chief Technology Officer and Co-founder of Azul?

We recently announced a major investment from Thoma Bravo, which allows us to accelerate both growth and profitability. Demand for what we do is strong: the Java footprint keeps expanding rapidly and organisations need to run it more efficiently.

On the technology side, we are focused on cloud-native Java, new capabilities for Java Virtual Machines at scale and our Intelligence Cloud platform. Because we run Java for some of the largest organisations in the world, we have unique insight into how Java behaves in production. We want to use those insights to improve operational efficiency, engineering productivity and resource planning for our customers.

We will continue investing in community-led innovation, including Coordinated Restore at Checkpoint, future Java releases and OpenJDK capabilities. Ultimately, my responsibility is to ensure that Azul’s Java Development Kit remains the best-performing, most efficient, most stable and most deterministic option available. The ecosystem is open and competitive. If someone builds a better Java Virtual Machine, customers can move to it. My job is to ensure we stay ahead by continually innovating.
