Wei Hu, Senior Vice President, High Availability and Emerging Technologies, and Steve Zivanic, GVP, Database and Autonomous Services, Oracle.
Oracle yesterday announced the general availability of its Globally Distributed Exadata Database on Exascale Infrastructure, designed to simplify the deployment of distributed mission-critical applications across Oracle Cloud Infrastructure (OCI) regions worldwide. This new service automatically distributes, stores, and synchronizes data across multiple locations, enabling applications to remain online even during regional outages and helping businesses meet data residency requirements.
This move from Oracle comes at a time when hyperscalers are battling it out for differentiation, manageability, and cost efficiency. The cloud as we know it is also on the threshold of radical change, with AI and data sovereignty taking center stage.
Against this backdrop, Oracle is trying to rewrite the playbook for enterprise data infrastructure. At the heart of this shift lies its Globally Distributed Exadata Database on Exascale Infrastructure.
Offering an exclusive sneak peek into the platform, Wei Hu, Senior Vice President, High Availability and Emerging Technologies, Oracle, and Steve Zivanic, Group Vice President, Database and Autonomous Services, Oracle, spoke to Dataquest about the vision and capabilities behind Oracle’s new distributed database architecture.
Setting the context and tone, Wei Hu says, “It’s a platform that blends the robustness of distributed databases with the elastic power of ‘Exascale’ computing. So that’s a long name, so I call it short-form—Oracle Distributed Database on Exascale. The two key parts are really the Distributed Database and Exascale. And they work wonderfully together.”
Oracle: Building an ‘Always-On’, serverless foundation
Why is Oracle upping the ante with Exascale, built for speed, scale, and power? Wei explains that the goal is an always-on, autoscaling, serverless architecture that is not only powerful but also easy to use and cost-efficient.
Oracle’s journey with distributed databases began back in 2017, with the introduction of Oracle Database 12c Release 2 (12.2). “A distributed database is one where the data is stored in multiple locations. It can be different servers, data centers, regions, or even countries,” Wei explains. “We’ve been running lots of large mission-critical applications globally, such as mobile messaging, credit card fraud detection, payment processing, Internet-scale marketing, smart power meters—you name it.”
Oracle says that these use cases demand high resilience and ultra-low latency. Whether it’s supporting always-on databases for stock trading and online banking, or meeting data residency regulations in countries like India, the system is designed for flexibility.
A strategic approach to data sovereignty, without complexity
How is this panning out at ground zero? One real-world example involved a major U.S. bank with operations in India. “To comply with RBI requirements, they broke out a section of data—what we call a shard—and stored it in India. The rest of the data stayed in the U.S.,” Wei shares. “What’s interesting is that it’s still one single distributed database. The application sees no change. This eliminates the need to replicate hundreds of microservices, making it a massive cost-saver.”
This model is now being extended to other countries, highlighting how Oracle allows organizations to scale securely and cost-effectively across sovereign borders.
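To make the idea concrete, here is a minimal, hypothetical sketch of how such a residency-aware shard layout could be declared, assuming Oracle's user-defined sharding and the python-oracledb driver. The table, shard tablespaces, credentials, and exact DDL options are illustrative rather than taken from the deployment Wei describes.

```python
# Hypothetical sketch of a residency-aware shard layout, assuming Oracle's
# user-defined sharding and the python-oracledb driver. Names, credentials,
# and exact DDL options are illustrative; the precise syntax depends on how
# the sharded database is configured.
import oracledb

# Connect to the shard catalog (connection details are placeholders).
conn = oracledb.connect(user="app_user", password="***",
                        dsn="catalog-host:1521/catalog_service")
cur = conn.cursor()

# DDL issued through the catalog is propagated to the shards.
cur.execute("ALTER SESSION ENABLE SHARD DDL")

# One logical table: rows tagged 'IN' live on a shard physically hosted in
# India, everything else stays on shards in the US.
cur.execute("""
    CREATE SHARDED TABLE accounts (
        account_id   NUMBER NOT NULL,
        country_code VARCHAR2(2) NOT NULL,
        balance      NUMBER,
        CONSTRAINT accounts_pk PRIMARY KEY (country_code, account_id)
    )
    PARTITION BY LIST (country_code)
    (
        PARTITION p_india VALUES ('IN')    TABLESPACE tbs_india,
        PARTITION p_rest  VALUES (DEFAULT) TABLESPACE tbs_us
    )
""")

# The application still sees a single database; including the sharding key
# in the predicate lets the request route straight to the right shard.
cur.execute("SELECT balance FROM accounts WHERE country_code = :1 AND account_id = :2",
            ["IN", 42])
print(cur.fetchone())
```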
In another example, a credit card company runs active-active-active operations across multiple countries and data centers, using Oracle’s new Raft replication for fast, three-second failover with zero data loss. “This system supports thousands of transactions per second in high-risk applications,” says Wei.
Agentic AI and the need for hyperscale
But perhaps the most exciting use case is the support for agentic AI workloads.
“These workloads are not driven by humans; they’re driven by machines. That means immense load on the system, because these agents don’t rest,” Wei explains. “You need parallelism, hyperscaling, and elastic infrastructure. The load comes in waves and you need to scale up and down, serverlessly, and also comply with residency requirements.”
This is where Oracle’s distributed system shines, according to Wei. It provides a globally consistent view while meeting compliance, latency, and scale requirements for next-generation AI systems.
Converged, not fragmented
Today’s enterprises operate in a multi-cloud world, and carving out a clear distinction is key to standing out. Wei is quick to contrast Oracle’s approach with that of other cloud providers.
“If you look at the computer industry, a lot of vendors are promoting single-purpose databases. One for transactions, one for JSON (JavaScript Object Notation), another for AI vector search. So developers end up juggling many databases, moving data around—and the single source of truth gets lost.”
And clearly, Oracle’s counter to this is the converged database.
“A single database supports all data types, cutting across relational, JSON, vectors, graphs, images—and all workloads—transactions, analytics, AI,” Wei explains. “No need to copy data, no need for ETL. You bring analytics to the data, not the other way around.”
Wei draws a powerful analogy: “I used to carry a GPS, PDA, MP3 player, and a notepad. Now all of those are just features in your smartphone. I believe that’s what’s happening to special-purpose databases—they’ll all become features of the converged database. That’s the smartphone of your data.”
Bringing AI to your data
Stepping in, Steve Zivanic adds: “Think about it. Where does the world’s most critical business data live? In Oracle Database. We’re bringing AI, vector search, and more to that data. You don’t need to move it. Nobody ever said, ‘I can’t wait to move my data.’ With us, you don’t have to.”
This approach preserves consistency between AI and business workloads because they operate in the same system, underscores Steve.
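As a minimal sketch of what this looks like in practice, the query below combines a relational filter, a JSON field, and AI vector search against a single table, assuming Oracle Database 23ai’s VECTOR support and the python-oracledb driver; the schema, connection details, and embedding values are hypothetical, not from the interview.

```python
# Hypothetical sketch: one converged query mixing relational predicates, JSON,
# and AI vector search in the same Oracle Database (23ai syntax assumed).
# Table, columns, and the query embedding are illustrative only.
import array
import oracledb

conn = oracledb.connect(user="app_user", password="***",
                        dsn="db-host:1521/app_service")
cur = conn.cursor()

# A tiny 3-dimensional embedding standing in for a real model output;
# recent python-oracledb versions accept array.array for VECTOR binds.
query_vec = array.array("f", [0.12, 0.98, 0.05])

# Relational filter + JSON field extraction + vector similarity, all against
# the same table: no ETL, no copy to a separate vector store.
cur.execute("""
    SELECT p.product_id,
           JSON_VALUE(p.attributes, '$.category') AS category
    FROM   products p
    WHERE  p.in_stock = 'Y'
    ORDER  BY VECTOR_DISTANCE(p.embedding, :vec, COSINE)
    FETCH  FIRST 5 ROWS ONLY
""", vec=query_vec)

for row in cur:
    print(row)
```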
Autonomous: Where scale meets simplicity
Interestingly, Oracle also leads in autonomous database technology, a term it coined. “The idea is to let software manage the system; software doesn’t get tired and doesn’t make mistakes,” Wei says. “The database tunes itself, patches itself, and even handles threat detection automatically.”
Steve adds, “Whether you're a small business or a large enterprise, you benefit. You don’t need to write indexes, don’t need a DBA team, and you can focus on your core projects. Even patching is fully automated.”
Oracle’s play in a changing cloud world
As conversations around cloud reinvention and cost efficiency intensify, with some enterprises shifting back to private and hybrid clouds, Oracle positions its multi-model, multi-cloud approach as the sweet spot.
“Our cloud is the most flexible,” Wei explains. “You can deploy Oracle tech on-prem, in our public cloud, or with Cloud@Customer where we bring the cloud to your data center. And we also run in Azure, AWS, and Google Cloud. It’s the same software, same hardware—you can move freely among them.”
This “universal cloud platform,” as Wei puts it, gives businesses the power to shift as their needs evolve. “And that freedom—that maximum choice—is something only Oracle can offer.”
One database to rule them all?
Oracle’s Globally Distributed Exadata Database on Exascale Infrastructure is more than a product; it is a platform vision, according to Oracle. It blends AI-readiness, sovereignty compliance, converged architecture, and autonomous operations into a single, flexible offering.
As Wei Hu and Steve Zivanic make clear, this is Oracle’s answer to the fragmented, complex, cloud-confused landscape of today. It’s a change not just in infrastructure, but in mindset.
As Holger Mueller, vice president and principal analyst at Constellation Research, observes, “With this service from Oracle, CIOs can confidently deploy agentic AI and mission-critical applications globally and meet local data residency requirements.”