Amazon Web Services (AWS) signed a seven-year, USD 38 billion agreement with OpenAI, positioning AWS as a key cloud provider for the artificial intelligence company. The partnership, announced on 3 November 2025, immediately gives OpenAI access to significant computing capacity for its growing AI workloads.
This agreement provides a major boost to AWS in its competition with rival cloud providers, including Microsoft Azure, which maintains a substantial investment in OpenAI. For OpenAI, the deal signifies a move to diversify its compute infrastructure, a necessity given the massive resources required for training increasingly sophisticated models.
Scale of compute resources
The USD 38 billion partnership provides OpenAI with computing resources over seven years. OpenAI will use hundreds of thousands of NVIDIA Graphics Processing Units (GPUs), and it retains the option to scale to tens of millions of Central Processing Units (CPUs) for agentic workloads, processes in which AI systems act autonomously on a user's behalf.
AWS plans to deploy all the initial capacity by the end of 2026, with room for further growth into 2027 and beyond. This deployment schedule aligns with the global demand for compute power from companies building the most advanced AI models.
Technical infrastructure and performance
AWS will build custom infrastructure for the partnership. This system uses Amazon EC2 UltraServers, which cluster the latest NVIDIA GB200 and GB300 GPUs. The hardware is connected using a unified, low-latency network. This design ensures fast communication between the connected chips, allowing for efficient use of the hardware.
The infrastructure will support two main operations: serving inference for live products such as ChatGPT, and training OpenAI's next generation of foundation models. AWS has prior experience running large AI infrastructure clusters, some totaling more than 500,000 chips.
Building on existing collaboration
The new agreement formalizes and expands an existing technical relationship. Earlier this year, OpenAI made its open-weight foundation models available on Amazon Bedrock, AWS's service for building generative AI applications.
A number of AWS customers already use these OpenAI models on the Bedrock platform for a variety of tasks. Customers such as Peloton and Comscore, among others, apply the models to automated workflows, coding support, scientific analysis, and mathematical problem-solving. This existing adoption supported the decision to pursue a broader infrastructure alliance.