Private data centres repeat 100-year-old mistake: Jeff Bezos

Jeff Bezos warns that private AI data centres mimic early 20th-century factory power plants. He argues that centralised cloud grids offer better efficiency than private hardware, which often sits idle and wastes energy.

DQINDIA Online

Amazon founder Jeff Bezos recently criticised the growing trend of technology companies building private data centres to support artificial intelligence. Speaking at the 2026 New York Times DealBook Summit, Bezos argued that firms investing in their own massive computing facilities are repeating a historical error from the early industrial era.

The electricity analogy

Bezos compared the current AI infrastructure rush to how factories operated at the start of the 20th century. Before public utility grids existed, businesses had to build and maintain their own power plants to run their machinery.

He recalled a visit to a 300-year-old brewery in Luxembourg that once housed its own electric generator. Once centralised power grids became available, the brewery’s private plant became an expensive burden. Bezos stated that the tech industry is currently in a similar "generator phase."

"Right now, everybody is building their own data centre, their own generators essentially. And that's not going to last," Bezos said during the summit.

Efficiency and utilisation

The primary issue with private infrastructure, Bezos argued, is utilisation. Individual companies often build capacity for their peak needs, leaving expensive hardware idle during quieter periods. He likened this to owning a Boeing 747 and leaving it parked on the runway most of the time.

In contrast, centralised cloud providers like Amazon Web Services (AWS) serve millions of customers, allowing them to shift resources to where they are needed in real time. This model also reduces "muck", the administrative and technical labour required to manage hardware, letting businesses focus on their core products.

The growing energy constraint

The energy requirements for AI are rising. Data centres consumed approximately 415 terawatt-hours of electricity globally in 2024. Projections suggest this will reach 945 terawatt-hours by 2030, a figure nearly equal to the total power consumption of Japan.

Specific AI tasks demand significant resources:

  • A single query to a large language model uses nearly 10 times the energy of a standard web search.

  • Training one large AI model can consume as much electricity annually as 200 average US households.

Bezos identified energy availability as the ultimate bottleneck for AI growth. He suggested that as power becomes scarcer and more expensive, the efficiency of centralised cloud "grids" will become a necessity rather than a choice.

Future solutions in orbit

Looking further ahead, Bezos discussed moving the most energy-intensive computing off the planet entirely. During Italian Tech Week in Turin, he predicted that gigawatt-scale data centres will operate in space within the next 20 years.

Orbiting facilities would have access to 24/7 solar power without interference from clouds, rain, or night cycles. Bezos framed this as the next logical step in using space to support Earth, following the path of weather and communication satellites.
