(Image: Sam Altman)
Next time you thank ChatGPT for answering your question, you might also want to apologize to the electric grid. According to OpenAI CEO Sam Altman, those extra polite words ("please," "thank you," "kindly") are quietly racking up millions of dollars in computing costs.
Good manners, it turns out, aren't free in the age of artificial intelligence.
In a world where we ask artificial intelligence for everything from grocery lists to career advice, one thing we often offer in return is basic courtesy. But those “pleases” and “thank yous”? They come with a price tag.
In a now-viral comment on X (formerly Twitter), OpenAI CEO Sam Altman confirmed a wild but oddly logical suspicion: the polite phrases users toss at ChatGPT are costing tens of millions of dollars in computing power. “Well spent,” Altman noted dryly — though one could hear the sigh behind the sentiment.
Wait, how does saying 'thank you' cost money?
Let’s break it down:

- Chatbots process every word. Whether you're asking “What's the capital of Norway?” or saying “Thank you so much,” the AI doesn’t discriminate: it processes both with the same compute-hungry machinery.
- Each word triggers computation. AI models like ChatGPT are built on what's called a Large Language Model (LLM). Think of it as an ultra-advanced autocomplete system that doesn’t just guess the next word; it generates full, context-aware sentences. More words in = more data = more computing.
- Compute equals carbon (and cash). Behind the scenes, massive data centers run 24/7, consuming huge amounts of electricity. Each prompt, yes, even a simple "please," requires calculations across thousands of graphics processing units (GPUs), each sipping power like there's no tomorrow.
- Scale it up to millions of users. That one polite word, multiplied by millions of users every day, adds up fast. Tens of thousands of extra words per minute mean more server time, higher energy bills, and ultimately, more emissions.
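The scaling argument above can be sketched as a back-of-envelope calculation. This is purely illustrative: the one-token-per-word approximation, the daily prompt volume, and the per-token cost below are all made-up round numbers, not OpenAI's real figures.

```python
# Rough sketch: the extra compute cost of polite filler words.
# Assumes ~1 token per word and a hypothetical price per 1,000 tokens;
# real tokenizers and real inference costs differ.

def extra_token_cost(filler_words: int, prompts_per_day: int,
                     cost_per_1k_tokens: float) -> float:
    """Estimated daily cost of polite filler across all prompts."""
    total_tokens = filler_words * prompts_per_day
    return total_tokens / 1000 * cost_per_1k_tokens

# Example: 3 extra words ("please", "thank", "you") per prompt,
# a hypothetical 100 million prompts a day, at $0.01 per 1,000 tokens.
daily = extra_token_cost(filler_words=3, prompts_per_day=100_000_000,
                         cost_per_1k_tokens=0.01)
print(f"${daily:,.0f} per day")          # $3,000 per day
print(f"${daily * 365:,.0f} per year")   # $1,095,000 per year
```

Even with these deliberately modest assumptions, three throwaway words compound into seven figures a year, which is why the real totals Altman describes land in the tens of millions.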
A Washington Post and University of California study recently showed that generating a single 100-word AI email consumes 0.14 kilowatt-hours — enough to light 14 LED bulbs for an hour. Multiply that by the billions of words processed daily across platforms like ChatGPT, Copilot, Bard, and Grok, and the energy toll becomes startling.
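The study's figure is easy to sanity-check and scale. A minimal sketch, using the article's 0.14 kWh number; the 10 W bulb rating and the one-million-emails-per-day volume are assumptions for illustration only.

```python
# Scaling the cited figure: 0.14 kWh to generate one 100-word AI email.

KWH_PER_EMAIL = 0.14   # from the Washington Post / University of California study
LED_BULB_WATTS = 10    # assumption: a typical LED bulb draws ~10 W

# One email's energy lights this many 10 W bulbs for an hour:
bulbs = KWH_PER_EMAIL * 1000 / LED_BULB_WATTS
print(f"{bulbs:.0f} bulbs for an hour")  # matches the article's "14 LED bulbs"

# Hypothetical: one million such emails generated per day.
daily_kwh = KWH_PER_EMAIL * 1_000_000
print(f"{daily_kwh:,.0f} kWh per day")   # 140,000 kWh per day
```

The point is not the exact totals but the slope: per-message energy that looks negligible multiplies linearly with volume, and volume here is in the billions.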
But isn't politeness a good thing?
Interestingly, many designers of AI systems actually encourage polite prompts. Microsoft's Kurtis Beavers notes that "using polite language sets a tone for the response." Generative AI is known to mirror the style and tone of the user, meaning courteous prompts often yield more professional or friendly answers.
In fact, a late-2024 survey found that 67% of Americans are polite to their chatbots, with 12% admitting they do it to "appease the algorithm" in case of an eventual AI uprising. (No judgment.)
What’s the bigger issue here?
At its core, this conversation highlights a more pressing concern: AI isn’t free — environmentally, financially, or ethically.
Data centers, including those powering AI models, are estimated to consume about 2% of global electricity, and that share is only growing. From generating emails to spinning up deepfake videos, our everyday “magic” moments with AI are powered by giant, energy-hungry machines.
And while it’s easy to laugh off the cost of saying “thank you,” it forces a bigger question: Are we prepared for the invisible footprint of our digital habits?
TL;DR: Every word you type into ChatGPT burns a little electricity. Multiply that by millions of people asking millions of questions — and being extra polite — and you're suddenly talking big money and big energy. Politeness may be free in human terms, but when it comes to AI, it’s quietly draining our servers and our planet.
So next time you’re chatting with your favorite bot, maybe... just get to the point?