Polite Prompts Costly for AI: Sam Altman Says “Please” and “Thank You” Add Millions to OpenAI’s Energy Bills
San Francisco, CA — In a lighthearted but telling exchange on X (formerly Twitter), Sam Altman, CEO of OpenAI, revealed that even simple user pleasantries—like saying “please” and “thank you” to ChatGPT—are contributing to tens of millions of dollars in additional energy costs for the company.
This surprising insight underscores a deeper, rapidly escalating issue in the artificial intelligence industry: the mounting infrastructure and power demands of large language models (LLMs) as they scale to meet global usage.
AI Etiquette Comes with a Price
The comment was prompted by a user who asked Altman how much it costs OpenAI when users add polite phrases in their prompts. Altman’s response—“Tens of millions of dollars well spent”—went viral, blending humor with a serious undertone about the computational cost of seemingly trivial interactions.
Each ChatGPT query, even when padded with extra words or niceties, consumes significant resources. According to a Goldman Sachs estimate cited in industry reports, a single ChatGPT query consumes around 2.9 watt-hours of electricity, nearly ten times the energy of a standard Google search.
With ChatGPT now surpassing 150 million weekly active users and handling over a billion queries daily, the cumulative energy usage is staggering—approaching 2.9 million kilowatt-hours per day, comparable to powering hundreds of thousands of homes.
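The arithmetic behind that figure is straightforward. A minimal back-of-the-envelope sketch, using only the estimates quoted above (2.9 Wh per query, a billion queries a day) plus an assumed average household consumption of about 30 kWh per day:

```python
# Back-of-the-envelope estimate of ChatGPT's daily energy use.
# All inputs are the article's cited estimates, not measured values.

WH_PER_QUERY = 2.9                # Goldman Sachs estimate, watt-hours per query
QUERIES_PER_DAY = 1_000_000_000   # "over a billion queries daily"

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_kwh = daily_wh / 1_000      # 2.9 million kWh per day

# Assumption: a typical home draws roughly 30 kWh/day, which puts the
# total on the order of a hundred thousand homes.
homes_equivalent = daily_kwh / 30

print(f"{daily_kwh:,.0f} kWh/day, roughly {homes_equivalent:,.0f} homes")
```

Under those assumptions the daily total works out to 2.9 million kilowatt-hours, consistent with the "hundreds of thousands of homes" comparison.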
Image Generation and AI Overload
Altman’s comment follows a similar call for caution earlier this year during the viral Studio Ghibli-style image generation trend, which caused a surge in demand that reportedly strained OpenAI’s servers.
As AI becomes deeply embedded in daily workflows—from coding and writing to graphic design and scheduling—the conversation around AI sustainability is shifting from niche concern to industry imperative.
“We’re scaling AI models that require massive infrastructure—and the hidden cost isn’t just capital, it’s kilowatts,” said Emad Mostaque, CEO of Stability AI, in a separate interview on AI sustainability.
Energy, AI, and Environmental Implications
While OpenAI has not disclosed specific infrastructure figures, it’s known to operate on Microsoft Azure’s supercomputing backbone, including GPU-intensive clusters powered by NVIDIA’s A100 and H100 chips. These high-performance systems are notoriously energy-intensive, raising both operational costs and carbon footprint concerns.
With increasing scrutiny from environmental groups and policymakers, AI labs such as Google DeepMind, Anthropic, and Meta AI are also being urged to adopt more energy-efficient practices and publish transparent energy usage data.
Some industry voices have floated ideas to reduce AI’s energy footprint, such as:
- Client-side response caching for basic replies like “You’re welcome”
- Model pruning for low-risk, repetitive responses
- Prompt optimization education for users
Even small efficiencies across billions of interactions could equate to meaningful savings—financially and environmentally.
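The first idea on that list, client-side caching of canned pleasantries, can be sketched in a few lines. This is a hypothetical illustration, not anything OpenAI has described: the lookup table, the `normalize` helper, and the `call_model` callback are all invented for the example.

```python
# Hypothetical sketch of client-side response caching: intercept trivial
# pleasantries locally so they never reach the model at all.
import re

# Invented lookup table of canned replies for zero-cost prompts.
CANNED_REPLIES = {
    "thanks": "You're welcome!",
    "thank you": "You're welcome!",
    "thank you very much": "You're welcome!",
}

def normalize(prompt: str) -> str:
    # Strip punctuation and case so "Thanks!!" matches "thanks".
    return re.sub(r"[^\w\s]", "", prompt).strip().lower()

def respond(prompt: str, call_model) -> str:
    """Answer pleasantries from the local cache; otherwise defer to the model."""
    canned = CANNED_REPLIES.get(normalize(prompt))
    if canned is not None:
        return canned          # zero tokens processed server-side
    return call_model(prompt)  # placeholder for the real API call
```

Multiplied across billions of interactions, even a cache hit rate of a few percent on prompts like these would translate into the kind of savings the article describes.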
A Future of More Responsible AI Use
As LLMs continue to power a growing share of human-computer interactions, the tech industry faces mounting pressure to optimize AI both in performance and efficiency. Polite prompts may seem harmless, but they illustrate a broader point: every word, every token processed by AI comes with a real-world cost.