The ChatGPT politeness cost is surprisingly high. OpenAI is reportedly spending millions each month because users say “please” and “thank you” to ChatGPT: the extra words add tokens to process, and a “thank you” typically triggers a whole additional reply. This politeness-driven load highlights the unintended financial and environmental impact of user behavior. As discussions around AI usage grow, the ChatGPT politeness cost offers a quirky but real look at how simple words can have big consequences.

The mechanism behind this charming expense lies in how AI systems like ChatGPT work. Every interaction you have with it gets broken down into “tokens”—tiny pieces of text the AI has to process. When you add polite phrases to your messages, those extra tokens require more computing power. Even more significant is the AI’s tendency to reply to gratitude. Say “thank you,” and ChatGPT will almost always respond with “You’re welcome” or something equally well-mannered. That means one more processed query, one more computation cycle, and more energy used.
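To make the token mechanics concrete, here is a minimal sketch. Real systems use subword tokenizers (such as BPE), so the counts below are only a crude ~1.3-tokens-per-word heuristic invented for illustration; the phrases and the `estimate_tokens` helper are hypothetical, not anything from OpenAI’s actual pipeline.

```python
# Crude illustration of how politeness inflates token counts.
# Real tokenizers are subword-based (e.g. BPE); this word-count
# heuristic is purely for illustration.

def estimate_tokens(text: str) -> int:
    """Rough estimate: ~1.3 tokens per whitespace-separated word."""
    return int(len(text.split()) * 1.3)

blunt = "Summarize this article."
polite = "Could you please summarize this article? Thank you!"

# Extra tokens just from phrasing the same request politely:
extra_prompt_tokens = estimate_tokens(polite) - estimate_tokens(blunt)

# A follow-up "thank you" triggers an entire additional exchange:
followup = "Thank you!"
reply = "You're welcome! Happy to help."
extra_exchange_tokens = estimate_tokens(followup) + estimate_tokens(reply)

print(extra_prompt_tokens, extra_exchange_tokens)  # prints: 7 8
```

The second number is the key one: the courtesy round-trip costs a whole extra query-and-response cycle, not just a few extra words on the original prompt.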

With hundreds of millions of users engaging daily, even a slight bump in query volume leads to a serious increase in operating costs. Estimates suggest the extra politeness could be adding around 8% more queries across the platform—translating into an additional $40–50 million in expenses each month. This includes the power needed to run high-performance servers and the water used to cool them. It’s an oddly modern problem: kindness is making our data centers sweat.
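The back-of-envelope arithmetic behind those figures can be sketched as follows, combining the article’s numbers (~1 billion queries a day, ~8% politeness-driven, $40–50 million a month in extra cost). The midpoint and the 30-day month are assumptions for illustration only.

```python
# Back-of-envelope arithmetic using the figures cited in this article.
# All values are estimates, not official OpenAI numbers.

daily_queries = 1_000_000_000        # ~1B queries/day (per the article)
politeness_share = 0.08              # ~8% extra queries from politeness
extra_monthly_cost_usd = 45_000_000  # midpoint of the $40-50M estimate

extra_queries_per_month = daily_queries * politeness_share * 30
implied_cost_per_extra_query = extra_monthly_cost_usd / extra_queries_per_month

print(f"{extra_queries_per_month:,.0f} extra queries/month")
print(f"~${implied_cost_per_extra_query:.3f} per extra query")
```

Under these assumptions, politeness adds roughly 2.4 billion queries a month, implying a cost on the order of two cents per extra query once compute, power, and cooling are folded in.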

OpenAI CEO Sam Altman has commented on the phenomenon, confirming the impact and brushing it off with a kind of philosophical shrug. “Tens of millions of dollars well spent,” he said in a social media reply earlier this year. His response suggests OpenAI isn’t planning to discourage politeness—despite its unexpected cost—perhaps because it’s good for user experience, or because it says something hopeful about human behavior in the digital age.

Still, the environmental footprint can’t be ignored. Each AI query requires energy, and cooling those massive data centers requires a significant amount of water. Research from the Allen Institute for AI and others shows that even a single interaction with a large language model can use enough electricity to power multiple LED bulbs for an hour. Multiply that by billions of queries, and you start to get a sense of just how much juice all those “pleases” and “thank yous” require.
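Scaling that per-query claim up is simple arithmetic. The bulb wattage and the reading of “multiple” below are assumptions made only to put a number on the article’s comparison; they are not figures from the research itself.

```python
# Scaling the per-query energy comparison above. Assumptions (not from
# the article): an LED bulb draws ~10 W, and "multiple" means 2 bulbs,
# so one query ~= 20 Wh. Illustrative arithmetic only.

bulb_watts = 10                        # assumed LED bulb power draw
bulbs = 2                              # assumed meaning of "multiple"
wh_per_query = bulb_watts * bulbs * 1  # watts x 1 hour -> watt-hours

daily_queries = 1_000_000_000          # ~1B queries/day, per the article
daily_gwh = daily_queries * wh_per_query / 1e9  # Wh -> GWh

print(f"~{daily_gwh:.0f} GWh/day")  # prints: ~20 GWh/day
```

Even if the real per-query figure is an order of magnitude smaller, billions of daily queries keep the aggregate energy bill substantial.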

This unexpected cost is especially ironic given that OpenAI, like many tech companies, is racing to improve efficiency and reduce environmental impact. GPT-4o, for instance, consumes less energy per query than earlier versions, but the volume of use continues to rise—thanks in part to a surge in user politeness. According to usage statistics from May 2025, over 1 billion queries are made daily, from about 122 million users. That’s a lot of digital chitchat—and a surprising amount of it is cordial.

From a financial standpoint, the politeness expense comes at a time when OpenAI is facing rising operational costs. In 2024, the company reported projected losses of $5 billion on $3.7 billion in revenue. With AI’s popularity continuing to soar, balancing cost, performance, and accessibility is an ongoing challenge. Yet, despite the financial pressures, OpenAI appears willing to absorb the cost of niceties.

So what’s the takeaway here? If you’ve ever felt silly thanking an AI, don’t. Apparently, you’re in good company—and your manners are making a noticeable impact. Whether it’s a cost or an investment depends on your perspective. For now, OpenAI seems to think it’s worth it. After all, in a world where online interaction can often turn toxic, an extra “thank you” here and there may be a small price to pay for keeping the internet just a bit more civil—even if it’s not as free as we thought.
