
What happens when we say "thank you" for AI-generated content? It turns out that many users thank generative AI after it carries out their instructions. When millions of people do this, however, the costs add up: every expression of gratitude travels across networks and through data-center servers, consuming energy along the way. Nor does the exchange end with the user's message. The AI typically replies that it is always happy to help, and each additional round trip adds to the overall energy consumption.
Fuel for an Energy-Hungry Machine
Sam Altman, OpenAI’s CEO, recently voiced his concerns on X (formerly Twitter). When a user posted, “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models,” Altman responded, “Tens of millions of dollars well spent.” He emphasized that longer requests and more frequent interactions translate into higher server loads and power consumption. “You never know,” Altman added.
Altman Warns of Unpredictable Consequences
Even routine digital tasks consume power once AI is involved. According to The Washington Post, generating a 100-word email with an AI chatbot uses about 0.14 kilowatt-hours (kWh) of electricity. A study by the University of California, Riverside found that having a large language model (LLM) generate a simple “You are welcome” uses about 40-50 milliliters of water, largely for cooling the servers involved. Data centers serving AI chatbots already account for roughly 2% of global electricity use.
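To put these per-message figures in perspective, a rough back-of-envelope calculation can show how quickly short polite exchanges scale. The sketch below uses the 0.14 kWh-per-100-word figure cited above; the reply length, daily message volume, and electricity price are purely illustrative assumptions, not numbers reported anywhere in this article.

```python
# Back-of-envelope estimate of the electricity used by polite AI exchanges.
# The 0.14 kWh per 100-word response is the figure cited in the article;
# everything else below is an illustrative assumption.

KWH_PER_100_WORD_RESPONSE = 0.14   # kWh to generate a ~100-word AI reply (cited above)
WORDS_IN_POLITE_REPLY = 10         # assumption: "You're welcome!"-style replies are short
PRICE_PER_KWH_USD = 0.12           # assumption: a typical US retail electricity price

def polite_exchange_cost(daily_exchanges: int, days: int = 365) -> tuple[float, float]:
    """Return (total kWh, total USD) for short thank-you exchanges,
    scaling the 100-word figure linearly down to the assumed reply length."""
    kwh_per_reply = KWH_PER_100_WORD_RESPONSE * (WORDS_IN_POLITE_REPLY / 100)
    total_kwh = kwh_per_reply * daily_exchanges * days
    return total_kwh, total_kwh * PRICE_PER_KWH_USD

if __name__ == "__main__":
    # Hypothetical scenario: 10 million polite exchanges per day for a year.
    kwh, usd = polite_exchange_cost(daily_exchanges=10_000_000)
    print(f"~{kwh:,.0f} kWh per year, roughly ${usd:,.0f} in electricity")
```

Under these assumed numbers, even very short replies add up to tens of millions of kWh a year, which is consistent with the order of magnitude Altman alluded to.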
The power consumption issue Altman highlighted is likely to worsen, since users tend to prefer polite language when interacting with AI. A survey conducted by the global media group Future PLC at the end of last year found that 67% of American respondents use polite language with chatbots.