It may seem cute that so many people treat chatbots with respect, yet the courtesy we sprinkle on our prompts comes with a real-world price. As AI becomes more ingrained in our daily lives, it may be time to re-examine these small habits of politeness.
Truth be told, being nice to ChatGPT could be costly, and not by small change. OpenAI CEO Sam Altman has admitted that the extra inference involved when users say “please” and “thank you” to its AI assistants costs the company tens of millions of dollars. He mentioned this in reply to a post on X discussing the energy cost of unnecessary politeness. Altman said,
“Tens of millions of dollars well spent. You never know.”
AI Etiquette
While Altman shrugs off the cost, some in the tech world argue that civility has its uses, even where machines are concerned. According to Microsoft design lead Kurtis Beavers, polite language tends to elicit correspondingly respectful and constructive responses. He noted,
“Proper etiquette helps generate respectful, collaborative outputs. Using polite language sets a tone for the response”
This was echoed in a Microsoft WorkLab memo stating,
“When it clocks politeness, it’s more likely to be polite back. Generative AI also mirrors the levels of professionalism, clarity, and detail in the prompts you provide.”
In other words, a little give-and-take applies even if your chatbot cannot actually feel anything.
The Environmental Impact Behind a Simple Prompt
In the shadow of this courteous digital charm lies a real-world cost. A Washington Post investigation conducted in collaboration with the University of California found that generating a single 100-word AI email requires about 0.14 kilowatt-hours (kWh) of energy, enough to power 14 LED light bulbs for an hour. Send one such AI email every week for a year and you would consume roughly 7.5 kWh, about as much electricity as nine Washington DC households use in an hour. Multiply that by millions of daily prompts, and you have a gloomy picture of energy drain. Data centers that support generative AI already consume almost 2% of the world’s electricity, and that number is only growing.
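For readers who want to check the arithmetic, the figures hold together if you assume a typical LED bulb draws about 10 watts (a detail the investigation does not spell out):

14 bulbs × 10 W × 1 hour = 140 Wh ≈ 0.14 kWh per email
0.14 kWh × 52 weeks ≈ 7.3 kWh per year, in line with the roughly 7.5 kWh figure above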
A Worthy Tradeoff
A survey conducted in 2024 found that 67% of American AI users are polite to their bots: 55% said they do it “because it’s the right thing to do,” while 12% do so to appease the algorithm in case of an AI revolt. That revolt is probably a long way off, if it ever happens at all; many AI researchers doubt we’ll ever build a truly “intelligent” algorithm, at least with the current technology of large language models (LLMs). But the environmental consequences of today’s AI are all too real.
Nevertheless, as concern grows over AI’s expanding carbon footprint, it may, ironically, be time to reconsider our kindness. A little less civility might translate into a little more sustainability. Let’s not be rude, but if saving the planet means skipping the occasional ‘thank you’ to your chatbot, it may be a tradeoff worth making.