OpenAI’s CEO, Sam Altman, revealed that the company’s energy costs are significantly impacted by users being polite to its ChatGPT chatbot, with the extra processing burden resulting in “tens of millions of dollars” in expenses.
The conversational AI has become so lifelike that many users feel inclined to use polite language when interacting with it, without realizing that phrases like “please” and “thank you” require additional processing power to handle. As Altman noted, “you never know” whether that politeness will ever pay off, but it clearly comes at a cost.
A survey conducted in the U.S. last year found that 67% of respondents reported being polite to AI chatbots, while 33% preferred to be more direct. This politeness may be adding not only to OpenAI’s energy bill but also to its environmental footprint, since most data centers are still powered by electricity generated from fossil fuels.
Research has also shown that the level of politeness can affect the quality of the responses generated by large language models (LLMs) like ChatGPT. Impolite prompts can degrade model performance, producing responses with more mistakes, stronger biases, and omitted information.
A TechRadar reporter who experimented with being less courteous to ChatGPT found that the responses “seemed less helpful.” This raises questions about the potential consequences of being less polite to AI chatbots, not just for the companies operating them but also for human interactions.
If being blunt to AI chatbots becomes socially acceptable, it could bleed into interpersonal interactions, making human exchanges less courteous over time. As a result, users may need to weigh the benefits of being polite to AI against the environmental and financial costs.