
Saying ‘thank you’ and ‘please’ to ChatGPT is costing OpenAI ‘millions of dollars’
In a recent revelation, OpenAI CEO Sam Altman has shed light on a peculiar issue that has been costing the company a significant amount of money. According to Altman, using phrases like ‘please’ and ‘thank you’ with ChatGPT, their popular AI model, is costing OpenAI “tens of millions of dollars”. This astonishing statement has sent shockwaves across the tech community, leaving many wondering how such a seemingly innocuous activity could have such a significant financial impact.
The remark came in response to a user’s question about how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to its models. Altman’s tongue-in-cheek reply, “Tens of millions of dollars well spent,” suggests that the company is not only aware of the issue but also finds it amusing. When pressed for further clarification, Altman added, “You never know,” implying that the true extent of the costs is likely even higher.
So, what’s behind this surprising revelation? To understand the issue, it helps to grasp the underlying technology behind ChatGPT. ChatGPT is a large language model trained on massive amounts of text data, allowing it to generate human-like responses. Generating those responses, however, requires significant computational resources, which, in turn, consume a substantial amount of electricity.
When users interact with ChatGPT, they often add phrases like ‘please’ and ‘thank you’, which may seem harmless but add to the work the model has to do. Each extra word becomes additional tokens the model must process, which, in turn, increases the computational load on OpenAI’s servers. That extra load translates into higher electricity costs, a bill OpenAI ends up footing.
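As a rough illustration of why extra words mean extra work, here is a minimal sketch using tiktoken, OpenAI’s open-source tokenizer library. The model name and example prompts are assumptions chosen for illustration; the actual per-token cost inside OpenAI’s infrastructure is not public.

```python
# Minimal sketch: counting how many extra tokens politeness adds to a prompt.
# Assumes the tiktoken package is installed (pip install tiktoken); the model
# name and prompts below are illustrative, not taken from OpenAI's systems.
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4o")

terse = "Summarize this article."
polite = "Could you please summarize this article? Thank you!"

terse_tokens = len(enc.encode(terse))
polite_tokens = len(enc.encode(polite))

# Every token in the prompt has to be processed by the model at inference
# time, so the polite phrasing carries a small but real compute overhead.
print(f"terse:  {terse_tokens} tokens")
print(f"polite: {polite_tokens} tokens")
print(f"extra tokens from politeness: {polite_tokens - terse_tokens}")
```

On its own the difference is only a handful of tokens; the point is that each of those tokens still has to pass through the model’s servers.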
The scale of the issue becomes apparent when considering the sheer volume of interactions ChatGPT receives daily. With millions of users interacting with the model, even a small increase in computational load can add up quickly. It’s estimated that ChatGPT handles over 10 million requests per day, making it one of the most popular AI models in the world.
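To see how even a tiny per-message overhead compounds at that scale, here is a back-of-envelope sketch. Both inputs are assumptions made for illustration (the extra-token count is a guess, and the daily request figure simply reuses the estimate above); neither comes from OpenAI.

```python
# Back-of-envelope: how a few extra "polite" tokens per request compound.
# Every number here is an illustrative assumption, not an OpenAI figure.
extra_tokens_per_request = 5          # e.g. "please", "thank you", punctuation
requests_per_day = 10_000_000         # the daily request estimate cited above

extra_per_day = extra_tokens_per_request * requests_per_day
extra_per_year = extra_per_day * 365

print(f"extra tokens per day:  {extra_per_day:,}")   # 50,000,000
print(f"extra tokens per year: {extra_per_year:,}")  # 18,250,000,000
```

Under those assumptions, politeness alone adds billions of tokens of processing per year, all of which has to be served from data centers that OpenAI pays to power.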
While the costs associated with saying ‘please’ and ‘thank you’ may seem trivial at first glance, it’s worth considering the broader implications. OpenAI’s decision to absorb these costs highlights the company’s commitment to providing a seamless user experience, even if it means sacrificing some of its own profits.
In an era where AI is rapidly transforming industries and revolutionizing the way we live and work, the costs associated with using ChatGPT are a sobering reminder of the significant resources required to drive innovation forward. As AI continues to play an increasingly important role in our lives, it’s crucial that we recognize the value of the computational resources required to power these models.
In conclusion, Sam Altman’s remark highlights the remarkable scale and complexity of OpenAI’s operations. The fact that phrases like ‘please’ and ‘thank you’ are costing the company “tens of millions of dollars” serves as a reminder of the significant resources required to power AI models like ChatGPT. While the costs may seem high, they are a small price to pay for the innovative solutions and services that AI models like ChatGPT provide.