The environmental cost of a ChatGPT query, according to OpenAI's CEO

The Star Online - Tech·2025-06-18 11:01

What is the environmental impact of using large language models such as ChatGPT? It's difficult to say, although several studies on the subject have already been conducted. OpenAI founder Sam Altman has now provided a very precise estimate, but how does that stack up against other experts' calculations?

What's the environmental cost of a single query on ChatGPT? This question has been on many people's minds since the creation of OpenAI's famous AI chatbot and, more generally, the advent of large language models (LLMs). It's a question that Sam Altman, CEO of OpenAI (the company behind ChatGPT), recently answered in a post on his personal blog. "People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon," writes the CEO. He adds: "As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity."

However, Sam Altman's estimate does not address the increasingly widespread use of these rapidly expanding tools. When asked directly, ChatGPT itself points out that while a single query may have a smaller environmental impact than most common digital uses, multiplied by billions of daily interactions the footprint can quickly become significant, OpenAI's chatbot says.
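To put the scale in perspective, here is a rough back-of-the-envelope calculation using Altman's per-query figures. It is a minimal sketch only: the per-query energy and water numbers come from his blog post, while the assumed daily query volume of one billion is an illustrative figure, not one published by OpenAI.

```python
# Back-of-the-envelope scaling of Altman's per-query figures.
# Per-query numbers are from Sam Altman's blog post; the daily query
# volume is an illustrative assumption, not an OpenAI figure.

ENERGY_PER_QUERY_WH = 0.34        # watt-hours per query (Altman)
WATER_PER_QUERY_GAL = 0.000085    # gallons of water per query (Altman)

ASSUMED_QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion queries a day

daily_energy_mwh = ENERGY_PER_QUERY_WH * ASSUMED_QUERIES_PER_DAY / 1_000_000
daily_water_gal = WATER_PER_QUERY_GAL * ASSUMED_QUERIES_PER_DAY

print(f"Energy: {daily_energy_mwh:,.0f} MWh per day")    # ~340 MWh per day
print(f"Water:  {daily_water_gal:,.0f} gallons per day")  # ~85,000 gallons per day
```

Under that assumption, the "negligible" 0.34 watt-hours per query adds up to several hundred megawatt-hours and tens of thousands of gallons of water every day.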

4,300 return flights between Paris and New York

Two years ago, the Greenly app (which allows companies to assess their CO2 emissions in real time) estimated that the overall carbon footprint of the first version of ChatGPT could be around 240 tonnes of CO2e, equivalent to 136 round trips between Paris and New York City. Training alone was estimated to account for 99% of total emissions, or 238 tCO2e per year. In detail, operating electricity accounts for about two-thirds of that footprint (i.e. 160 tCO2e), followed by server manufacturing (68.9 tCO2e) and refrigerant gas leakage (9.6 tCO2e), the report says.

A more recent analysis, also conducted by Greenly, looked at the overall environmental cost of the new version of ChatGPT. It estimates that if an organization used the tool to respond to one million emails per month, ChatGPT-4 would generate 7,138 tonnes of CO2e per year, split between training and use of the model. This would be equivalent to 4,300 round-trip flights between Paris and New York.
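As a rough sanity check on the scale of that estimate, dividing the annual figure by the email volume in Greenly's scenario gives an implied footprint per email. This is a sketch using only the numbers quoted above; it does not separate the training and usage shares.

```python
# Rough per-email footprint implied by Greenly's estimate for ChatGPT-4.
# Both figures are taken from the article; the calculation is illustrative only.

ANNUAL_EMISSIONS_TCO2E = 7_138   # tonnes of CO2e per year (Greenly estimate)
EMAILS_PER_MONTH = 1_000_000     # scenario assumed in the Greenly analysis

emails_per_year = EMAILS_PER_MONTH * 12
kg_per_email = ANNUAL_EMISSIONS_TCO2E * 1_000 / emails_per_year

print(f"~{kg_per_email:.2f} kg CO2e per email")  # roughly 0.6 kg CO2e per email
```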

US researchers at the prestigious Massachusetts Institute of Technology estimate that training several large AI language models can emit the equivalent of five times the emissions of an average US car over its entire life cycle (including manufacturing).

The environmental cost of these rapidly expanding technologies is therefore a crucial issue. It is with this in mind that smaller AI models, which are more efficient, cheaper and less energy-intensive, are now gaining ground. – AFP Relaxnews
