
The Unseen Impact: GPT-5’s Energy Use Sparks Calls for Transparency

by admin477351

OpenAI’s GPT-5 is being hailed as a major breakthrough, but its release is also highlighting a critical, and often hidden, issue: its immense energy consumption. With no official data from the company, independent researchers are providing the first real glimpse into the model’s power demands. Their findings suggest that the enhanced capabilities of GPT-5 come at a steep and unprecedented environmental cost, leading to urgent calls for greater transparency from the AI industry.
The numbers are alarming. Researchers at the University of Rhode Island’s AI lab have found that a medium-length response from GPT-5 consumes an average of 18 watt-hours. This is a substantial increase over previous models and represents “significantly more energy than GPT-4o,” according to a researcher in the group. To put this in perspective, 18 watt-hours is enough to run a standard 60-watt incandescent light bulb for 18 minutes. Given that ChatGPT handles billions of requests daily, GPT-5’s total energy consumption could approach the daily electricity demand of 1.5 million US homes, a staggering figure that underscores the scale of the problem.
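These headline figures can be sanity-checked with simple arithmetic. The short Python sketch below reproduces the comparison, assuming roughly 2.5 billion requests per day (a stand-in for the “billions” cited above) and an average US household consumption of about 29 kWh per day; both inputs are illustrative assumptions, not figures published by OpenAI or the researchers.

```python
# Back-of-envelope check of the figures cited above.
# All inputs are illustrative assumptions, not official data.

WH_PER_RESPONSE = 18        # average energy per medium-length GPT-5 response (Wh)
DAILY_REQUESTS = 2.5e9      # assumed daily ChatGPT requests ("billions")
HOME_KWH_PER_DAY = 29       # assumed average US household electricity use

# Bulb comparison: an 18 Wh budget runs a 60 W incandescent bulb for 18 minutes.
bulb_watts = 60
minutes = WH_PER_RESPONSE / bulb_watts * 60
print(f"60 W bulb runtime: {minutes:.0f} minutes")

# Fleet-level estimate.
total_kwh = WH_PER_RESPONSE * DAILY_REQUESTS / 1000
homes = total_kwh / HOME_KWH_PER_DAY
print(f"Total: {total_kwh / 1e6:.1f} GWh/day, about {homes / 1e6:.1f} million US homes")
```

Under these assumptions the daily total comes to roughly 45 GWh, which is indeed on the order of 1.5 million US homes’ worth of electricity.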
A key factor driving this dramatic increase is the model’s size. Although OpenAI has not released a parameter count for GPT-5, experts believe it is “several times larger than GPT-4.” This is consistent with a study from the French AI company Mistral, which found a “strong correlation” between a model’s size and its resource consumption. The study concluded that a model ten times bigger would have an impact one order of magnitude (roughly ten times) larger, implying that resource use scales approximately in proportion to model size. This suggests that the trend of building ever-larger AI models will continue to drive up resource usage at an alarming rate.
The new capabilities of GPT-5 also play a significant role in its high energy demands. Its advanced “reasoning mode” and its ability to process video and images require more intensive computation. A professor studying the resource footprint of AI models noted that using the reasoning mode could increase resource usage by a factor of “five to 10.” So while GPT-5’s “mixture-of-experts” architecture, which activates only a subset of the model’s parameters for each query, recovers some efficiency, the newer and more complex tasks are pushing the overall energy footprint to new heights. The urgent calls for transparency from AI developers are a direct response to this growing environmental concern.
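To see how quickly that multiplier compounds, the small sketch below applies the quoted 5–10× reasoning-mode factor to the 18 watt-hour baseline; the numbers are illustrative arithmetic, not measurements.

```python
# Illustrative only: apply the quoted 5-10x reasoning-mode multiplier to the
# 18 Wh baseline measured for a medium-length GPT-5 response.
BASELINE_WH = 18
BULB_WATTS = 60  # the same 60 W incandescent bulb used in the comparison above

for factor in (5, 10):
    wh = BASELINE_WH * factor
    hours = wh / BULB_WATTS
    print(f"{factor}x reasoning mode: {wh} Wh per response "
          f"(~{hours:.1f} h of bulb runtime)")
```

At those multipliers a single response would land in the 90–180 watt-hour range, equivalent to running that bulb for one and a half to three hours.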
