OpenAI’s latest creation, GPT-5, has been celebrated for its advanced capabilities, but a growing number of experts are drawing attention to what isn’t being celebrated: its massive energy consumption. Despite the company’s silence on the matter, researchers are working to quantify the model’s resource use, revealing a startling increase in power demand. These findings suggest that the quest for more powerful AI is driving a hidden and potentially unsustainable environmental cost, raising a red flag for the future of the technology.
The scale of the problem is becoming clearer thanks to independent analysis. The University of Rhode Island’s AI lab has benchmarked GPT-5, finding that a medium-length response consumes an average of 18 watt-hours. This is a substantial leap from past models, consuming “significantly more energy than GPT-4o,” according to a researcher in the group. To put this in a daily context, with ChatGPT handling billions of requests, this level of energy use could equate to the daily electricity needs of 1.5 million American households. The sheer magnitude of this consumption is a stark reminder of the physical resources required to run today’s most advanced AI.
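The household comparison can be sanity-checked with back-of-envelope arithmetic. In this sketch, only the 18 watt-hour figure comes from the URI benchmark; the daily request volume (2.5 billion) and the average US household draw (about 29 kWh per day) are assumed illustrative values, not figures from the article.

```python
# Back-of-envelope check of the "1.5 million households" comparison.
# ASSUMPTIONS: 2.5 billion requests/day and ~29 kWh/day per average US
# household are illustrative estimates; only 18 Wh/response is from the
# URI benchmark cited in the text.

WH_PER_RESPONSE = 18            # watt-hours per medium-length GPT-5 reply
REQUESTS_PER_DAY = 2.5e9        # assumed daily ChatGPT request volume
HOUSEHOLD_WH_PER_DAY = 29_000   # assumed average US household, ~29 kWh/day

total_wh = WH_PER_RESPONSE * REQUESTS_PER_DAY       # 4.5e10 Wh = 45 GWh/day
households = total_wh / HOUSEHOLD_WH_PER_DAY        # ~1.55 million

print(f"Total: {total_wh / 1e9:.1f} GWh/day")
print(f"Equivalent households: {households / 1e6:.2f} million")
```

Under these assumptions the total works out to roughly 45 GWh per day, or about 1.55 million households, in line with the figure quoted above.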
The primary driver of this increased energy use is the model’s size. Without an official parameter count from OpenAI, experts must rely on industry trends and other disclosures. A recent study by French AI firm Mistral established a clear correlation between a model’s size and its energy consumption, noting that a model ten times bigger would have an impact one order of magnitude larger. Based on this, and the belief that GPT-5 is significantly larger than its predecessors, experts predict its resource footprint is “orders of magnitude higher” than even models from just a few years ago.
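The Mistral finding amounts to a simple linear rule: energy per query scales roughly in proportion to model size. The sketch below illustrates that relationship; the 2 Wh baseline for a hypothetical smaller model is an invented placeholder, not a measured value.

```python
# Sketch of the Mistral-style scaling rule: a model N times bigger has
# roughly N times the per-query energy impact. The 2.0 Wh baseline is a
# hypothetical placeholder, not a measured figure.

def scaled_energy_wh(baseline_wh: float, size_ratio: float) -> float:
    """Linear scaling: a model `size_ratio` times bigger uses
    `size_ratio` times the energy per query."""
    return baseline_wh * size_ratio

print(scaled_energy_wh(2.0, 10))    # 10x the size -> 20.0 Wh
print(scaled_energy_wh(2.0, 100))   # 100x -> two orders of magnitude more
```

This is why a model believed to be much larger than its predecessors can plausibly carry a footprint “orders of magnitude higher” even before new features are considered.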
The architectural design of GPT-5 also plays a significant role. While its “mixture-of-experts” system offers some efficiency, its new features, particularly the reasoning mode and multimodal capabilities (handling video and images), likely offset these gains. A researcher noted that engaging the reasoning mode could raise resource use “several times higher, five to 10,” to arrive at the same answer. This suggests that as AI becomes more sophisticated and capable of complex tasks, its energy demands will continue to climb, making the need for transparency and efficiency measures more critical than ever.
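Applying the quoted 5 to 10 times multiplier to the URI benchmark’s 18 Wh baseline gives a rough range for a reasoning-mode response. The multiplier endpoints come from the researcher’s estimate; the rest is simple arithmetic.

```python
# Rough range for a reasoning-mode response: the 5-10x multiplier is the
# researcher's estimate; 18 Wh is the URI benchmark baseline.

BASELINE_WH = 18  # medium-length GPT-5 response (URI benchmark)

low, high = 5 * BASELINE_WH, 10 * BASELINE_WH
print(f"Reasoning-mode range: {low}-{high} Wh per response")  # 90-180 Wh
```

On this estimate, a single reasoning-mode response would fall between 90 and 180 watt-hours, roughly the energy a laptop draws over several hours of use.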