The much-anticipated launch of OpenAI’s GPT-5 is a testament to technological progress, but it also pushes a sobering question to the forefront: how much energy does the model actually consume? As the company remains notably silent on the issue, experts are raising alarms. They argue that the model’s enhanced capabilities, such as building websites and answering PhD-level questions, come at a steep environmental cost. The lack of transparency from a major AI developer is forcing a broader conversation about the industry’s commitment to sustainability.
A study from the University of Rhode Island’s AI lab provides a key piece of evidence for these concerns. Researchers found that generating a single medium-length response of about 1,000 tokens with GPT-5 consumes roughly 18 watt-hours on average, a dramatic increase over previous models. For perspective, 18 watt-hours is enough to run a traditional 60-watt incandescent bulb for 18 minutes. And because a service like ChatGPT fields billions of requests a day, the aggregate consumption could be staggering, potentially approaching the daily electricity demand of millions of homes.
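The arithmetic behind that last claim is easy to check. In the back-of-the-envelope sketch below, only the 18 watt-hours per query comes from the study; the request volume and the per-household figure are illustrative assumptions, not reported numbers:

```python
# Back-of-the-envelope estimate of ChatGPT's aggregate daily energy use.
# Only WH_PER_QUERY comes from the URI study; the other two constants
# are illustrative assumptions.

WH_PER_QUERY = 18          # URI study: avg. energy per ~1,000-token GPT-5 response
QUERIES_PER_DAY = 2.5e9    # assumption: "billions of requests daily"
HOME_KWH_PER_DAY = 30      # assumption: rough daily use of a typical US home

total_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000   # Wh -> kWh
homes_equiv = total_kwh / HOME_KWH_PER_DAY

print(f"Aggregate: {total_kwh / 1e6:.1f} GWh per day")            # ~45.0 GWh/day
print(f"Roughly {homes_equiv / 1e6:.1f} million homes' demand")   # ~1.5 million
```

Even if the true request volume is a few times lower, the total still lands in the range of a small city’s daily electricity use.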
The increase in energy use is directly linked to the model’s size and complexity. Experts believe GPT-5 is significantly larger than its predecessors, with far more parameters. That squares with a study by the French AI company Mistral, which found a strong, roughly linear correlation between a model’s size and its energy consumption: a model ten times bigger, the study concluded, has an impact roughly an order of magnitude larger. The pattern appears to hold for GPT-5, with some experts suggesting its resource use could be “orders of magnitude higher” than even GPT-3’s.
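In other words, the Mistral finding amounts to a simple proportional scaling rule: per-query energy grows in step with parameter count. A minimal sketch of that relationship follows; the parameter counts and baseline energy in it are hypothetical placeholders, since OpenAI has not disclosed GPT-5’s size:

```python
def projected_energy_wh(base_energy_wh: float,
                        base_params: float,
                        new_params: float) -> float:
    """Project per-query energy under the roughly linear size-to-energy
    scaling reported by Mistral: 10x the parameters -> ~10x the impact."""
    return base_energy_wh * (new_params / base_params)

# Hypothetical numbers purely for illustration -- OpenAI has not
# published parameter counts for its recent models.
print(projected_energy_wh(base_energy_wh=2.0,    # assumed baseline model: 2 Wh/query
                          base_params=200e9,     # assumed 200B parameters
                          new_params=1.8e12))    # assumed 1.8T parameters
# -> 18.0: a ~9x larger model implies a ~9x larger per-query footprint
```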
Compounding the issue is the new model’s architecture. GPT-5 does use a “mixture-of-experts” design, which activates only a fraction of its parameters on each query to save compute, but its reasoning capabilities and its handling of images and video likely counteract those gains. Its “reasoning mode,” in which the model computes for longer before producing an answer, could make the footprint of a single query several times greater than a text-only response. Together, the model’s size, complexity, and advanced features paint a clear picture of an AI system with a voracious appetite for power, fueling urgent calls for greater transparency from OpenAI and the broader AI industry.
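The reasoning-mode penalty is intuitive once you note that energy tracks the total tokens a model processes, and a reasoning model emits hidden “thinking” tokens before the visible answer. The sketch below illustrates the effect; the per-token rate is implied by the study’s 18 Wh figure, while the hidden-token count is an assumption chosen only to show the shape of the problem:

```python
# Rough illustration of why reasoning mode multiplies the footprint:
# energy scales with total tokens processed, and reasoning mode adds
# hidden chain-of-thought tokens on top of the visible answer.
# All numbers are assumptions for illustration only.

WH_PER_TOKEN = 0.018        # implied by ~18 Wh per 1,000-token response

visible_tokens = 1_000      # the answer the user actually sees
reasoning_tokens = 4_000    # assumption: hidden "thinking" tokens

plain = visible_tokens * WH_PER_TOKEN
with_reasoning = (visible_tokens + reasoning_tokens) * WH_PER_TOKEN

print(f"text-only: {plain:.0f} Wh, reasoning mode: {with_reasoning:.0f} Wh")
# -> text-only: 18 Wh, reasoning mode: 90 Wh, i.e. "several times greater"
```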