Reporters set out to pin down one basic figure: how much energy a leading model uses to generate a single response. The number proved elusive because the companies hold the data. After months of reporting, OpenAI’s Sam Altman said in June that an average ChatGPT query uses 0.34 watt-hours, Google reported in August about 0.24 watt-hours for a Gemini question, and the French startup Mistral released an emissions estimate for its models. Those public figures roughly matched the article’s earlier estimates for medium-size models, but they arrived in blog posts and company statements rather than detailed technical disclosures.
Sources told the reporters that the published numbers are vague and limited. OpenAI attached no technical paper to its figure, leaving the model identity, measurement method, and variance unspecified. Google’s figure is a median per-query number, which obscures the higher-energy responses produced when models run more complex reasoning or generate long outputs. The published measures also cover only chat interactions, not other modalities; Sasha Luccioni of Hugging Face emphasized the need for energy and emissions data on image and video generation as those uses grow. The per-query totals are small on their face, comparable to running a microwave for a second or two, which helps explain why researchers do not treat individual use as a major direct climate burden.
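The microwave comparison is easy to verify with a unit conversion. Here is a minimal back-of-envelope sketch: the per-query watt-hour figures come from the company statements above, while the 1,000-watt microwave rating is an assumed typical value chosen for illustration.

```python
# Back-of-envelope check on the per-query figures cited above.
# The 0.34 Wh (ChatGPT) and 0.24 Wh (Gemini) numbers come from the
# companies' statements; the 1,000 W microwave rating is an assumption.

MICROWAVE_WATTS = 1_000  # assumed typical microwave power draw

def microwave_seconds(query_wh: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave at `watts` must run to consume `query_wh` watt-hours."""
    return query_wh / watts * 3600  # Wh -> watt-seconds (joules), then / W = s

for name, wh in [("ChatGPT (OpenAI)", 0.34), ("Gemini (Google)", 0.24)]:
    print(f"{name}: {wh} Wh ~= microwave running for {microwave_seconds(wh):.1f} s")

# ChatGPT (OpenAI): 0.34 Wh ~= microwave running for 1.2 s
# Gemini (Google): 0.24 Wh ~= microwave running for 0.9 s
```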
The reporting also highlights the tension between rising infrastructure demand and corporate sustainability commitments. Microsoft disclosed that its emissions have risen more than 23 percent since 2020, driven in part by artificial intelligence, even as the company maintains a pledge to reach net-negative emissions. Companies argue that artificial intelligence will unlock efficiencies that offset its energy use, but the article notes a lack of transparent evidence that such benefits currently outweigh the rapid growth in electricity demand. Anecdotal successes, such as using models to detect methane hotspots, have not been quantified to show net reductions against the new emissions from expanded compute.
Perhaps the largest uncertainty is future demand. OpenAI reports 2.5 billion prompts per day, and a Lawrence Berkeley National Laboratory projection suggests that continued growth could, by 2028, push artificial intelligence to consume as much electricity annually as 22 percent of US households. At the same time, signs of a slowdown appeared this summer: GPT-5’s lackluster debut and an MIT finding that 95 percent of businesses see no return on their massive artificial intelligence investments raised questions about whether current data center buildouts will pay off. The core unknown is not just per-query energy but whether adoption will scale to meet industry expectations or whether the boom will prove short-lived.
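To see why demand, rather than per-query energy, is the core unknown, it helps to scale the disclosed figures up to the projection’s benchmark. A hedged back-of-envelope sketch follows: the 0.34 watt-hours and 2.5 billion prompts per day come from the reporting above, while the US household count (~132 million) and average annual consumption (~10,500 kWh) are outside approximations based on US EIA estimates, not figures from the article.

```python
# Rough scale-up of the disclosed per-query figure to OpenAI's reported
# volume, compared against the household benchmark in the LBNL projection.
# 0.34 Wh/query and 2.5 billion prompts/day come from the article; the
# household count and average consumption are outside approximations.

WH_PER_QUERY = 0.34            # Altman's figure for an average ChatGPT query
PROMPTS_PER_DAY = 2.5e9        # OpenAI's reported daily prompt volume
US_HOUSEHOLDS = 132e6          # approximate number of US households (assumption)
KWH_PER_HOUSEHOLD_YR = 10_500  # approximate average annual use (assumption)

chat_twh_per_year = WH_PER_QUERY * PROMPTS_PER_DAY * 365 / 1e12  # Wh -> TWh
benchmark_twh = 0.22 * US_HOUSEHOLDS * KWH_PER_HOUSEHOLD_YR / 1e9  # kWh -> TWh

print(f"Chat queries alone:    ~{chat_twh_per_year:.2f} TWh/yr")
print(f"22% of US households:  ~{benchmark_twh:.0f} TWh/yr")
# Chat queries alone:    ~0.31 TWh/yr
# 22% of US households:  ~305 TWh/yr
```

The roughly thousand-fold gap between today’s chat-only total and the projection’s benchmark underscores the section’s point: the projection turns far less on per-query efficiency than on how far adoption and new, heavier modalities actually scale.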