Google still not giving full picture on artificial intelligence energy use

Google reported a per-query energy estimate for Gemini, but the single figure covers only text prompts and omits image and video generation, total query volumes, and the many other embedded uses of AI.

Google recently published a report estimating that a typical text query to its Gemini app uses about 0.24 watt-hours of electricity, a median value the company compared to running a microwave for one second. The author welcomes the transparency but warns that the single number is easily misread as evidence that AI energy demand is negligible. The figure applies only to text prompts, and because it is a median rather than a mean, it reflects the midpoint of the query distribution; a minority of long or complex queries that consume far more energy would pull an average well above it.
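As a quick sanity check on the microwave comparison (a sketch, not from Google's report; the 900-watt microwave rating is an assumed typical value):

```python
# Sanity check of Google's "microwave for one second" comparison.
# Assumption (not from the report): a typical microwave draws about 900 W.
QUERY_WH = 0.24      # Google's median watt-hours per Gemini text query
MICROWAVE_W = 900    # assumed microwave power draw, watts

seconds = QUERY_WH * 3600 / MICROWAVE_W  # convert Wh to watt-seconds, divide by watts
print(f"{seconds:.2f} s of microwave time")  # ~0.96 s, consistent with "one second"
```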

The article outlines several important gaps in Google’s disclosure. The figure excludes image and video generation, which prior reporting suggests typically consume more electricity, and Jeff Dean, Google’s chief scientist, said the company does not currently plan a comparable analysis for those modalities. A median also conceals how longer prompts, longer responses, and reasoning models push energy use higher. Google declined to share the total number of Gemini queries, saying it was not comfortable revealing that figure and pointing instead to 450 million monthly active users. By contrast, OpenAI has reported 2.5 billion daily ChatGPT queries and an average of 0.34 watt-hours per query; the article calculates that this level of demand amounts to over 300 gigawatt-hours annually, comparable to powering nearly 30,000 US homes.
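The article’s back-of-envelope arithmetic can be reproduced directly. The query volume and per-query energy below are the figures OpenAI reported; the per-household consumption is an assumed US average of roughly 10,500 kWh per year, not a number from the article:

```python
# Reproducing the article's ChatGPT energy estimate.
DAILY_QUERIES = 2.5e9        # reported ChatGPT queries per day
WH_PER_QUERY = 0.34          # reported average watt-hours per query
HOME_KWH_PER_YEAR = 10_500   # assumed average US household consumption, kWh/year

annual_gwh = DAILY_QUERIES * WH_PER_QUERY * 365 / 1e9   # Wh/day -> GWh/year
homes = annual_gwh * 1e6 / HOME_KWH_PER_YEAR            # GWh -> kWh, then per home
print(f"{annual_gwh:.0f} GWh/year ≈ {homes:,.0f} US homes")  # ~310 GWh ≈ ~29,500 homes
```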

The piece emphasizes that AI is embedded across services and devices beyond chatbots, from search summaries to email drafting, so even conscientious users will find it difficult to tally their personal energy impact. The author calls not for personal guilt but for broader reporting and accountability from major providers. The article also highlights grid-level implications cited in other reporting, including that over two gigawatts of natural gas capacity may be needed in Louisiana for a single Meta data center, and a projection that by 2028 AI could account for 326 terawatt-hours of US electricity demand and more than 100 million metric tons of carbon dioxide. The article mentions Google Cloud’s AI spending on the PJM grid but does not state the amount.
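As a rough consistency check on that 2028 projection (our own division of the two cited figures, not a calculation from the article):

```python
# Implied carbon intensity of the 2028 projection (a cross-check, not a source figure).
AI_TWH = 326          # projected US AI electricity demand, terawatt-hours
CO2_TONNES = 100e6    # projected emissions, metric tons CO2 ("more than")

g_per_kwh = (CO2_TONNES * 1e6) / (AI_TWH * 1e9)  # grams CO2 per kilowatt-hour
print(f"~{g_per_kwh:.0f} g CO2/kWh")  # ~307 g/kWh, plausible for a US grid mix
```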


