Estimating the energy use and emissions burden of artificial intelligence (AI) is fraught with uncertainty, largely because of a lack of standardized measurement methods, limited transparency from companies, and the complexity of modern power grids. Unlike the fuel efficiency of vehicles or appliances, AI energy consumption has no universally accepted reporting framework or regulatory mandate, and most major firms keep these figures proprietary. This opacity stands in stark contrast to the billions of dollars being invested in new energy infrastructure, much of it geared toward supporting AI development and deployment.
To work around these barriers, researchers and journalists have turned to open-source models as imperfect but best-available proxies for measuring true AI energy demand. Energy consumption is measured in two main stages, training and inference, with inference now representing the largest share of ongoing usage as generative models proliferate. Teams such as ML.Energy and AI Energy Score have developed tools and methodologies that measure the power drawn by GPUs during processing and then extrapolate overall server demand, factoring in ancillary components such as CPUs and cooling systems. Experiments on popular models, including Meta's Llama for text, Stable Diffusion for images, and CogVideoX for video, highlight how energy requirements vary with model size, number of inference steps, and hardware used. For image and video generation, additional assumptions, such as the "50 percent GPU" rule (treating GPUs as roughly half of total server power), help fill research gaps, though they may not hold across all types of models.
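The measure-then-extrapolate approach above can be sketched in a few lines. This is a minimal illustration, not any team's published methodology: the function name, the example wattage and duration, and the interpretation of the "50 percent GPU" rule (GPUs assumed to draw roughly half of total server power) are all assumptions made here for clarity.

```python
# Illustrative sketch: estimate whole-server energy for one inference request
# from measured GPU power, then scale up for non-GPU components.
# All numeric inputs below are assumed, not measured values.

def inference_energy_wh(gpu_power_w: float, seconds: float,
                        gpu_share: float = 0.5) -> float:
    """Estimate whole-server energy (Wh) for one inference request.

    gpu_power_w : average GPU power draw during the request (watts)
    seconds     : wall-clock duration of the request
    gpu_share   : assumed fraction of server power drawn by GPUs
                  (0.5 reflects the '50 percent GPU' rule)
    """
    gpu_energy_wh = gpu_power_w * seconds / 3600  # watt-seconds -> watt-hours
    # Scale GPU energy up to cover CPUs, cooling, and other server overhead.
    return gpu_energy_wh / gpu_share

# Hypothetical request: a 300 W GPU busy for 6 seconds.
print(inference_energy_wh(300, 6))  # -> 1.0 Wh for the whole server
```

Varying `gpu_share` shows how sensitive the final estimate is to that one assumption, which is why the text cautions that such rules may not hold for every model type.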
Translating electricity usage into emissions introduces further complexity, since the carbon intensity of the power drawn by data centers varies not only nationally but regionally, and even by time of day. Detailed, region-specific emissions data from platforms like Electricity Maps provide a more nuanced picture, incorporating life-cycle emissions and accounting for energy imports and exports between U.S. grid balancing authorities such as ERCOT, PJM, and CAISO. These distinctions matter because data centers are disproportionately concentrated in a handful of regions. Contextual metrics, like the minutes of microwave use or e-bike miles a given amount of energy could power, help demystify the raw numbers, while comparisons to miles driven in gasoline vehicles frame the broader climate impact. Despite deep analysis and clear evidence of growing energy strain, the lack of rigorous reporting standards, limited access to proprietary data from companies like OpenAI, Microsoft, and Google, and the absence of granular industry-wide emissions tracking make accurate estimation at scale elusive. As demand for AI grows, so does the urgency for transparency, standardized tracking, and regulation to illuminate its substantial and fast-rising carbon footprint.
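The energy-to-emissions conversion and the contextual equivalents described above reduce to simple arithmetic. The sketch below assumes illustrative values throughout: the carbon intensities, the microwave wattage, and the workload size are placeholders, not figures from any grid or study.

```python
# Illustrative sketch: convert electricity use into emissions and an
# everyday equivalent. All constants here are assumed for illustration.

def emissions_g(energy_kwh: float, carbon_intensity_g_per_kwh: float) -> float:
    """Grams of CO2-equivalent for a given energy draw on a given grid."""
    return energy_kwh * carbon_intensity_g_per_kwh

def microwave_minutes(energy_kwh: float, microwave_w: float = 1000.0) -> float:
    """Minutes a typical (assumed 1000 W) microwave could run on that energy."""
    return energy_kwh * 1000 / microwave_w * 60  # kWh -> Wh -> minutes

energy = 0.5  # kWh, a hypothetical workload
# The same workload on two grids with different assumed carbon intensities:
print(emissions_g(energy, 400))   # dirtier grid -> 200.0 g CO2e
print(emissions_g(energy, 100))   # cleaner grid -> 50.0 g CO2e
print(microwave_minutes(energy))  # -> 30.0 minutes of microwave use
```

The two grid scenarios make the article's point concrete: identical electricity use can carry a fourfold difference in emissions depending on where, and when, a data center draws its power.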