Unveiling the true scale of artificial intelligence's energy consumption

Investigating the electricity required to power artificial intelligence reveals staggering implications for emissions, data centers, and the future of machine learning.

The recent investigation into the energy and emissions footprint of artificial intelligence uncovers a complex and much-debated landscape. Journalists Casey Crownhart and a co-author spent months evaluating the electricity used in common artificial intelligence tasks such as chatbot conversations, image generation, and especially video creation. Their work reveals not only the surprising magnitude of energy required for each interaction, but also the uncertainty and lack of transparency surrounding industry data. High-profile leaders and policymakers are taking notice, as artificial intelligence's demand for electricity promises to reshape national and global power grids, potentially requiring enough energy within three years to supply more than a fifth of US households.

While the team measured current usage, it's clear these numbers represent only the beginning. Artificial intelligence is still in its early stages, with innovation rapidly pushing toward more complex reasoning models, ever-present hardware assistants, and agents acting on users' behalf. Each advancement will likely drive energy consumption higher. There is hope that the underlying systems—models, chips, and cooling technologies—could become more efficient, mitigating future impacts. Even so, the data highlights troubling trends: video generation, in particular, was found to use orders of magnitude more power than chatbot queries, with a single five-second artificial intelligence video requiring enough energy to run a microwave for over an hour. Yet companies like Google and OpenAI declined to disclose specific energy consumption figures for their flagship artificial intelligence video models, deepening the opacity around the sector's true impact.

The discussion shifts away from individual responsibility toward broader questions about infrastructure and regulatory choices. The personal footprint from asking an artificial intelligence chatbot for advice might be negligible. However, the aggregate effect—draining water sources for cooling data centers, increasing reliance on fossil fuels despite promises of clean energy, and wrapping energy policy around the needs of a few tech giants—raises urgent environmental and ethical questions. Researchers emphasize that transparency from tech firms is sorely lacking, leaving the public ill-equipped to weigh the real costs of artificial intelligence adoption or to hold companies accountable. As artificial intelligence expands, attention must turn to policy, resource impacts, and systemic change rather than just user behavior.

Impact Score: 85

Samsung to supply half of NVIDIA’s SOCAMM2 modules in 2026

Hankyung reports that Samsung Electronics has secured a deal to supply half of NVIDIA's SOCAMM2 modules in 2026 for the Vera Rubin Superchip, which pairs two 'Rubin' artificial intelligence GPUs with one 'Vera' CPU and moves from hardwired memory to DDR5 SOCAMM2 modules.

NVIDIA announces CUDA Tile in CUDA 13.1

CUDA 13.1 introduces CUDA Tile, a virtual instruction set for tile-based parallel programming that raises the programming abstraction above SIMT and abstracts tensor cores to support current and future tensor core architectures. The change targets workloads, including artificial intelligence, where tensors are a fundamental data type.
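For context on what "raising the abstraction above SIMT" means, the sketch below shows the classic hand-written shared-memory tiling pattern in plain CUDA C++ — the per-thread index bookkeeping and synchronization that a tile-level abstraction is positioned to absorb. This is a generic illustration, not CUDA Tile API code; the kernel name and tile size are arbitrary, and it assumes N is a multiple of the tile size.

```cuda
#define TILE 16  // arbitrary tile edge length for this sketch

// C = A * B for square N x N matrices, with each thread block
// staging TILE x TILE tiles of A and B through shared memory.
__global__ void tiledMatMul(const float* A, const float* B, float* C, int N) {
    __shared__ float As[TILE][TILE];   // per-block staging tile of A
    __shared__ float Bs[TILE][TILE];   // per-block staging tile of B

    int row = blockIdx.y * TILE + threadIdx.y;
    int col = blockIdx.x * TILE + threadIdx.x;
    float acc = 0.0f;

    // March paired tiles of A and B across the shared dimension.
    for (int t = 0; t < N / TILE; ++t) {
        As[threadIdx.y][threadIdx.x] = A[row * N + t * TILE + threadIdx.x];
        Bs[threadIdx.y][threadIdx.x] = B[(t * TILE + threadIdx.y) * N + col];
        __syncthreads();               // tile fully loaded before use

        for (int k = 0; k < TILE; ++k)
            acc += As[threadIdx.y][k] * Bs[k][threadIdx.x];
        __syncthreads();               // finish reading before next load
    }
    C[row * N + col] = acc;
}
```

Under a SIMT model every one of these loads, barriers, and index calculations is the programmer's responsibility; a tile-level instruction set instead lets the compiler map whole-tile operations onto whatever tensor core hardware the target GPU provides.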
