Generative Artificial Intelligence's Environmental and Societal Impacts Highlighted in GAO Report

A GAO assessment reveals significant environmental and human effects of generative Artificial Intelligence, urging policymakers to address resource use, labor changes, and risks linked to these rapidly advancing technologies.

The U.S. Government Accountability Office (GAO) has released a comprehensive report evaluating the far-reaching effects of generative Artificial Intelligence on environmental resources and human society. While generative Artificial Intelligence promises transformative productivity and innovation across multiple industries—ranging from enhanced customer service automation to advanced content creation—the technology relies heavily on substantial energy and water inputs. Despite its widespread adoption, disclosure and monitoring around generative Artificial Intelligence’s electricity and water use remain limited, making it difficult to fully gauge its environmental footprint.

The report highlights that estimates of energy use have centered on the power consumed during the training of large generative Artificial Intelligence models, and the resulting carbon emissions. The generative Artificial Intelligence boom is a major driver of growing demand for data centers, which the GAO notes could account for as much as 6% of U.S. electricity consumption by 2026, up from 4% in 2022. However, concrete figures for how much of this usage directly results from generative Artificial Intelligence remain elusive, as companies frequently do not release granular data—particularly regarding water consumption for cooling systems.

In addition to environmental risks, generative Artificial Intelligence presents several human-scale challenges. These include job displacement, the proliferation of misinformation (such as deepfakes), increased cybersecurity concerns, and potential threats to personal safety. The GAO identified five categories of human effects, emphasizing difficulties in providing definitive risk assessments due to the technology’s rapid evolution and the lack of full transparency from private developers. To address these intertwined challenges, the report outlines policy options: maintaining current practices; improving industry data collection and disclosure; encouraging innovation for more efficient algorithms and hardware; promoting the adoption of risk management frameworks; and sharing best practices or developing standards. The GAO advocates for a combination of these actions by legislators, regulators, industry, and research institutions to better understand and balance the benefits and risks of generative Artificial Intelligence technologies as development accelerates.


UK MPs open inquiry into artificial intelligence and edtech in education

UK MPs have launched a cross-party inquiry into how artificial intelligence and education technology are reshaping learning across early years, schools, colleges and universities, and how government should balance innovation with safeguards. The Education Committee will examine opportunities to improve teaching and reduce workload, alongside risks around inequality, privacy, safeguarding and assessment.

Most UK firms see Artificial Intelligence training gap as shadow tool use grows

New research finds that 6 in 10 UK businesses say employees lack comprehensive Artificial Intelligence training, even as shadow use of unapproved tools becomes widespread and investment surges. Executives warn that without stronger skills, governance and strategy, many organisations risk missing out on expected Artificial Intelligence returns.

COSO issues internal control roadmap for governing generative artificial intelligence

COSO has released governance guidance that applies its Internal Control-Integrated Framework to generative artificial intelligence, offering audit-ready control structures and implementation tools for organizations. The publication details capability-based risk mapping, aligned controls, and practical templates to help institutions manage emerging technology risks.
