California governor signs AI safety law SB 53, mandating transparency and whistleblower protections

Governor Gavin Newsom signed SB 53, requiring frontier AI developers to publicly disclose risk protocols and report critical safety incidents. The law also protects whistleblowers and seeds a public compute consortium to support safe research.

California has enacted a sweeping new framework for overseeing advanced artificial intelligence (AI), with Governor Gavin Newsom signing SB 53 into law. The measure compels major developers to publicly disclose how they plan to mitigate potentially catastrophic risks posed by advanced models, establishes mechanisms for reporting critical safety incidents, and extends whistleblower protections to employees of AI companies. The law also launches CalCompute, a government consortium tasked with building a public computing cluster to support safe, ethical, and sustainable AI research and innovation. Newsom framed the move as a balance between safeguarding communities and fostering innovation.

Authored by state Senator Scott Wiener, SB 53 follows last year's failed bid to pass a stricter, liability-focused bill, SB 1047, which Newsom vetoed. The new law emphasizes transparency over liability and includes civil penalties for noncompliance, enforceable by the state attorney general. Supporters argue the approach targets the most capable developers while sparing startups from disproportionate burden. Sunny Gandhi of Encode AI called it a win for both California and the industry, contending that the framework ensures accountability for the most powerful models without stifling smaller players.

Reactions from industry leaders were mixed but leaned supportive. Anthropic cofounder Jack Clark praised the transparency requirements for frontier developers and said the framework balances public safety with innovation, while noting the importance of eventual federal standards. OpenAI, which did not endorse the bill, said it was pleased California had created a path toward harmonization with the federal government, and Meta called the law a positive step toward balanced regulation. Critics raised concerns about unintended consequences: Andreessen Horowitz's Collin McCune warned the law could entrench incumbents and burden startups with a patchwork of state regimes. Former OpenAI policy research lead Miles Brundage called SB 53 a step forward but argued for stronger minimum risk thresholds, more substantive transparency, and robust third-party evaluations, noting that the law's penalties are weaker than those in the EU's AI Act.

Backers counter that startup fears are overstated. Thomas Woodside of Secure AI Project, a cosponsor, emphasized that the law targets companies training models with compute budgets in the hundreds of millions of dollars and sets reporting requirements for serious incidents, alongside whistleblower protections and basic transparency. He added that several obligations do not apply to firms below a revenue threshold. Although a state statute, SB 53 is likely to have global reach: 32 of the world's top 50 AI companies are based in California. The law's incident reporting to California's Office of Emergency Services, public disclosures, and enforcement by the attorney general position the state to shape oversight standards for OpenAI, Meta, Google DeepMind, Anthropic, and other major players.

Impact Score: 75

Scientists track permafrost thaw from space to guide Arctic planning

Researchers are using radar satellites to map seasonal ground subsidence and infer deep ice content, turning space data into practical guidance for communities and militaries coping with thawing permafrost. Early results in Alaska are informing relocation and infrastructure decisions as warming accelerates risks.
