A surveillance mandate disguised as child safety: why the GUARD Act won’t keep us safe

The GUARD Act would force many companies offering Artificial Intelligence chatbots to verify users’ ages, bar minors, and impose criminal penalties, but the bill’s age-gating and data rules risk mass surveillance, censorship, and lost access to everyday tools.

A new bill introduced by Sen. Hawley (R-MO), Sen. Blumenthal (D-CT), Sen. Britt (R-AL), Sen. Warner (D-VA), and Sen. Murphy (D-CT) would require Artificial Intelligence chatbots to verify every user’s age, prohibit minors from using those tools, and create criminal penalties for systems that promote or solicit certain harms. While presented as child-safety legislation, the GUARD Act functions primarily as an age-gating and data-collection mandate that could be applied to a broad range of public-facing services, from customer-service bots to search assistants.

The bill would not allow parental consent or appeals: if an age-verification process determines a user is under 18, that user must be locked out entirely. Its ambiguous definition of an Artificial Intelligence “companion” could be read to include general-use large language models like ChatGPT and many other adaptive systems, prompting companies to shut teenagers out of tools they use for homework, customer service, and creative or educational work. By treating all young people the same, whether seven or seventeen, the law would limit confidential access to information and reduce opportunities for autonomy and learning.

To enforce the ban, platforms would have to deploy a “commercially reasonable” age-verification system that gathers identifying information such as government ID, credit records, or biometric data and periodically re-verifies users. That requirement forces the retention or repeated collection of sensitive data. The Electronic Frontier Foundation and other advocates warn that such centralized identity systems become attractive targets for hackers, destroy online anonymity, disproportionately harm vulnerable groups, and favor large incumbents that can afford compliance costs.

Definitions in the bill are broad: an “Artificial Intelligence chatbot” could cover any service producing adaptive or context-responsive outputs, and an “Artificial Intelligence companion” could include any system that simulates interpersonal interaction. Combined with steep penalties (up to $100,000 per violation, enforceable by both federal and state attorneys general), the likely result is widespread censorship, blanket bans on users under 18, or the construction of mass-surveillance systems as a condition of access. Lawmakers should reject the GUARD Act and instead pursue privacy-first policies that improve transparency and accountability and create safer options without building invasive identity infrastructure.


