Roblox enables age verification for chat access amid child safety concerns

Roblox will roll out camera-based age estimation to gate chat between minors and adults, initially optional and becoming mandatory globally by January. The checks are powered by identity provider Persona, use artificial intelligence to estimate age from camera input, and will begin in Australia, New Zealand, and the Netherlands in early December.

Roblox is introducing camera-based age checks to restrict chat interactions between children and adults as part of a renewed focus on child safety. The company previously added parental controls and is now moving to stricter limits on who can communicate with minors. The new measures will be voluntary at first and then required globally by January, with mandatory checks starting in Australia, New Zealand, and the Netherlands in the first week of December.

When users opt in or once checks become mandatory, Roblox will prompt them to use their device camera for an age estimation. The process assigns users to age groups and restricts chats to people in similar brackets. Minors will only be permitted to chat with adults after obtaining parental consent. The example provided by Roblox indicates some overlap between adjacent teen age groups, which may allow limited cross-group interactions for older minors.

The age verification system is powered by Persona, the same provider handling age checks for Reddit in the UK, and it relies on artificial intelligence to estimate a user's age from camera input. Reaction to the announcement has been mixed. Some players and parents worry the restrictions will hamper coordination in strategy and role-playing games that rely on cross-age communication, while others raise privacy concerns but acknowledge the need for a safety mechanism. Roblox framed the change as a necessary step to better protect minors, balancing platform usability with child safety requirements.


UK MPs open inquiry into artificial intelligence and edtech in education

UK MPs have launched a cross-party inquiry into how artificial intelligence and education technology are reshaping learning across early years, schools, colleges and universities, and how government should balance innovation with safeguards. The education committee will examine opportunities to improve teaching and workload alongside risks around inequality, privacy, safeguarding and assessment.

Most UK firms see artificial intelligence training gap as shadow tool use grows

New research finds that 6 in 10 UK businesses say employees lack comprehensive artificial intelligence training, even as shadow use of unapproved tools becomes widespread and investment surges. Executives warn that without stronger skills, governance and strategy, many organisations risk missing out on expected artificial intelligence returns.

COSO issues internal control roadmap for governing generative artificial intelligence

COSO has released governance guidance that applies its Internal Control-Integrated Framework to generative artificial intelligence, offering audit-ready control structures and implementation tools for organizations. The publication details capability-based risk mapping, aligned controls, and practical templates to help institutions manage emerging technology risks.
