Roblox enables age verification for chat access amid child safety concerns

Roblox will roll out camera-based age estimation to gate chat between minors and adults; the checks are optional at first and become mandatory globally by January. The system is powered by Persona and uses Artificial Intelligence to estimate age from camera input, with enforcement beginning in Australia, New Zealand, and the Netherlands in early December.

Roblox is introducing camera-based age checks to restrict chat between children and adults as part of a renewed focus on child safety. The company previously added parental controls and is now imposing stricter limits on who can communicate with minors. The new measures will be voluntary at first and then required globally by January, with mandatory checks beginning in Australia, New Zealand, and the Netherlands in the first week of December.

When users opt in or once checks become mandatory, Roblox will prompt them to use their device camera for an age estimation. The process assigns users to age groups and restricts chats to people in similar brackets. Minors will only be permitted to chat with adults after obtaining parental consent. The example provided by Roblox indicates some overlap between adjacent teen age groups, which may allow limited cross-group interactions for older minors.

The age verification system is powered by Persona, the same provider handling age checks for Reddit in the UK, and it relies on Artificial Intelligence to estimate a user’s age from camera input. Reaction to the announcement has been mixed. Some players and parents worry it will hamper coordination in strategy and role-playing games that rely on cross-age communication, while others express privacy concerns but acknowledge the need for a safety mechanism. Roblox framed the change as a necessary step to better protect minors, balancing platform usability with child safety requirements.

Impact Score: 55

Google Vids opens free video generation to all Google users

Google has made Google Vids available to anyone with a Google account, adding free access to video generation with its latest models. The move expands Google’s end-to-end video workflow and increases pressure on rivals that charge for similar tools.

Court warns against chatbot legal advice in Heppner case

A federal court found that chats with a publicly available generative Artificial Intelligence tool were not protected by attorney-client privilege or the work-product doctrine. The ruling highlights litigation risks when executives or employees use chatbots for legal guidance without lawyer supervision.

Newsom orders California to weigh Artificial Intelligence harms in contract rules

Gov. Gavin Newsom has signed an executive order directing California agencies to account for potential Artificial Intelligence harms in state contracting while expanding approved use of generative tools across government. The move follows a dispute involving Anthropic and reflects a broader split between California and the Trump administration on Artificial Intelligence oversight.
