Roblox has rolled out AI-powered facial age estimation to limit chat to users of similar ages, part of a broader set of child safety efforts. The feature came up in a recent interview with The New York Times in which chief executive David Baszucki discussed how the platform plans to protect minors while continuing to evolve its user-generated content environment. Much of the interview focused on how the new safeguards will affect day-to-day interactions for children on the platform.
When asked about future monetization features, including microtransactions, loot boxes, and what the interviewer characterized as “kid gambling,” Baszucki responded, “we would have to do that.” He later described such mechanics as “a brilliant idea if it can be done in an educational way that’s legal,” adding that any approach should avoid getting children financially involved in gambling while potentially mimicking aspects of its mechanics. The comments tie the company’s product roadmap to an ongoing debate over how revenue systems can coexist with child safety controls.
The interview cited research indicating that gambling mechanics in video games are associated with a higher risk of gambling addiction in adulthood, especially when games use real gambling elements such as casino sounds or visuals. It also noted that most loot box systems examined in research include a monetary component, which complicates safety and regulatory concerns. Beyond addiction risk, the piece highlighted broader criticism that microtransactions can be manipulative and potentially predatory, framing the tension Roblox faces as it balances monetization, user experience, and child protection.
