Roblox Enhances Age-Verification System, Partners with International Age Rating Coalition for Safer Gaming Experience
In response to ongoing legal disputes concerning child safety, online gaming platform Roblox has unveiled plans to enhance its age-verification technology for all users. The company aims to collaborate with the International Age Rating Coalition (IARC) to implement age and content ratings for games and applications on its platform by year’s end.
Roblox intends to integrate this age-estimation technology into its communication tools, including voice and text-based chat, by scanning users’ selfies and analyzing facial features to determine age. The system will complement existing measures such as ID age verification and verified parental consent, offering a more reliable estimate of a user’s age than the traditional method of having children enter their birth year during account creation.
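To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a platform might combine such signals to gate chat access. The field names, thresholds, and precedence rules are assumptions for illustration only and do not reflect Roblox’s actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical signals a platform might combine; names and thresholds are
# illustrative assumptions, not Roblox's actual system.
@dataclass
class AgeSignals:
    facial_age_estimate: Optional[int]   # from a selfie-based age-estimation model
    id_verified_age: Optional[int]       # from government-ID verification
    parental_consent: bool               # verified parental consent on file
    self_reported_age: int               # age derived from sign-up birth year

def effective_age(signals: AgeSignals) -> int:
    """Prefer stronger signals over the self-reported birth year."""
    if signals.id_verified_age is not None:
        return signals.id_verified_age
    if signals.facial_age_estimate is not None:
        return signals.facial_age_estimate
    return signals.self_reported_age

def may_use_chat(signals: AgeSignals) -> bool:
    """Gate voice/text chat: adults pass, younger teens need verified parental consent."""
    age = effective_age(signals)
    if age >= 18:
        return True
    return age >= 13 and signals.parental_consent

if __name__ == "__main__":
    teen = AgeSignals(facial_age_estimate=15, id_verified_age=None,
                      parental_consent=True, self_reported_age=21)
    # True: the facial estimate overrides the inflated self-reported birth year
    print(may_use_chat(teen))
```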
Roblox is also developing systems designed to restrict communication between adults and minors on its platform. In addition, the company will replace its current content and maturity labels with those used by international rating agencies, giving users worldwide a unified system.
These updates follow earlier initiatives, introduced in July, aimed at strengthening protections for younger users. Roblox implemented an age-verification system that analyzes video selfies to prevent underage users from accessing certain platform features. The company also restricted users aged 13 to 17 from adding strangers as “trusted connections” unless they have an established real-life relationship.
Roblox has faced sustained criticism over child-safety concerns, with complaints filed by various authorities and lawsuits brought in multiple states. The updates also come as governments worldwide roll out increasingly stringent laws and regulations mandating user age verification, such as the U.K.’s Online Safety Act and Mississippi’s age assurance law. Similar legislation is in progress in other states, including Arizona, Wyoming, South Dakota, and Virginia.
Roblox has invested significantly in safety features over the years, developing tools like Roblox Sentinel, an AI system designed to detect early signs of child endangerment. The platform also offers parental controls, communication restrictions, and technology that flags servers with a high number of rule-breaking users so they can be removed.
Despite these efforts, reports suggest that child predators have still managed to access the platform and target children. A report by The Guardian highlighted that children on Roblox can still encounter inappropriate content and interact with harmful individuals. A popular farming simulator on the platform, Grow a Garden, has also drawn scrutiny because players trade virtual items for real money in violation of the platform’s rules, raising concerns that the game lures in children and pressures them to spend money to keep up with others.
These changes to the ratings system may not completely eliminate such experiences, but they should provide parents with a clearer understanding of the games their children are playing. In a prepared statement, Matt Kaufman, Roblox’s chief safety officer, expressed the company’s commitment to creating a safe platform for users and supporting parents in making informed decisions about their children’s online activities: “We’re excited to partner with IARC and hope it will provide parents globally with more clarity and confidence regarding age-appropriate content.”