Roblox, a gaming app used by nearly half of all U.S. children under 16, has rolled out a mandatory new safety feature designed to stop children from communicating with adults on the platform.
Starting January 7, players in the U.S. were required to undergo facial age estimation in the app to access the chat feature, although age verification remains optional for playing the games themselves.
Users in the U.K., Australia, New Zealand, and the Netherlands were already required to complete an age check to chat with other users; that requirement is now expanding to the U.S. and beyond.
The verification is being processed by a third-party vendor, Persona. Once the age check is processed, Roblox says it will delete any images or videos of users.
If the age-check process incorrectly estimates a user's age, the decision can be appealed and the user's age verified through alternative methods. Users 13 and older may also opt for ID-based checks.
Once users complete the age check, they are assigned to one of six age groups (under 9, 9-12, 13-15, 16-17, 18-20, and 21+). Users can only communicate with players in their own age group and in the groups directly above and below it. For example, a 9-year-old cannot chat with users older than 15, and a 16-year-old can only chat with those ages 13 to 20.
The feature is designed to prevent children younger than 16 from communicating with adults. About 42% of Roblox users are younger than 13.
“As the first large online gaming platform to require facial age checks for users of all ages to access chat, this implementation is our next step toward what we believe will be the gold standard for communication safety,” wrote Matt Kaufman, Roblox’s chief safety officer, and Rajiv Bhatia, its head of user and discovery product, in a blog post.
Parental consent is still required for users younger than 9 to access chat features, while age-checked users 13 and older can chat with people they know beyond their immediate age group via the Trusted Connections feature.
“Leveraging multiple signals, [Roblox is] constantly evaluating user behavior to determine if someone is significantly older or younger than expected,” the company execs continued. “In these situations, we will begin asking users to repeat the age-check process.”
The face scan is launching as the company faces increased scrutiny over child safety on the app. Attorneys general around the country are investigating Roblox, and nearly 80 active lawsuits accuse Roblox of enabling child exploitation, with some parents alleging their children encountered predators on the app.
Source: https://www.fastcompany.com/91472515/roblox-gaming-child-safety-adults