Meta is looking to improve its child protection measures in VR by prompting all users to confirm their age on its Meta Quest platform.

All Quest users will soon be prompted to confirm their age, either via credit card or government ID for those over 13, or through confirmation from a parent for younger users.

As explained by Meta:

“Whether you’re trying to survive a zombie apocalypse, catching a concert with friends or visiting the International Space Station with your family, there’s something for everyone on the Meta Quest platform. Understanding the ages of people on Meta Quest helps both us and developers provide the right experience, settings, and protections for teens and preteens.”

Meta has already implemented various measures to protect younger users, including defaulting those under 18 to private profiles and providing parental supervision tools for teens aged 13-17.

Users aged between 10 and 13 are only able to use a “preteen account”, which is managed by their parents:

“Preteen profiles default to private, and the activity, active status, and app-in-use are likewise set to private, with parents given control over these settings. Additionally, parents control whether their preteen can download or use an app, and parents can block access to specific apps at any time.”

This new push will provide additional assurance about user ages, with users required to confirm their age within 30 days or face restrictions.

It’s an important update, especially given the potential harms of more immersive VR experiences. Indeed, Meta has already been forced to add personal boundaries for VR avatars after reports of sexual harassment, and even “virtual rape,” in its VR environments.

What’s more, we don’t have enough data yet on the extent of potential harms that could be caused by more realistic simulated environments.

In this respect, the evolution of VR mirrors that of social media, where most of the impacts of social media engagement were only identified in retrospect, and in many cases too late.

Over time, more and more studies have shown that social media interaction can have harmful impacts on youngsters, and can be a net negative for development, mental health, and more. Factoring this in, we arguably should never have allowed young kids to use social media apps, with the exposure risk alone posing significant dangers.

VR now poses similar risks, and while it’s good to see Meta looking to implement more measures, these do, for the most part, feel like surface-level protections that could easily be subverted by teens looking to access inappropriate material.