Following its first-ever Youth Safety and Well-Being Summit, held last month in Washington DC, Meta has called for global cooperation among governments to establish new, definitive requirements around key elements of online child safety, including provisions for access and detection, as well as rules around what is and isn’t acceptable content, particularly concerning social apps. 

Meta for Kids

Meta’s Youth Safety Summit brought together mental health educators, experts, parents, policy writers, and researchers, who held a series of discussions on the key issues in child safety online, and how best to address its evolving requirements.

Various reports have indicated the depth of the problem, from the mental health impacts of negative self-comparison on Instagram to kids dying while attempting harmful challenges on TikTok. And while social media apps do have age requirements, along with a variety of tools designed to block underage users from signing up and being exposed to inappropriate material, most of these safeguards are easily bypassed. With more kids now growing up online, they’re becoming increasingly savvy at evading them, more so than their parents might suspect.

Despite these known shortcomings, more advanced systems are already in play, including facial recognition access gating, along with age-estimation software, which, as you might guess, estimates the age of the account holder based on a range of signals. Instagram is already working with third-party platforms on the latter, and Meta notes that it has implemented a range of extra measures to detect and stop kids from accessing its apps. For the most part, Meta just doesn’t want to go it alone.
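
To make the idea concrete, here’s a minimal sketch of how an age-estimation gate might combine a user’s self-declared age with a third-party estimate. Everything here is hypothetical: the AgeEstimate fields, thresholds, and gate_access logic are illustrative assumptions, not Meta’s or Instagram’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical result returned by a third-party age-estimation
# service. Field names and values are illustrative only.
@dataclass
class AgeEstimate:
    estimated_age: float   # model's best guess of the user's age
    confidence: float      # 0.0 - 1.0 certainty in that estimate

MIN_AGE = 13               # common minimum age for social apps
CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff for trusting the estimate

def gate_access(declared_age: int, estimate: AgeEstimate) -> str:
    """Combine the self-declared age with an independent estimate.

    Returns one of: "allow", "deny", "manual_review".
    """
    if declared_age < MIN_AGE:
        return "deny"            # user self-reports as underage
    if estimate.confidence < CONFIDENCE_THRESHOLD:
        return "manual_review"   # estimate too weak to trust either way
    if estimate.estimated_age < MIN_AGE:
        return "deny"            # confident signal contradicts declared age
    return "allow"

# Example: a user claims to be 16, but the estimator confidently says ~11.
print(gate_access(16, AgeEstimate(estimated_age=11.2, confidence=0.93)))  # deny
```

The point of such a design is that the declared age alone is never trusted: it only grants access when an independent signal agrees, which is exactly the kind of layered check the regulation Meta is calling for might standardize.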

Meta has taken a similar approach to content regulation, implementing its own external Oversight Board to provide critical feedback on its internal decisions, while also calling on governments to take note of this approach and establish more definitive rules for all online providers. This would take some weight off Meta’s shoulders, reducing scrutiny of the company alone, while also establishing universal requirements that could improve safety across all platforms.

Meta specifically calls for regulation to address three key elements: 

  1. How to verify age: so that young children can’t access apps not made for them and that teens can have consistent age-appropriate experiences.

  2. How to provide age-appropriate experiences: so that teens can expect similarly safe experiences across all apps that are tailored to their age and life stage.

  3. How to build parental controls: so that parents and guardians have the tools to navigate online experiences together with their teens.

The Wrap

Meta notes that it will continue to develop its own approaches, but it would prefer to see more centralized, definitive regulations to which all platforms have to adhere. Given the potential for harm online, this push makes sense. It’ll be interesting to see whether this becomes a bigger talking point among UN member states over the coming months. 

Sources 

https://bit.ly/3Xm6V7W