Meta has announced a new initiative to help young people stop their intimate photos from being spread online, with both Facebook and Instagram joining the ‘Take It Down’ program – a new process created by the National Center for Missing and Exploited Children (NCMEC) that enables young people to safely detect and act on images of themselves on the web.

Shielded

Take It Down enables users to create digital signatures of their images, which can then be used to search for copies online. As per Meta:

“People can go to TakeItDown.NCMEC.org and follow the instructions to submit a case that will proactively search for their intimate images on participating apps. Take It Down assigns a unique hash value – a numerical code – to their image or video privately and directly from their own device. Once they submit the hash to NCMEC, companies like ours can use those hashes to find any copies of the image, take them down, and prevent the content from being posted on our apps in the future.”
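The matching process described above can be sketched in simplified form. This is a minimal illustration only: it uses an exact SHA-256 digest, whereas a real system of this kind would typically rely on a robust perceptual hash so that resized or re-encoded copies of an image still match. The function names and the in-memory hash list here are hypothetical, invented for the example.

```python
import hashlib

def hash_media(data: bytes) -> str:
    # Illustrative stand-in: an exact SHA-256 digest of the raw bytes.
    # A production system would likely use a perceptual hash instead,
    # so near-duplicate copies (recompressed, resized) still match.
    return hashlib.sha256(data).hexdigest()

# Hypothetical list of hashes submitted via a reporting service.
# Only the hash leaves the user's device – never the image itself.
reported_hashes = {hash_media(b"example-image-bytes")}

def should_block(upload: bytes) -> bool:
    # A platform checks each upload's hash against the reported list
    # and blocks the content on a match.
    return hash_media(upload) in reported_hashes
```

The key privacy property, as Meta notes, is that hashing happens on the user’s own device, so participating platforms only ever receive the numerical fingerprint, not the image.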

Meta says that the program will allow both young people and parents to act on concerns, providing more reassurance and safety, without compromising privacy by asking them to upload copies of their images, which could cause further distress.

Meta has been working on a version of this program for the last two years, launching an initial version of the detection system for European users back in 2021. Meta then launched the first stage of the program with the NCMEC last November, ahead of the school holidays, with this new announcement formalizing the partnership and expanding it to more users.

This marks the latest in Meta’s ever-expanding range of tools designed to protect young users, which also includes defaulting youngsters into more stringent privacy settings, limiting their capacity to make contact with ‘suspicious’ adults. Of course, kids these days are increasingly tech-savvy, and many circumvent these precautions. There are also extra parental supervision and control options, but many people never switch from the defaults, despite their availability.

Addressing the distribution of intimate images is a key concern for Meta specifically, with research showing that the vast majority of online child exploitation reports shared by the NCMEC in 2020 were found on Facebook.

The Wrap

Meta has continued to develop its systems to improve on this front, but its most recent Community Standards Enforcement Report did show an uptick in ‘child sexual exploitation’ removals, which Meta attributes to improved detection and ‘recovery of compromised accounts sharing violating content’. Whatever the case, numbers show that this remains a significant concern that Meta needs to address, which is why it’s good to see it teaming up with the NCMEC on this new project.

Sources

http://bit.ly/3ZgR3EX