Pinterest just published its latest Transparency Report, which outlines all of the content removals and other moderation actions it took throughout the first half of 2022. There are some interesting shifts within these figures, especially in one of the featured charts, which tracks the number of Pins removed in each category over the past year.

You’re Outta Here! 

Some of these shifts are pretty erratic, specifically:

  • Removals for adult sexual services were way up in Q1 2022, then normalized in Q2. Why? Pinterest says this was due to the ‘hybrid deactivation of a small handful of images, which account for almost two-thirds of Pins deactivated in Q1 for violating this policy’.
  • Child sexual exploitation removals were way up in Q2 2022, which Pinterest attributes to an update in its detection systems.
  • Conspiracy theory removals are way down, following a mass clean-up in 2021.
  • Dangerous goods removals are also way down, also following a ‘sweeping clean-up’ in 2021.
  • Graphic violence and threat removals are up this year, ‘in part because of the content related to the war in Ukraine’.
  • Medical misinformation removals were way up in Q2 2022.
  • Self-injury and harmful behavior removals are way, way up as a result of Pinterest’s ongoing work to improve its detection and reporting processes.
  • Spam and harassment removals remained relatively steady, with seasonal peaks.

As you can see, these numbers tend to fluctuate a lot based on different updates and approaches. What’s interesting to consider is what this means for overall Pin activity, and how the platform is working to safeguard its users. Does this recent data suggest that there has long been more child exploitation and self-injury content in the app, which Pinterest has only recently been able to detect? It’s good that Pinterest is able to improve and refine its systems, but it also means that a lot of this material has long been active on the platform.

Pinterest claims that the majority of violative material is removed before anyone even sees it, though it clearly needs to keep improving on this front, as at least some of the worst kinds of content had been viewable in the app for some time before these more recent updates. Indeed, back in May, DailyDot reported that users had discovered various child grooming accounts on Pinterest, which drew the platform some extra scrutiny on this element. Pinterest also recently issued an apology to the family of Molly Russell, a 14-year-old schoolgirl who took her own life after viewing self-harm content in the app.

The Wrap

Again, it’s a positive that Pinterest has responded to such cases and upped its enforcement in these areas, but it could also suggest that there are other areas where Pinterest isn’t as stringent, simply because they haven’t attracted media attention as yet. That’s speculative, of course. Media pressure does seem to have prompted Pinterest to act on certain elements, but it’s impossible to know the extent of each problem without additional oversight. Pinterest is making progress, no doubt, but it still has a bit of work to do.

Sources

http://bit.ly/3Onb9c1