Here’s something new: a TikTok moderator, one of its own, has actually sued TikTok and parent company ByteDance over trauma caused by graphic videos. Bloomberg catalogued the details in a report. Content moderator Candie Frazier filed a class-action lawsuit after having ‘screened’ countless videos showing violence, school shootings, fatal falls, and even cannibalism. Yikes. The lawsuit states that the “Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares.”

Adding to the issue, TikTok allegedly requires its moderators to work 12-hour shifts, with only an hour’s lunch and two 15-minute breaks. Yeesh, where I’m from, even regular security personnel get better rest periods than that. A line from the complaint even reads: “Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time.”

Along with other major social platforms, including Facebook, Instagram, and YouTube, TikTok developed guidelines to help moderators cope with child abuse and other traumatic and potentially very disturbing images. Among the suggestions are that companies limit moderator shifts to four hours and provide psychological support. According to the lawsuit, TikTok allegedly failed to implement said guidelines.

From any angle you look at it, content moderators are always more at risk and have more on their plate to deal with, given that it’s technically their job to take the brunt of graphic and traumatic images and clips so that users don’t have to be scarred by them. One company that outsources content moderators for large tech firms even acknowledged in a consent form that the job can cause post-traumatic stress disorder (PTSD). In line with this, social media companies have indeed been criticized by their moderators for not paying enough, given the likely exposure to graphic violence and other psychological hazards. Moderators also say there isn’t enough mental health support, or at least compensation from their employers, to help mitigate the damage already done. A similar lawsuit was filed against Facebook back in 2018.

What this instance depicts is that for all the good and success a company or social platform exhibits, there are always a few hidden dark blotches that show the other half of things. Internally, there may be numerous struggles and conflicts that, as this scenario revealed, put people in precarious situations: job descriptions that expose them to potentially harmful material without commensurate support or compensation.

Furthermore, it also shows us just how deep the fangs of influence, or at least ‘too much’ of it, can go. TikTok has already been criticized for having birthed several harmful challenge trends, and the internet remains largely unregulated, meaning that outside of individual sites’ own protocols, nothing else really protects us from all the harmful stuff floating around online.

The Wrap

Hopefully, this case wakes TikTok up before any further damage is done, and no other cases like it ever happen. The same applies to all social media sites and platforms: above all else, the safety of people should always come first, visitors and employees alike.



Sources

https://tcrn.ch/3JraPWG