According to Bloomberg, a TikTok moderator has sued the social media platform and its parent company ByteDance, alleging distress caused by violent videos. In a proposed class-action lawsuit, moderator Candie Frazier claims she screened videos depicting violence, school shootings, deadly falls, and even cannibalism.
“Plaintiff has difficulty sleeping and has awful dreams when she does sleep,” according to the lawsuit. TikTok reportedly requires moderators to work 12-hour shifts with only a one-hour lunch and two 15-minute breaks, compounding the problem.
According to the complaint, “because of the overwhelming volume of content, content moderators are limited to 25 seconds per video and must concurrently view three to ten videos.”
In collaboration with other social media firms such as Facebook and YouTube, TikTok developed guidelines to help moderators cope with child abuse and other horrific imagery. The recommendations call for companies to limit moderator shifts to four hours and provide psychological support. According to the lawsuit, however, TikTok allegedly failed to follow these guidelines.
Content moderators bear the weight of the gruesome and horrific images posted to social media, sparing ordinary users that anguish. One company that supplies content moderators to large tech firms even stated in a consent form that the work could cause post-traumatic stress disorder (PTSD).
Social media companies, however, have been criticized by their moderators and others for paying too little attention to these psychological risks and for failing to offer adequate mental health care; a similar case was brought against Facebook in 2018. Frazier is pursuing a class-action lawsuit on behalf of other TikTok screeners, seeking compensation for the emotional harm she says the violent videos caused, as well as a court order establishing a medical fund for moderators.