Technology

Meta and Sama Face Legal Action in Kenya for Alleged Poor Work Conditions

Meta and Sama, its major content moderation subcontractor in Africa, will be sued in Kenya over allegedly hazardous and unjust working conditions if they do not respond to 12 demands on workplace standards placed before them. In a demand letter seen by TechCrunch, Nzili and Sumbi Advocates, the law firm representing Daniel Motaung, a former Sama employee who was fired in 2019 for organizing a strike over poor working conditions and pay, accused the subcontractor of violating various rights, including the health and privacy of Kenyan and international employees.

Motaung was reportedly fired for organizing the strike and attempting to unionize Sama workers. Meta and Sama have been given 21 days (beginning Tuesday, March 29) to respond to the demands or face legal action. The law firm demanded that Meta and Sama comply with all labor, privacy, and health regulations in Kenya; recruit qualified and experienced health professionals; and provide adequate mental health insurance and better pay for the moderators.

“Facebook outsources the majority of this work to companies like Sama, a system that keeps Facebook’s profit margins high at the expense of the health and safety of thousands of moderators around the world. Sama moderators report conditions that are hazardous, degrading, and pose a risk of post-traumatic stress disorder (PTSD),” Motaung’s lawyers claimed.

The impending lawsuit follows a Time report detailing how Sama recruited the moderators under the false pretense of hiring them for call center work. According to the report, the content moderators were recruited from across Africa and only learned the nature of their duties after signing employment contracts and relocating to the company’s offices in Nairobi.

The moderators trawl through posts across social media platforms, including Facebook, to identify and remove content that spreads hatred, disinformation, and violence. Employees are required to follow a number of rules, including not disclosing the nature of their jobs to others. According to the report, content moderators in Africa are the lowest paid in the world. Sama bills itself as an ethical AI company; following the exposé, it raised employee pay.

According to the law firm, Sama failed to provide Motaung and his coworkers with necessary psychosocial support and mental health measures, including “unplanned pauses as needed, particularly following exposure to graphic content.” Sama also measured employee productivity using Meta’s software, which tracked screen use and movement during work hours, while offering, in Sama’s words, “thirty minutes a day with a wellness consultant.” “Sama and Meta failed to adequately prepare our client for the type of work he was about to undertake and its consequences. He recalls the first video he moderated being of a beheading. No psychological support had been provided to him in advance up to that point,” the law firm stated.

Sama denied any wrongdoing in a blog post published following the exposé, saying it is transparent in its hiring process and has a culture that “prioritizes employee health and wellness.” “We recognize that content moderation is a challenging but necessary job in ensuring the internet’s safety for everyone,” Sama said. “That’s why we invest substantially in training, personal development, and wellness initiatives. We take pride in our commitment to being truthful and honest as a global technology company, partner, and employer. It is totally untrue to claim that Sama personnel were hired on the basis of fraudulent information or were given inaccurate information about content moderation duties.”

“I use Facebook, like many Kenyans, and it’s a vital platform to debate the news. But that is precisely why this case is so crucial,” said Mercy Mutemi, who is leading the legal case. “A Facebook that is adequately staffed, and where content moderators, the front-line workers against hate and disinformation, have the support they need to safeguard us all, is critical to the safety and integrity of our democratic process in Kenya. This isn’t a typical labor dispute; Facebook moderators’ working conditions affect all Kenyans.”