On Tuesday, Facebook’s Oversight Board weighed in on a case of misinformation in war-torn Ethiopia, warning the company about the dangers of allowing hate speech and unverified claims to circulate freely in crisis zones. The board examined a Facebook post in Amharic from an Ethiopian user who claimed that the Tigray People’s Liberation Front (TPLF), aided by Tigrayan civilians, was responsible for murder, rape, and theft in Raya Kobo and other towns in the Amhara region.
“While the user says his sources include earlier unnamed reports and people on the ground, he does not provide even circumstantial evidence to support his statements,” the Oversight Board said in its analysis. “Rumors alleging an ethnic group’s complicity in mass atrocities, such as those found in this post, are dangerous and significantly increase the risk of imminent violence.”
The post was first flagged by Facebook’s automated content moderation systems and later removed after the platform’s Amharic-language content review team found that it violated Facebook’s hate speech policies. When the user appealed and the case escalated to the Oversight Board, Facebook reversed its decision and reinstated the content.
The Oversight Board overturned Facebook’s decision to reinstate the post, finding that it breached the platform’s rules against violence and incitement rather than its hate speech standards. In its ruling, the board voiced concern that the spread of unverified rumors in countries experiencing violent conflict, such as Ethiopia, could “lead to grave atrocities, as was the case in Myanmar.”
A group of Rohingya refugees in the United States filed a $150 billion class-action lawsuit against Meta earlier this month, alleging that Facebook’s entry into the country was a “critical inflection point” in the Rohingya genocide.
Misinformation spread widely on Facebook, often sown by military officials, is broadly seen as having fueled the escalation of ethnic violence in Myanmar that targeted and displaced the country’s ethnic minorities.
According to Facebook whistleblower Frances Haugen, one of the platform’s gravest threats is algorithmically amplified ethnic violence in places like Myanmar and Ethiopia, and Meta’s failure to adequately address it.
“What we witnessed in Myanmar and what we’re seeing now in Ethiopia are merely the first chapters of a dreadful novel that no one wants to finish,” Haugen told Congress in October.
The Oversight Board also ordered Meta to commission an independent human rights assessment of Facebook and Instagram’s role in heightening the risk of ethnic conflict in Ethiopia, and to evaluate how effectively it can moderate content in the country’s languages. Meta defended its safety measures in the country this month, pointing to expanded enforcement of several of its anti-misinformation and anti-hate-speech policies.
The company also said that over the past two years it has strengthened its enforcement capabilities in the region, and that it can now review content in Amharic, Oromo, Somali, and Tigrinya, the country’s four most widely spoken languages.