Meta, Facebook’s parent company, faces serious allegations of negligence after its contractor, Sama, dismissed threats made by Ethiopian rebels against content moderators, according to court documents filed on December 4, 2024. The allegations, part of a broader legal battle in Kenya, reveal troubling details about the treatment of moderators tasked with reviewing graphic content, particularly from Ethiopia, a nation grappling with intense political and ethnic violence.
The case stems from a 2023 lawsuit in which 185 content moderators sued Meta and its contractors, Sama and Majorel, for alleged wrongful dismissal and blacklisting. The plaintiffs, who worked for Sama in Kenya, claimed they were let go after attempting to form a union. Meta later switched contracts from Sama to Majorel, effectively barring the dismissed moderators from reapplying for similar positions.
Among the moderators are individuals who focused on content from Ethiopia and reported being targeted by members of the Oromo Liberation Army (OLA), a rebel group accused of atrocities in the Oromiya region. According to the moderators, they faced direct threats for removing OLA content from Facebook, including graphic videos that violated Meta’s guidelines.
One moderator recounted receiving a message from OLA warning him and his colleagues to stop removing their posts or face “dire consequences.” Another revealed that he received a list of names and addresses, including his own, in a threatening message allegedly from OLA. “Since I received that threatening message, I have lived in so much fear of even visiting my family members in Ethiopia,” he stated in an affidavit.
Sama, accused in court documents of initially dismissing these threats as fabricated, eventually placed one of the targeted moderators in a safehouse. However, the incident raised questions about the company’s duty of care for its employees in high-risk roles. Sama declined to comment on the allegations, while Meta and the OLA also remained silent.
The OLA, an outlawed splinter faction of a formerly banned opposition group, has been linked to numerous attacks in Ethiopia’s Oromiya region, including the killing of civilians. The rebel group’s grievances stem from longstanding claims of marginalization of the Oromo community, Ethiopia’s largest ethnic group.
Hate Speech Moderation Challenges in Ethiopia
The case against Sama and Meta goes beyond threats to moderators, highlighting a broader failure to address hate speech on Facebook during Ethiopia’s ongoing conflicts. Moderators involved in the lawsuit allege that they were forced to repeatedly review hateful and violent content that they were unable to remove due to Meta’s policies.
One supervisor stated in her affidavit that she felt “stuck in an endless loop” of moderating harmful content that Meta deemed permissible under its technical guidelines. Experts hired by Meta to curb hate speech in Ethiopia reportedly made recommendations that the company ignored, exacerbating the problem.
This isn’t the first time Meta has faced criticism for its role in Ethiopia’s civil conflicts. In a separate 2022 lawsuit filed in Kenya, the company was accused of enabling the proliferation of violent and hateful posts on its platform, which allegedly fueled tensions during the war between Ethiopia’s federal government and Tigrayan regional forces.
Global Implications for Content Moderation
The Kenyan case against Meta and its contractors carries significant implications for the tech giant’s global operations. Meta relies on content moderators worldwide to review harmful posts, a role often described as emotionally taxing and fraught with risks, especially in conflict zones. The lawsuit underscores the precarious conditions faced by moderators, who are often hired through third-party firms with limited oversight or protections.
Out-of-court settlement talks between Meta and the moderators collapsed in October 2023, setting the stage for a prolonged legal battle. If the moderators prevail, it could force Meta to rethink its relationship with contractors and implement stronger safeguards for its global workforce.
As the lawsuit progresses, the case sheds light on the ethical and operational challenges of moderating online content in conflict zones. It also raises pressing questions about the responsibility of tech companies like Meta to protect not only their users but also the workers tasked with ensuring the safety of their platforms.