More than 1,100 Kenyan content moderators are set to lose their jobs after Meta terminated a key outsourcing contract with Sama, the US-headquartered firm that has operated one of the largest moderation hubs for Facebook in sub-Saharan Africa.
Sama confirmed the looming layoffs in a statement, saying Meta had opted not to renew the agreement under which moderators in Nairobi reviewed and removed violent, hateful and otherwise harmful material from Facebook and other Meta platforms. The decision follows years of mounting criticism, legal challenges and public scrutiny over working conditions at the moderation centre.
Since 2019, Sama’s Nairobi office has served as a frontline filter for some of the most disturbing content posted online, including graphic violence, child abuse and extremist propaganda. Former employees say the work came at a steep human cost, with staff repeatedly exposed to traumatic imagery for relatively low pay.
In 2023, nearly 200 moderators who had been dismissed by Sama filed a lawsuit in Kenya alleging unfair termination and what they described as inhumane working conditions. Court filings and public statements from workers accused the company of forced labour practices, irregular pay and inadequate psychological support, despite the extreme nature of the material they were required to view.
A separate complaint lodged in 2022 by a former South African moderator added to the pressure, painting a similar picture of mental health strain and insufficient safeguards. Workers involved in the cases say they suffer from anxiety, depression and symptoms consistent with post-traumatic stress disorder, and are seeking compensation and stronger protections.
Sama has consistently rejected allegations of abuse, insisting that it pays a living wage in Kenya, offers full benefits and provides access to professional counselling. The company has portrayed itself as a socially responsible employer that helped bring thousands of digital jobs to East Africa.
Meta, for its part, said it ended the contract because Sama no longer met its operational standards. The company indicated that it is increasingly turning to artificial intelligence and machine learning models to detect and remove harmful content, while maintaining that human moderators will still play a role in its global safety operations.
The Nairobi layoffs highlight the human toll behind social media’s promise of safe online spaces, and raise fresh questions about how global tech giants treat the workers who police the darkest corners of the internet.