A key moderation partner for Meta is shutting down its African hub, which handled some of the most harmful content on Facebook

Meta's main content moderation partner in East Africa, responsible for reviewing harmful content, is discontinuing its work for the tech giant. Arnd Wiegmann/Reuters
  • A third-party contractor for Meta in East Africa is shutting down its content moderation arm.
  • Sama's Kenya office was responsible for reviewing some of Meta's most harmful content.

Meta's key content moderation partner in East Africa, which is responsible for policing harmful content like child abuse and beheadings, is discontinuing its work for the tech giant.

Sama, a third-party contractor for Meta, announced on Tuesday that it is shutting down parts of its business, including its content moderation work for the social media giant, from March 2023.

It said it is letting go of about 3% of its staff, roughly 200 employees, at its Nairobi office, a decision it attributed to the "current economic climate."


The firm said it is offering support packages to affected employees and encouraging those eligible to apply for roles at its other offices in Kenya and Uganda. Staff who have been let go will receive severance packages as well as wellbeing support for up to 12 months after their last day at the company.

"We respect Sama's decision to exit the content review services it provides to social media platforms," a Meta spokesperson said in a statement to Insider. "We'll work with our partners during this transition to ensure there's no impact on our ability to review content."


Meta first contracted Sama in 2017 for data labeling and artificial-intelligence training, work for which Sama hired some 1,500 employees, the Financial Times reported.

The company has faced allegations of mistreating employees: in 2022, a fired content moderator filed a lawsuit against Meta and Sama alleging human trafficking, forced labor, and union busting.

The former employee, Daniel Motaung, said he and his colleagues at Sama were exposed to graphic content, including child sexual abuse, animal torture, and beheadings, while being paid $2.20 per hour. Some of his peers developed PTSD as a result, the lawsuit alleged.

Majorel, a Luxembourg-based firm, will take over content moderation for Meta in the region, sources with knowledge of the matter told the FT. Majorel has previously handled content moderation for TikTok.

Majorel has also faced allegations from employees of poor working conditions. In August 2022, Insider reported that current and former TikTok moderators employed through Majorel claimed they experienced severe psychological distress. Some workers also alleged that Majorel created an environment of near-constant surveillance and unrealistic productivity targets.
