Motaung alleges that he was paid as little as $2.20 an hour to view graphic content that left him with PTSD. He describes it as “emotionally and mentally devastating”: “I went in OK and went out not OK,” Motaung said in a statement shared by the Real Facebook Oversight Board, a group of independent civil rights advocates and experts. “It changed the person I was.”

Motaung began to push to form a union that would allow the moderators to advocate for better pay and more support for their taxing work. Just six months into the job, he was fired. So he decided to sue his former employer and Meta.

Despite Meta’s months-long effort to have it dismissed, on February 6 the Kenyan Employment and Labour Relations Court ruled that Motaung’s case against the social media company can move forward, meaning that Meta can be held accountable for the psychological damage and labor violations faced by Motaung and other outsourced content moderators. Justice Jacob Gakeri ruled that Meta “shall not be struck” from the case, according to Kenyan news site Business Daily, opening the company up to its first substantial labor challenge outside the US.

As of 2020, Meta was estimated to have some 15,000 moderators spread across the world through outsourcing companies. In Kenya, Meta’s outsourcing partner was Sama, though Sama’s contract with Meta will end in March of this year.

Should the case succeed, it could allow other large tech companies that outsource to Kenya to be held accountable for the way staff there are treated, and provide a framework for people in other countries seeking to challenge tech giants.

The case, filed by UK-based nonprofit Foxglove Legal and the Kenyan law firm Nzili and Sumbi Advocates on behalf of Motaung, alleges that the working conditions violate Kenyan law and constitute, among other things, forced labor and human trafficking because workers were “coerced by a threat of penalty to accept the unlawful circumstances they found themselves in.”

Meta had argued that it should not be subject to Kenyan law because it is a foreign corporation that does not operate in Kenya. Meta and Sama did not respond to a request for comment for this article.

“These companies [Meta and other Big Tech firms] seek to enter and profit from a lot of jurisdictions while simultaneously saying that they don’t answer to the courts,” says Cori Crider, director of Foxglove Legal.

Motaung’s lawyer, Mercy Mutemi, argues that Meta’s content moderation operations in Nairobi, its small group of staff, and the fact that it makes money from Kenyan advertisers on its platform are proof that the company operates within the country. “They make money from Kenyans,” she says. Meta-owned Facebook had 9.95 million users in Kenya in 2022, and Instagram had 2.5 million.

The case is the first brought by a content moderator outside the company’s home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US-based moderators who developed PTSD from working for the company. But previous reporting has found that many of the company’s international moderators doing nearly identical work face lower pay and receive less support while working in countries with fewer mental health care services and labor rights. While US-based moderators made around $15 per hour, moderators in places like India, the Philippines, and Kenya make much less, according to 2019 reporting from the Verge.
“The whole point of sending content moderation work overseas and far away is to hold it at arm’s length, and to reduce the cost of this business function,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, who authored a 2020 report on outsourced content moderation. But content moderation is critical for platforms to continue to operate, keeping off the platform the kind of content that would drive away users and advertisers. “Content moderation is a core vital business function, not something peripheral or an afterthought. But there’s a powerful irony from the fact that the whole arrangement is set up to offload responsibility,” he says. (A summarized version of Barrett’s report was included as evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that other outsourcers, like those in the apparel industry, would find it unthinkable today to say that they bear no responsibility for the conditions in which their clothes are manufactured. “I think technology companies, being younger and in some ways more arrogant, think that they can kind of pull this trick off,” he says.

A Sama moderator, speaking to WIRED on the condition of anonymity out of concern for retaliation, described having to review thousands of pieces of content daily, often deciding what could and could not stay on the platform in 55 seconds or less. Sometimes that content could be “something graphic, hate speech, bullying, incitement, something sexual,” they say. “You should expect anything.”

“This is about the wider complaints about the system of work being inherently harmful, inherently toxic, and exposing people to an unacceptable level of risk,” Crider says. “That system is functionally identical, whether the person is in Mountain View, in Austin, in Warsaw, in Barcelona, in Dublin, or in Nairobi. And so from our perspective, the point is that it’s Facebook designing the system that is a driver of injury and a risk for PTSD for people.”

Crider says that in many countries, particularly those that rely on British common law, courts often look to rulings in other, similar nations to help frame their own, and that Motaung’s case could be a blueprint for outsourced moderators in other countries. “While it doesn’t set any formal precedent, I hope that this case could set a landmark for other jurisdictions considering how to grapple with these large multinationals.”

Updated 2/6/2023 10:00 ET: This piece has been updated to reflect the decision of the court in Kenya to include Meta in Motaung’s ongoing case.