Source: The Guardian

PTSD, depression and anxiety: why former Facebook moderators in Kenya are taking legal action

Ex-staff at outsourcing company Samasource claim they vetted unspeakably graphic videos in harsh conditions


Smashing bricks against the side of your house is not a normal way to wind down after work. Nor is biting your own arm or being scared to go to sleep. But that was the reality for one of the 144 people diagnosed with severe post-traumatic stress disorder after moderating horrific content for Facebook.

The young mother in her 20s did the job in Nairobi for more than two years, during which she claims she had to vet unspeakably graphic videos. These included extreme sexual deviancy and bestiality, child abuse, torture, dismemberment and murder, which caused her to vomit, according to court filings. On one occasion she recalled having to watch a person having sex with a snake.

The woman was one of hundreds of young Africans, many from Kenya, who worked from 2019 to 2023 at an outsourcing company, Samasource, used by Facebook’s owner Meta to protect its hundreds of millions of users from the worst of the torrent of images and video being uploaded to the social media platform every minute.

According to a compensation claim filed in Kenyan courts by 185 of the moderators, they toiled through day and night shifts in a facility with glaring bright lights, chilly air conditioning, uncomfortable seats and screens set to full brightness. Performance levels were closely monitored and contracts could be terminated if they dipped. They typically had a minute to evaluate each piece of content, but it would be seared into their minds for much longer.

From the outside, the Samasource offices look like a regular corporate workplace. At the front of the building, which sits in a bustling business park beside a busy road, there is a fountain and a poster that reads “the soul of AI”. It would be a rare passerby who would suspect the darkness that coursed through the screens within.

A woman in her 30s told expert witness psychiatrists that she worked on one video that showed a man being dismembered limb from limb until he died. She cried and walked out of her workstation to compose herself, but the team leader followed her and asked her to get back to work. Other content she had to handle included videos of summary executions during Ethiopia’s civil war. She developed migraines, flashbacks and nightmares. While she used to be “pleasant and happy”, now “she is sad and cries most of the time even without a trigger, she is easily upset and tends to argue with her husband all the time”, a psychiatric assessment said.

And then there was the moderator who was diagnosed with trypophobia, the fear of seeing a pattern of holes such as in a honeycomb, possibly as a result of repeatedly seeing video of maggots crawling out of holes in a decomposing human hand during her three years moderating Facebook content. These images would pop up frequently and cause her to hyperventilate, scream and cry.

People walk outside the building that houses Samasource in Nairobi, Kenya. Photograph: Brian Otieno/The Guardian

Moderators said their managers were “positioned all over the floor”, monitoring aspects of their work such as speed and accuracy on tasks, and time spent on breaks.

Meta has declined to comment on the specific claims while litigation is ongoing, but said it requires outsourcing firms to offer counselling and healthcare and to pay above local industry standards, and that it provides technical measures to limit exposure to graphic material, such as blurring and an opt-out from the autoplay function in which videos and pictures play in a nonstop stream. Samasource declined to comment.

The diagnoses of PTSD, depression and generalised anxiety disorder caused by their work as Facebook moderators are all part of the hidden human cost of policing the internet.

It is common to assume that AI now takes care of content moderation, but that is not yet the case. Some moderators are still tasked with vetting a torrent of vile material flagged as a concern on social media platforms while others are training the AI that may one day take over.

Despite AI advances, Facebook has a 40,000-strong army of content moderators. Labelling images and text as disturbing or traumatic is one of the costs of delivering generative AI to consumers, as the systems need to be taught how to moderate. It is one of the reasons, alongside the need for huge amounts of water to cool datacentres and the power demands to run them, that AI is described by some observers as an “extractive industry”.

In the case of Facebook, the potential risks to moderators were to some extent known before the 185 workers in Kenya bringing their legal action were tasked with vetting grisly content. In September 2018, a US-based moderator, Selina Scola, and others sued Facebook and in 2020 won a $52m (£41m) payout in compensation for her and the more than 10,000 former and current content moderators across the US for their exposure to graphic content. In 2019, Chris Gray, a former Facebook content moderator in Dublin, began a similar legal action against the platform and the outsourcing company he worked through. His case is yet to reach trial.

But the Kenyan workers were in harness during the pandemic, working alone in hotel rooms away from their families, and then back in the facility until 2023.

Meta and Samasource provided wellness counsellors, but they were not medical doctors, could not provide psychiatric support and were not trained to deal with people facing mental health crises, the claim alleges. Requests for psychiatric care under employee medical insurance were “ignored”, yet Meta and Sama “were aware of the mental health harm and the symptoms the petitioners were grappling with”, the claim asserts.

As many of the moderators were young, the sexual violence against children and adults they had to review was sometimes their first exposure to sexual content. Some became desensitised to sexual urges and were repulsed by their own sexual partners; others became addicted to sex and pornography.

Meanwhile, Ethiopians tasked with handling content from the war in their home country were targeted by one of the warring parties, which threatened to harm them if they continued deleting its posts. One moderator had to spend months in a safe house. Religious moderators felt shame at having to watch sexual content.

After one moderator, Daniel Motaung, launched a legal action, other workers were told that associating with him would lead to dismissal, surveillance intensified and dismissals increased. Then, in January, there were mass redundancies, although these were paused by the courts in Kenya. The moderators bringing the current legal action believe the mass redundancies were retaliation for Motaung’s case.

The grounds for the case against Meta and Samasource include breaches of Kenyan laws against forced labour, human trafficking and modern slavery, as well as unfair labour practices, intentional infliction of mental health harm, curtailing the right of access to justice, employment discrimination and unlawful redundancy.