A court in Spain has found a Barcelona-based company that works as a subcontractor for Meta, providing content moderation services for Facebook and Instagram, liable for psychological harm suffered by a worker. The ruling, reported by the local press on Thursday, is the first time a court in Spain has held a content moderation company responsible for the mental disorders suffered by an employee.
A report in El Periódico on Thursday said the ruling, handed down earlier this month, relates to a challenge brought against Meta’s local subcontractor, CCC Barcelona Digital Services, by a 26-year-old Brazilian worker who has been receiving psychological treatment for the past five years as a result of exposure to extreme and violent content on Facebook and Instagram, such as murders, suicides, terrorism, and torture.
The worker in question, who began moderating Facebook and Instagram content in 2018, is said to have suffered a range of psychological harms, including panic attacks, avoidance behaviours, excessive worry about suffering from illnesses, disturbed sleep, difficulty swallowing, and an extreme phobia of death (thanatophobia), according to the newspaper report.
The Barcelona court accepted that the mental problems suffered by the worker were not a common illness but a workplace accident, according to the newspaper. Meta’s subcontractor had treated his absence from work as a common ailment and sought to deny liability for any psychological harm he suffered from reviewing violent content uploaded to Facebook and Instagram.
In a social media post responding to the court ruling, the law firm representing the worker, Espacio Jurídico Feliu Fins, described the outcome as a major win for any worker who suffers mental health problems as a result of the work they do.
“Meta and social media in general must recognize the scale of this problem, and must change their strategy,” the law firm wrote in the post (translated from Spanish). “Instead of pursuing a strategy of denying the problem, they should accept that this horrific reality, which these workers are experiencing, is as real as life itself.
“The day they face it, on that day, everything will change. As long as this does not happen, we will make sure it happens through the legal system. We will proceed step by step, without haste, but without hesitation. And above all, with complete determination. And we will win.”
Meta’s outsourcing of toxic content review to numerous third-party subcontractors, which supply scores of typically low-wage workers to act as human filters for extreme violence and other horrific material uploaded to social networks, has been a source of concerning stories for years. Yet the practice continues.
Back in May 2020, Meta agreed to pay $52 million to settle a US class action lawsuit brought by content moderators working for third-party content review services for its social networks, who claimed that reviewing violent and graphic images had led them to develop post-traumatic stress disorder.
The company is also facing litigation in Africa, where a moderator who worked for Sama, a Meta subcontractor in Kenya, is suing both companies over allegations that include a failure to provide “adequate” psychosocial support.
Meta declined to comment on the ruling issued against its subcontractor in Spain. But the social networking giant has provided some public information about its approach to outsourcing content moderation, saying its contracts with the third parties it works with contain expectations that they will make provisions in areas including counseling, training, and other worker support.
The tech giant also said its contracts require subcontractors to provide 24/7 on-site support with trained practitioners, as well as an on-call service and access to private healthcare from the first day of employment.
Meta also noted that it provides subcontractors with technical solutions aimed at enabling content reviewers to limit, as far as possible, their exposure to the graphic material they are asked to moderate. These tools can be customized by reviewers so that graphic content appears fully blurred, is rendered in black and white, is blurred on the first frame, plays without sound, or is opted out of autoplay, the company said.
However, the company’s background notes did not address the possibility that support services and screening tools could be undermined by strict productivity and performance quotas, which may be imposed on reviewers by subcontractors and which could, in practice, make it difficult for these workers to access sufficient support while still performing at the rates their employers demand.
Back in October, the Barcelona-based newspaper La Vanguardia reported that about 20% of CCC Barcelona Digital Services’ employees were off work as a result of psychological trauma caused by reviewing toxic content. In the article, the newspaper quoted one worker who described the support provided by his employer, the Meta subcontractor, as “extremely insufficient.”
Another report from the same month, in Nacional, discusses the high “success rate” (98%) that workers are told they must achieve, meaning each moderator’s decisions must match those of senior reviewers the vast majority of the time, with workers at risk of being fired if their rate drops, according to the same report.
Obviously, using screening tools that fully or partially obscure the content under review could make it harder for reviewers to hit exacting performance targets. Workers may therefore view tools that could reduce the accuracy of their assessments as risky, and see themselves as falling behind their peers, since doing so could jeopardize their continued employment, effectively discouraging them from taking steps that might better protect them from exposure to psychologically harmful content.
Shift work routinely imposed on content moderation workers may also contribute to the development of mental health issues, as disrupted sleep patterns are known to contribute to stress. Additionally, the routine use of young, low-wage workers in content moderation farms suggests a high risk of burnout is baked into the model, pointing to an industry built around managing toxicity through high churn; or, basically, outsourcing burnout as a service.
However, legal rulings that impose requirements on content review outsourcers to care for workers’ mental health could place limits on that model.
Telus, the Canadian company that owns CCC Barcelona Digital Services, had not responded to a request for comment at press time.