‘Bestiality, beheadings, suicide:’ Graphic content gave Facebook moderator PTSD, lawsuit says

The Facebook “like” symbol is illuminated on a sign outside the company’s headquarters in Menlo Park, California. A Facebook contractor who moderated graphic, violent posts says her job gave her post-traumatic stress disorder and that Facebook did little to protect her, according to a lawsuit filed in San Mateo. AP

There’s nothing inherently violent about a content moderator’s workspace at Facebook. But what they witness on their computer screens is a different story, according to a lawsuit filed in California against the social media giant last week.

Inside Selena Scola’s cubicle in California’s Silicon Valley, she watched thousands of acts of depravity and violence to weed graphic content out of Facebook users’ feeds, leaving her with “significant psychological trauma and post-traumatic stress disorder,” according to the lawsuit in which Scola is the lead plaintiff. The complaint alleges the company did little to mitigate the impacts of the disturbing content.

Going into a cold building or touching a computer mouse can now trigger Scola’s post-traumatic stress, the lawsuit said. Loud noises and violence on TV can bring about her symptoms, too.

Scola is not alone in her exposure to such graphic content: Thousands of other Facebook content moderators (many of them contractors like Scola) witness similar videos, livestreams and images for hours a day as they work to eliminate graphic content posted to the platform — including “child abuse, rape, torture, bestiality, beheadings, suicide, murder, and other forms of extreme violence,” the lawsuit said.

“Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job,” Korey Nelson, an attorney involved in the lawsuit, said in a statement.

Facebook told McClatchy that it is reviewing the lawsuit, which is seeking class-action status.

“We recognize that this work can often be difficult,” Bertie Thomson, director of corporate communications at Facebook, said in a statement. “That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”

Starting in 2006, Facebook worked with other tech companies to create a set of workplace standards to protect content moderators from the impacts of encountering child pornography, according to the lawsuit. Those standards include compulsory counseling, tools to alter graphic content and training to recognize the signs of post-traumatic stress.

“It is well-documented that repeated exposure to such images can have a profoundly negative effect on the viewer,” Nelson said.

But while other tech companies put those standards into practice, Facebook “ignores” those standards and “affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma,” the lawsuit said.

The lawsuit said in a footnote that it didn’t contain specific details of Scola’s time at Facebook because Scola “fears that Facebook may retaliate against her using a purported non-disclosure agreement” if she reveals too much information.

A moderator who spoke anonymously to The Guardian (and who is cited in the lawsuit) said he began his job after a two-week training course and was paid $15 an hour to weed out terrorist content that had been posted on the platform, according to the British newspaper.

“There was literally nothing enjoyable about the job,” the moderator said, according to The Guardian. “You’d go into work at 9 a.m. every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”

Filed in a California state court in San Mateo County last Thursday, the lawsuit accuses Facebook of negligence and failure to maintain a safe workplace. The lawsuit asks that the tech company create a medical monitoring fund to test and treat moderators who develop post-traumatic stress.

“Facebook has an obligation to provide its content moderators with a safe workplace,” William Most, another lawyer in the case, said in a statement. “Other tech companies implement safeguards to mitigate trauma to content moderators. Facebook must be held to the same standard.”

Scola, a San Francisco resident, worked for nine months at Facebook starting in June 2017, the lawsuit said. She was employed as a contractor through Pro Unlimited, a Florida-based staffing company. Pro Unlimited is also listed as a defendant.

Scola has been formally diagnosed with PTSD, including symptoms such as social anxiety, fatigue and insomnia, the lawsuit said.

Thomson, the Facebook spokesman, said the social media giant requires “companies that we partner with for content review to provide resources and psychological support, including on-site counseling — available at the location where the plaintiff worked — and other wellness resources like relaxation areas at many of our larger facilities.”
