Facebook is vowing to improve its system for reviewing and screening out violent and disturbing content after a report said some of its moderators are low-paid contractors who develop PTSD-like symptoms from the grueling work.
The social media giant said Monday that it will implement "a rigorous and regular compliance and audit process" for the contractors that employ people to moderate Facebook content — including posts showing murder, torture, graphic sex and bestiality. Facebook's vice president of global operations, Justin Osofsky, said the company would also standardize its contracts for these jobs and would hold "more regular and comprehensive focus groups with vendor employees."
Osofsky posted the announcement on Facebook's news page after The Verge published an investigation into the traumas experienced by some moderators at a facility in Phoenix. The moderators, who are employed by a contractor called Cognizant, said they spent their days sorting through violent, disturbing and inflammatory content while making just $28,800 a year.
The contractors reported seeing coworkers smoking marijuana on the job or having sex in stairwells for release from the misery of the work. Some employees said they developed severe anxiety that lingered long after they left the jobs. Facebook's statement on Monday did not acknowledge any of these specific allegations.
This isn't the first time Facebook's moderators have said they were scarred by what they saw on the job. CNET reports that a lawsuit filed in a California state superior court last year claims moderators suffered from post-traumatic stress disorder after viewing thousands of posts that included "child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder." The suit was filed on behalf of a moderator who was employed by Pro Unlimited, another contractor used by Facebook. Facebook said in a court filing that the employee did not have a right to sue because she was an independent contractor. The case is pending.
Facebook uses several contracting firms to hire, train and supervise moderators, and said it wants to hear more from them about how the process can be improved. "However, given the size at which we operate and how quickly we've grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis," Osofsky said.