Tuesday, September 25, 2018

Facebook Content Moderator Developed PTSD, Suit Claims

It is no secret that Facebook users upload thousands of videos and images every year that get filtered out under the platform's obscenity and decency guidelines. Just like there's no tooth fairy, there's no content fairy.

Almost all of Facebook's uploaded content is screened by contracted content moderators, and some of them have been emotionally scarred by what they have been forced to view. Now one of those moderators, Selena Scola, has filed a lawsuit claiming she suffers from psychological trauma and post-traumatic stress disorder because Facebook failed to protect her as promised in its own corporate guidelines.

Facebook Failed to Maintain a Safe Workplace

In the lawsuit, Scola claims that during her nine months working as a contracted content moderator, she was bombarded with "videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder." Scola is suing Facebook and its contracting company, Pro Unlimited, for negligence and failure to maintain a safe workplace. The law firm representing Scola hopes to turn the case into a class action, pending judicial approval.

Self-Censorship Has High Emotional Cost to Moderators

Facebook and other internet companies volunteered to self-censor uploaded content back in 2008, and in doing so created industry standards for training, counseling, and otherwise supporting content moderators. According to the lawsuit, most other companies have done an adequate job of meeting those standards, so Facebook should be able to as well; instead, it hasn't even followed the guidelines it helped to create.

Facebook is not the only company whose content moderators have reported PTSD. Last year, two Microsoft content moderators sued over similar allegations. According to the attorney in that case, "It's bad enough just to see a child get sexually molested. Then there are murders. Unspeakable things are done to these children." The plaintiffs in that case had been moderators for years, and their condition deteriorated to the point that they couldn't look at their own children, or even go out in public without suffering panic attacks. That lawsuit is still being litigated.

But perhaps Facebook has taken on more than it can bear. With over 2 billion uploads in more than 100 languages every day, there is a lot of content to manage, even after artificial intelligence and algorithms pre-screen it. Since 2017, the company has increased its stable of moderators from 3,000 to 7,500, yet those moderators remain overburdened and exposed to potential psychological harm. And according to some experts, relying on contractors rather than employees makes it even harder for Facebook to monitor moderators' working conditions and mental health, raising the risk that the company won't recognize the harm it is causing until it is too late.

If you or someone you love has suffered from exposure to graphic content as a content moderator, contact a local personal injury attorney, who can review the facts of your case and help you decide on a legal strategy. You may be entitled to compensation, and you may not need to pay any attorney fees to find out.

from Injured http://blogs.findlaw.com/injured/2018/09/facebook-content-moderator-developed-ptsd-suit-claims.html
