The people policing the internet’s most horrific content

Image caption: Shawn is still trying to process what he had to watch as a content moderator (image copyright: Shawn Speagle)

In this digital self-publishing era, people can record and publish their own content, including a lot of horrific material that clearly breaches websites’ taste and decency guidelines. A growing army of moderators has the unenviable task of sifting through it all, sometimes at considerable cost to their mental health.

WARNING: article contains upsetting content.

Shawn Speagle worked as an online content moderator for six months in 2018. He’s still scarred by the experience.

“One of my first videos that I remember looking at was two teenagers grabbing an iguana by the tail and they smashed it onto the pavement while a third person was recording it.

“And the iguana was screaming and the kids just would not stop until the iguana was just pasted on the ground.”

Shawn was employed by a company called Cognizant in Florida which had a contract with Facebook. He speaks in a slow, considered way, still trying to process what he had to go through.

Image caption: Facebook uses around 30,000 sub-contracted content moderators around the world (image copyright: PA Media)

“I’ve seen people put fireworks in a dog’s mouth and duct tape it shut. I’ve seen cannibalism videos, I’ve seen terrorism propaganda videos,” he continues.

Hearing Shawn speak, it becomes clear why moderation has often been described as the worst job in tech.

Most internet users probably never give these moderators a second thought, yet there are hundreds of thousands of them around the world helping companies weed out disturbing content – ranging from suicide and murder videos to conspiracy theories and hate speech.

And now some are coming out of the shadows to tell their stories.

Media caption: Film explores social media moderation

Shawn decided to speak out, despite having signed a non-disclosure agreement (NDA) – a standard practice in the industry.

These NDAs are also meant to prevent contractors from sharing Facebook users’ personal information with the outside world, at a time of intense scrutiny over data privacy.

But Shawn believes Facebook moderation policies should be talked about openly, because staff end up watching upsetting content that is often left untouched on the platform.

As an animal lover, he was distraught that animal content “was, for the most part, never escalated in any way, shape or form”, meaning that it was never referred for removal.

For content involving humans, the rules were slightly different, but also more convoluted.

The most common outcome was marking a video as “disturbing” and leaving it on the platform. Shawn tells the BBC that, according to Facebook policy, footage showing bodily innards outside a medical setting would result in the video being deleted.

He struggles to recollect any other examples that would result in content removal.

Image caption: The stress of the job led to overeating and weight gain
