The Human Effect of Moderating the Internet

Moderation is only discussed, and criticized, when something goes wrong on the internet. Earlier this year, on Easter Day, Steve Stephens drove around Cleveland, got out of his car, and shot and killed a 74-year-old man he had never met. He recorded the entire attack on his phone and then posted it to Facebook. In April, in Thailand, a man killed his eleven-year-old daughter on the rooftop of a deserted hotel before killing himself, live streaming it all on Facebook. Earlier in the year, Naika Venant, a 14-year-old living in a foster home, tied a scarf to a shower's glass doorframe and hanged herself, streaming the whole suicide in real time on Facebook Live. In a similar manner, a Georgia teenager took pills and placed a bag over her head to suffocate herself, streaming the entire attempt on Facebook Live. She survived only because viewers of the livestream called the local police, who arrived just in time to save her life.

These events are the extreme, headline-making cases for the average person. They are truly terrible events that should never have reached the audience they did, and that should have been dealt with before anyone saw them or was affected by them.

That is the job of an estimated 150,000 people: to look at this sort of content, remove it, and deal with it in the correct manner. Sometimes accidents happen and things slip past. These are your content moderators.

Content Moderation

Content moderation is the practice of carefully monitoring content online to see whether it complies with the terms of use of internet service providers and social networking companies such as Google, Facebook, and Twitter. The job is performed by human moderators who, aided by algorithms and AI, remove any offensive material that crosses their path.

This requires them to review any content that has been reported or circulated before deciding whether to delete or keep it. Content moderation is perhaps the silent guardian of the internet: it stays in the background, but its existence is critically important to the average user, ensuring a safe and clean experience whenever they are online. The average user is therefore unaware of how much effort and human labor goes into making their browsing experience run smoothly.

Think of any distressing video or content you have watched, and think of how it made you feel. Now imagine a job where you have to view distressing, offensive, and downright wrong material in order to get rid of it.

The Content That We Don’t Get to See

When you think of content moderation, you might brush it off as what community moderators do, removing or approving posts in Facebook groups or Reddit communities. And you're not wrong to think that. But it goes a little further than, say, a risqué Halloween picture of a group of friends, or party photos depicting drinking or drug use. This type of content, while it may be offensive to some, is pretty harmless compared with what moderators really have to deal with. The other content moderators face daily ranges from child pornography, rape, and torture to bestiality, beheadings, gore, and videos of extreme violence.

While this type of content may be in the minority, think of how much content is posted or shared every day. According to Brandwatch, a social media intelligence firm, about 3.2 billion pictures are shared every single day. On YouTube, 300 hours of video are uploaded every minute. Around 6,000 tweets are posted every second, which amounts to roughly 500 million tweets a day. So if even two percent of those images are inappropriate, that means 64 million posts violate terms-of-service agreements every day.
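The figures above are just arithmetic on Brandwatch's estimates, and the two-percent rate is an illustrative assumption rather than a measured one. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the volume figures cited above.
# All inputs are the article's own estimates, not fresh measurements.
images_per_day = 3_200_000_000          # Brandwatch: pictures shared daily
tweets_per_second = 6_000               # cited tweet rate
violation_rate = 0.02                   # assumed: 2% of images violate ToS

tweets_per_day = tweets_per_second * 60 * 60 * 24
violating_images_per_day = int(images_per_day * violation_rate)

print(tweets_per_day)            # 518,400,000 -- roughly the "500 million" cited
print(violating_images_per_day)  # 64,000,000
```

Even at that conservative assumed rate, the volume is far beyond what any plausible workforce could review post by post, which is why moderation queues are triaged by reports and automated flagging first.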

“There was literally nothing enjoyable about the job. You’d go into work at 9 a.m. every morning, turn on your computer and watch someone have their head cut off,” one Facebook moderator told the Guardian earlier this year. “Every day, every minute, that’s what you see. Heads being cut off.”

The Story of One Such Content Moderator

Such is the life of any moderator. Henry Soto moved from Texas to Washington in 2005 with his wife Sara so that he could take a job at Microsoft. Ten years later, Henry has trouble spending time with his son, not because his child is unruly or because he is distant from him, but because just looking at his child triggers terrible images in Soto's head. This was reportedly because of the images he had seen while watching, reviewing, and moderating content for Microsoft, much of which depicted child rape and murder. In 2016, Henry Soto and Greg Blauert (another content moderator) filed a lawsuit against Microsoft. The complaint for damages filed by their lawyers details a horrifying reality in which Soto and Blauert spent hours each day filtering out graphic and obscene content without adequate psychological support.

As the lawsuit details, Soto worked in customer support, where he managed call center operations and worked to repair customer problems, until he was involuntarily shifted to the Online Safety department in 2008. The complaint further details that "neither Soto nor Blauert were warned about the likely dangerous impact of reviewing the depictions, nor were they warned that they may become so concerned with the welfare of the children that they would not appreciate the harm the toxic images would cause them and their families."

“He was initially told that he was going to be moderating terms of use, that’s all,” says Ben Wells, the attorney representing Soto in the lawsuit. “And then he started out and quickly learned that this was horrible content. While he became extremely good at what he did, and had good employment reviews, by 2010 he was diagnosed with PTSD.”

Soto finally quit when he saw a video of a young girl being raped and killed. Even though the internet likes to make fun of "triggers," Wells, Soto's attorney, explains that a trigger turns the videos back on in your brain, and you watch them again and again and again. Someone who has viewed enough harmful material might live in a continuous state of PTSD triggering. You could be walking through the park or spending time with your children, and the images you've seen could replay in your head, bleeding into real life. Soto was affected by the PTSD the most, and the issue affected his family as well. He sometimes can't be around his family, especially his son. It is hard for him to be around computers and children, so much so that his son can't have friends over.

And even this is only the tip of the iceberg of what Soto really goes through.

The Expert(s) Weigh In

Sarah T. Roberts, an associate professor of information studies at UCLA and author of "Behind the Screen: Digitally Laboring in Social Media's Shadow World," is one of the handful of people who actually study the content filtering that large companies undertake. Some US companies outsource content moderation to countries like the Philippines, where Roberts conducted research for her book. Based on that work and her research in the US, she believes that Soto's case is the first of its kind in which an in-house employee has brought a suit against a prominent internet company.

Roberts backs up Soto's claims regarding his PTSD and his triggering from offensive images, saying they match what she has encountered throughout her research. "The way that people have put it to me over the years is that everybody has 'the thing' they can't deal with," she says. "Everyone has 'the thing' that takes them to a bad place or essentially disables them."

On one occasion in her research, Roberts was told by one of the moderators that they simply couldn't listen to any more videos of people screaming in pain. The audio was the part they could not take. Splitting audio from video to make reviewing content easier is a common practice among those involved in moderating, especially at Microsoft. "People have started to develop all sorts of strange reactions to things they didn't even know existed," Roberts says.

“If this individual who couldn’t handle screaming went and saw a movie where a character was doing just that in the midst of a violent scene, it could trigger a variety of different responses psychologically and physiologically,” says Roberts.

Roberts further found that less severe but still significant effects of content moderation had taken their toll on these people. Some admitted that they had been drinking more, or had trouble developing close relationships because something would suddenly flash before their eyes. "I can't imagine anyone who does this job being able to just walk out at the end of their shift and be done," one moderator told Roberts. "You dwell on it whether you want to or not."

We'll be doing a follow-up on this story very soon!

