The Horrifying Truth of Facebook Moderators

Almost everyone on Facebook is now aware of the “Report for Abuse” button for when you see something disturbing posted. From nudity to other inappropriate content, the button allows users to alert Facebook that something has been posted that shouldn’t be seen. But who actually takes care of these reports? Many of us simply assume that a Facebook employee or team is in charge of deleting inappropriate content. “Pretty much any social media site you can think of uses some sort of moderation to keep abusive content off its page.”

What’s interesting is that many people don’t realize just how disturbing content on the internet can get. Most people think of videos involving violence or nudity, but everything from child pornography to beheadings and brutal violence gets posted on social media sites. This sort of content takes a toll on content-moderation workers, of whom there are an estimated 100,000 worldwide.

In the November issue of Wired, Adrian Chen offers a peek into one of the darkest aspects of the social media industry. A sobering fact: the average length of employment for content moderators is between three and six months, and many don’t even last that long, quitting much sooner.

There is a lot of negative content online, and with an estimated 100,000 moderators worldwide, you would think most of it would be strictly regulated. However, one of the best things about the web is the freedom to post whatever you want under whatever account you like. That freedom brings some positive aspects to social media, and most definitely some negative ones.
