The Taxing Responsibilities of Facebook’s Content Moderators

Global Engagements Fellowship

Recently, I was scrolling through Twitter when I came across a thread of tweets from journalist Alex Heath about Facebook’s content moderators. A bit graphic, but it highlights how social media is used today as a platform to spread harmful or unlawful material:

Facebook has been making headlines often for its role in world affairs, notably the United States’ 2016 presidential election, in which Russian affiliates influenced voters through widespread propaganda. Now, Facebook is back in the news for its role in the current Myanmar crisis, where U.N. human rights experts cite Facebook as the leading platform used to spread hate speech aimed at Rohingya Muslims. The attack on the group is now being regarded as a genocide.

Basically, Facebook is not doing so hot right now. Facebook currently employs 4,500 ‘content moderators’ to review flagged content, with recent announcements that it will hire about 3,000 more. Employees are given two weeks of training before they are released to jobs that require viewing content that often includes extreme graphic violence. The work takes such an emotional and psychological toll on the moderators that Facebook offers its employees several avenues for therapy, yet most still leave the job within a year or so.

Beyond the extremes of human depravity, moderators also review and decide what to do with content that deals with controversial issues. Hate speech and the infamous fake news posts are among these, making the boundaries of free speech a topic of debate. A slide from Facebook’s guidance for moderators reads:

We aim to allow as much speech as possible but draw the line at content that could credibly cause real world harm. People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways. We aim to disrupt potential real-world harm caused from people inciting or coordinating harm to other people or property by requiring certain details to be present in order to consider the threat credible. In our experience, it’s detail that helps establish that a threat is more likely to occur.

The number of social media users is growing exponentially. We connect with old friends and family members, share experiences and memories with loved ones, and often rely on these platforms as our main source of news. The large platforms provide many benefits, but their potential for attracting an audience has made them a striking source of problems as well. It is interesting to follow how we use social media to our advantage, which I think says a lot about human nature.
