The Secret Behind Facebook Moderators

by Laura Spencer

Improvements have been made, and continue to be made, to how Facebook monitors inappropriate and illegal content on our timelines.

But how have these changes been implemented?

Well, Facebook has hired contractors to monitor images and content that users report as inappropriate. However, these employees are now suffering psychological trauma from witnessing disturbing images day in, day out.

But what’s worse: leaving these images for Facebook users to see, or taking them down and traumatising those left with the job?

The job of moderating Facebook’s content used to require contractors to scan through 1,000 images per day — more than one every 30 seconds over an eight-hour shift. This quota was removed to lessen the psychological impact of viewing so many brutal images every day. However, the requirement is now to view at least 400 to 500 images per eight-hour shift, which still equates to roughly one image every minute.

Not only is the job leaving psychological scars on the contractors, it is leaving some ‘addicted’ to graphic content, with some keeping personal collections of illegal, inappropriate and explicit images for their own pleasure. Constantly reading hate speech and fake news has also shifted others towards far-right opinions.

Algorithms are also being used to flag potentially inappropriate conversations between adults and minors, and this content is having a devastating impact on the contractors who view it. A former contractor said of reading these conversations that “90% were sexual and were also violating and creepy”. Interviews with a variety of moderators found a common pattern of sexual exploitation: rich, white men from Europe and the US targeting children in countries such as the Philippines, offering $10 or $20 in exchange for sexual images.

Although Facebook is trying to find a way to take down explicit images, is exposing contractors to those images causing more damage than it prevents? Perhaps Facebook should rely more on algorithms to detect inappropriate images and remove explicit content automatically once it is detected, rather than inflicting psychological harm on contractors. The pushback from users would be that this suppresses their freedom of speech and expression. But perhaps, if they understood the harm the manual process was causing others, they would be willing to suffer a brief annoyance rather than inflict such harm.
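The idea of leaning more heavily on algorithms could be sketched as a confidence-threshold pipeline: content the classifier is sure about is removed or kept automatically, and only the ambiguous middle band ever reaches a human reviewer. This is a purely hypothetical illustration — the thresholds, scores and routing here are assumptions for the sketch, not Facebook’s actual system:

```python
# Hypothetical sketch: route content by classifier confidence so that
# humans only see the ambiguous cases. Thresholds are illustrative.

AUTO_REMOVE = 0.95   # assumed threshold: confident enough to remove automatically
AUTO_KEEP = 0.10     # assumed threshold: confident enough to leave up automatically

def route(score: float) -> str:
    """Decide what happens to content given a classifier score in [0, 1]."""
    if score >= AUTO_REMOVE:
        return "remove"        # algorithm is confident: take it down, no human needed
    if score <= AUTO_KEEP:
        return "keep"          # algorithm is confident: leave it up, no human needed
    return "human_review"      # ambiguous: queue for a moderator

# Of these three items, only the ambiguous one reaches a human reviewer.
decisions = [route(s) for s in (0.99, 0.02, 0.60)]
```

The wider the confident bands are, the fewer disturbing images a moderator has to see — at the cost of more automated mistakes, which is exactly the trade-off the paragraph above describes.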
