“Facebook needs to work on creating an improved alliance with outsourced companies or bring the work in-house, ensuring compliance for a better and safer world.”


Dr Rosalind Jones writes for the Birmingham Business School Blog:


In a recent Channel 4 Dispatches programme – Inside Facebook: Secrets of the Social Network – an undercover reporter gathered footage while working as a trainee content moderator at Facebook’s largest centre for UK content moderation.

Facebook has hit a new low: the programme exposed serious failures in data management and protection. Through poor content moderation, Facebook effectively shields those carrying out criminal activity, those promoting far-right politics built on hate speech and racist content, and those abusing children.

Shocking content remains on Facebook for years, despite being flagged by users as inappropriate. As long as the message accompanying a video or image does not condone the content, trained moderators leave it alone. Such posts can stay on the platform for prolonged periods: footage of one child abuse victim was still being shown six years later. During the programme, an NSPCC official who viewed the footage was visibly upset, observing that while these videos remained on Facebook, the victims continued to suffer from the abuse.

Venture capitalist Roger McNamee, a mentor to Mark Zuckerberg and an early investor in Facebook, spoke out against the practices and said the site’s business model relies on extreme content to make money from online advertising.

These facts are shocking, but I was even more aghast at the three main issues that I saw:

First, essential moderation work is outsourced to another company, in this case CPL Resources in Dublin, which has worked with Facebook since 2010. By now, Facebook should be alert to the dangers of outsourcing such an important part of its business: it is very difficult to ensure adequate staffing and appropriate training, and here those shortcomings have caused long delays even in the most urgent cases.

Second, there appears to be no agreed policy or regulation between governments and social media networks clarifying what counts as “free speech” and what should be moderated and referred to the authorities. Social media platforms and governments need to engage swiftly in debate, develop policies, and produce a clear set of guidelines to ensure consumers’ safety.

Third, worrying views were expressed by content moderators, who revealed that their role was not to regulate or control content and that serious cases, such as when someone is in immediate danger, were not always referred to the appropriate authorities. This reinforces the need for Facebook and other platforms to reassure consumers that they have procedures in place to protect the most vulnerable, and that data is managed appropriately by well-trained employees.

What occurs online affects our society offline. There is plenty of research evidence to suggest that the more we consume violent images, the more dulled we become to their shock value, feeding an appetite for ever more grisly viewing. As the market leader, Facebook has an opportunity to lead the way and be a responsible business. In addressing the issues of online moderation, Facebook needs to work on creating an improved alliance with outsourced companies or bring the work in-house, ensuring compliance for a better and safer world.