Invisible Risks: The World of Content Moderators

Unveiling the Unseen: Understanding the World of Content Moderators

Tales of Impact

“Being safe online means understanding the immense benefits and potential that the internet brings to our daily lives, balancing it with a critical understanding and awareness of what can go wrong and how others with more malicious intentions seek to be opportunistic and take advantage of our digital independence. It is about a whole-family approach: ensuring and enabling parents, teachers, children and young people to safely and enjoyably use each and every facet connectivity has to offer.” 

-Jeffrey DeMarco, Associate Director, Psychology, Centre for Abuse and Trauma Studies, Middlesex University 

“…when you are constantly reviewing abuse (especially child abuse), it feels like you become desensitised. You’re shocked and outraged at first. Then you realise this is something you see on repeat every day and it becomes your new normal. We are encouraged to take extra breaks if we come across something that shook us more than other content. But regardless of the break, those images stick with you for a very long time.”

-Content Moderator (name withheld to protect privacy).

Content moderators are the men and women keeping the internet safe for all of us. Tasked with monitoring and managing user-generated content on digital platforms such as social media, forums, websites, and online communities, they spend their working days reviewing and removing traumatic images and videos of child sexual abuse and ensuring that content adheres to each platform’s guidelines, which typically include rules against harmful or inappropriate content.

The research project ‘Invisible Risks: Combating Secondary Trauma to Safeguard Children’, led by Middlesex University and funded by the Tech Coalition Safe Online Research Fund, sheds light on the often-overlooked plight of content moderators. It aims to understand the impact of constant exposure to traumatic child sexual abuse material on those tasked with this challenging job.

The Pioneering Project Timeline

Launched in November 2021 and scheduled to conclude in October 2024, the project sees Middlesex University, in collaboration with INHOPE and other sector-specific organisations, delve deep into the experiences of content moderators. Its aim is to uncover the invisible risks that content moderators face and to develop a pilot intervention to safeguard their mental health and well-being.

The Human Side of Content Moderation: A Glimpse into the Challenges

Content moderation comes with its own set of challenges, including exposure to traumatic content, high accuracy and processing targets, low wages, insufficient counselling, and confidentiality clauses. This combination can lead to psychological distress, burnout, and other severe consequences for moderators. One content moderator shares their experience: “I’ve often heard how easily people discredit content reviewers as if it’s not a real job. What they don’t understand is how incredibly mentally draining it is to constantly soak up the worst of our society, reviewing their posts. How much you absorb while reading people arguing and hating each other, how easy it is to wish people death. And if you review gore media, or any type of abuse, that these images stay imprinted in your brain forever.”

Content moderators play a crucial role in safeguarding the digital world, but little is known about the toll their job takes on their mental health. As Dr. Elena Martellozzo, Associate Professor in Criminology, and Jeffrey DeMarco, Associate Director for the Centre for Abuse and Trauma Studies, highlight: “The constant exposure to disturbing or offensive content can take a toll on a person’s mental health.”

The research also highlights the diverse working conditions of content moderators worldwide, ranging from office settings in developed nations to mall-based workplaces in developing countries. Despite these differences, one constant remains: the job is often poorly paid. Moderators are tasked with reviewing content flagged for potential abuse, facing everything from banal material to highly graphic and disturbing content. Jeffrey DeMarco reflects on the nature of the work: “That often means that you are bombarded with the heaviest, most disturbing content straight after logging in…higher targets often mean less detailed investigation time and poorer quality.”

From Insight to Intervention: Building a Toolkit for Moderators' Well-being

Building on this research, the next step is to develop a pilot intervention: a toolkit of online modules addressing different aspects of content moderation. Developed with input from mental health professionals, the modules aim to provide tailored support for moderators, focusing on areas such as managing negative thoughts.

A content moderator suggests practical measures for improvement: “To support content moderators, companies can provide comprehensive mental health resources and limit exposure to disturbing content.”

Jeffrey DeMarco shares insights into the project’s progress: “We are striving to create additional resources for commercial content moderators to ensure their well-being and work-life balance while supporting them with a new online psychoeducational resource to help them deal with the daily challenges of viewing online content such as child sexual abuse material.”

Bridging the Gap: Working Together for Impact Across the Ecosystem

Recognising the integral role of partners, the project collaborates with INHOPE and the Internet Watch Foundation, drawing on their expertise and networks to understand moderators’ experiences. Both organisations play advisory and gatekeeping roles, ensuring the project reflects the real-world challenges that content moderators face.

The research delivers actionable insights into the psychological impact of working with child sexual abuse material, applicable across tech companies and different staff levels. The project strives to identify the needs of content moderators, propose actions for change, and inform recommendations for supporting workers dealing with difficult material. Moreover, the findings may pave the way for a Continuing Professional Development (CPD) course for individuals interested in content moderation or child protection.

Shaping the Future: AI and Human Moderation in Harmony

While acknowledging the role of Artificial Intelligence (AI) and machine learning in reducing online child sexual exploitation and abuse, the project emphasises the importance of human moderation. Developing better automated systems is crucial, but so is ensuring the well-being of those working in this challenging field.

‘Invisible Risks: Combating Secondary Trauma to Safeguard Children’ is not just a research project; it’s a step towards understanding and supporting the unrecognised heroes working to protect the digital safety of our children. Thanks to the support of the Tech Coalition Safe Online Research Fund, Middlesex University is leading the way in creating a safer digital space for all.

Images: ©Safe Online/Vincent Tremeau
