
From research to real-world impact: How University of Kent helped shape Google’s deterrence messaging

An important example of research translating into action is how insights from a University of Kent study, funded by the Tech Coalition Safe Online Research Fund, informed Google's development of a Child Sexual Abuse Material (CSAM) warning message. Drawing on the project's psychological findings, the University of Kent contributed to expert consultations and discussions that helped shape the final message, now displayed to users searching for CSAM on Google. This collaboration exemplifies how targeted research can drive impactful, real-world interventions.

When Caoilte Ó Ciardha, a researcher at the University of Kent, set out to study the effectiveness of online deterrence messages, his goal was clear: to generate insights that could shape real-world solutions. Through the Tech Coalition Safe Online Research Fund, which elevated the visibility of his research, his work caught Google’s attention and grew into a collaboration that shaped a tool used by millions worldwide. What began as a research study quickly evolved into a practical partnership – showing how research, when shared clearly and accessibly, can spark meaningful change inside tech companies.

The partnership

Through industry-researcher convenings organized by the Research Fund, Google had noticed Caoilte's study, which analysed the Google OneBox feature, a search result tool that displays information directly on the results page. For Google, seeing their own tool studied in a rigorous academic context was eye-opening. For Caoilte, it was an unexpected opportunity to put research into practice. Google then invited him to join an expert panel on online deterrence messaging.

The panel brought together around 15 experts, including researchers, helpline organizations, and Google teams, to discuss short- and long-term crime deterrence and prevention, and how to intervene at critical moments of potential harm. From there, the collaboration deepened. Google's internal teams drafted new deterrence messages and asked panel members, including Caoilte, for feedback. He helped shape the messaging further, offering suggestions on language, tone and evaluation strategies.

That real-world collaboration – researchers informing tech, and tech adapting based on insights – proved powerful. The outcome was a redesigned OneBox deterrence message, launched in 21 countries. Using a difference-in-differences analysis, Google found modest but positive reductions in harmful search behavior. While the final version wasn’t a direct copy of Caoilte’s study, it was undoubtedly shaped by it.
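For readers unfamiliar with the method, difference-in-differences estimates an intervention's effect by comparing the change over time in a group that received the intervention against the change in a comparable group that did not. The sketch below illustrates the core arithmetic only; the group labels and all numbers are hypothetical and are not drawn from Google's actual evaluation.

```python
# Illustrative difference-in-differences calculation.
# All numbers below are hypothetical, for explanation only.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Treatment effect = change in the treated group
    minus change in the control group over the same period."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical rates of harmful follow-on searches (per 1,000 sessions),
# before and after the message launch, in launch vs. non-launch countries.
effect = diff_in_diff(
    treated_before=10.0, treated_after=8.5,   # launch countries: down 1.5
    control_before=10.0, control_after=9.5,   # comparison countries: down 0.5
)
print(effect)  # -1.0: the extra reduction attributable to the intervention
```

Subtracting the control group's change nets out background trends that would have occurred anyway, which is why the design is well suited to a staged rollout across countries.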

“Early in 2025, we transformed our CSAM deterrence messaging in Google Search to incorporate a public health perspective and to encourage help-seeking by people at risk of perpetrating child sexual exploitation and abuse. The research and expertise of Dr. Ó Ciardha, in collaboration with other global experts, drove many of the evidence-informed updates in our prevention intervention, the language we use, and the helpline services we include. Since making these changes, we have observed greater engagement with the therapeutic help services we feature and a reduction in the rate at which users issue follow-on CSAM-seeking queries in Search. We applaud the Tech Coalition Safe Online Research Fund for supporting Dr. Ó Ciardha’s work — it’s already having demonstrable impact in the way we combat CSAM online.”

Griffin Hunt, Senior Product Manager, Google Search

One of Caoilte’s key contributions was showing, in practical terms, how small changes in language could make a difference. In his study, he created mock-ups of alternative messages and tested them against Google’s version. It was a clear, practical way to present findings back to the company, and proved useful in influencing messaging design. Key insights from this work show that messages warning about legal consequences are perceived to discourage people from continuing harmful searches, while messages that focus on support and recovery are perceived to make people more likely to seek help.

Listen to Caoilte from the University of Kent introducing the project:

The learnings

Reflecting on the process, Caoilte emphasized that effective collaboration requires more than good data; it requires humility and curiosity. "Even if you're the expert in research, you need to acknowledge the other party knows their systems, constraints, and context in ways an external researcher cannot," he shared. The key, he said, is to frame recommendations not as prescriptions, but as explorations: "I wanted to see if any of my expertise could inform what you're doing."

He also stressed the importance of building trust, which is "easier to do in a conversation than static emails." That is why Caoilte highlighted the value of convenings: they build the trust and shared understanding that emails can't. The Research Fund's convening opportunities, whether in-person or virtual, are intentionally structured to spark open, two-way dialogue, bringing people together in a spirit of curiosity and collaboration to explore how best to work collectively.

The process is not without its challenges, said Caoilte, sharing the risks and nuances that other research grantees should be aware of when collaborating with tech companies.

Not every investment pays off. Sometimes projects don’t move forward if a company isn’t ready.

Timelines rarely align perfectly. Academic projects move at one pace, tech product cycles at another—collaboration often succeeds only when the right person is in the right place at the right time.

Researchers can play a supporting role. Sometimes the most valuable contribution is giving internal champions the evidence they need to move ideas forward inside their company.

For Caoilte, being part of this collaboration reinforced why Tech Coalition and Safe Online invest in creating spaces for researchers and tech companies to meet. That investment doesn’t stop at the convenings themselves: The Research Fund also works to facilitate connections afterward and support grantees with communication tools and accessible research outputs. His experience shows what’s possible when rigorous research meets openness from industry: products can change, harmful behaviors can be reduced, and new ways of protecting children online can emerge.

For tech companies, it’s an invitation: there’s tremendous value in engaging with independent researchers who bring fresh perspectives and evidence to some of the most complex challenges in online safety. 
