Recent research from the UK supports what Thorn’s own research also reveals: more young people are facing sextortion online.
In 2025, reports of blackmail attempts involving minors in the UK increased by 34%.
These situations can get worse quickly, making young people feel scared, alone, or unsure of where to get help. With so many apps and ways to connect, it’s easier than ever for someone to take advantage of trust.
But caregivers can make a real difference.
Open, judgment-free conversations matter. When young people feel safe coming to you without fear of blame or punishment, they’re much more likely to ask for help if something feels wrong.
Begin these talks early. Stay interested. Let them know they can always come to you.
Also, remind them of something important:
What happens to them is never their fault, and they don’t have to face it alone.
For years, social platforms have largely been treated as neutral spaces. This gave technology companies leeway to scale and grow. But it also meant that environments were created without considering the harm they could enable or recognizing the full scale of preventive safety needed.
Recent rulings and regulatory actions signal that platforms are increasingly being held accountable for the systems they design.
At the center of this shift is a critical question:
Could this harm have been prevented?
This is why Thorn has focused on building a digital safety net with technology and partnerships that aim to detect abuse, identify victims faster, and prevent exploitation before it escalates.
The shift we’re seeing now reinforces something we’ve believed from the start:
Child safety can’t be an afterthought. It must be built into the foundation of technology itself.
#ChildSafety #SafetyByDesign #TechForGood
Sextortion is an increasingly common harm threatening young people, and it doesn’t always look the same.
Our research found that 1 in 5 teens reported experiencing sextortion. The demands vary widely, but one pattern shows up again and again: situations can escalate quickly, moving from contact to threats in a matter of hours.
Victims often feel scared, overwhelmed, or embarrassed. Those feelings can make it harder to ask for help, even when support is available. That’s why conversation matters. Talk with the young people in your life about sextortion, what it can look like, and what to do if it ever happens. Let them know they won’t be in trouble and that they don’t have to handle it alone.
Silence is not a solution.
Support, trust, and early intervention can make all the difference.
20% of minors who have an online sexual encounter don’t turn to anyone for support. Yet 84% of those same minors do use the online tools available to them.
The reasons are many, but it often comes down to kids worrying about how an adult will react. Informed adults can help children feel supported and secure when something doesn’t feel right online.
This process starts with small conversations. Stay aware of how the young people in your life use technology, and keep an eye out for potential red flags. Be open and non-judgmental. And most of all, let them know that being taken advantage of online is never their fault.
If you’d like help talking with your kids, please visit the Thorn website for dedicated guides and tips for parents and caregivers.
The Thorn Cause Community is made up of people from all walks of life who share one thing in common:
They believe technology can be used for good to protect children online.
Find out why Adam Leibsohn and Jen Kao support Thorn, and explore all the different ways we engage and accelerate the child safety ecosystem. From technical innovation to child victim identification, there’s a purpose our supporters are passionate about.
Learn more about Jen and Adam’s donor journey at the link in our bio.
Identifying abuse early can make all the difference for a child.
One of the biggest challenges in child safety is recognizing harmful behavior before it escalates — and reducing the amount of time a young person remains at risk.
That’s why we continue to build technology that helps surface warning signs sooner.
Last year, we introduced grooming detection within Safer, our product that helps tech platforms detect child sexual abuse. This feature helps identify language patterns associated with grooming in online conversations, surfacing potential exploitation earlier.
Earlier detection means earlier intervention and a greater chance to prevent harm before it worsens.
This is what it looks like when technology is built for good: practical tools that strengthen the digital safety net and help transform how children are protected in the digital age.
April is National Child Abuse Prevention Month — a time to better understand how abuse happens, so we can stop it earlier.
Grooming is a process. An abuser builds trust, gradually shifts boundaries, and creates a sense of secrecy or dependence to manipulate a young person.
This isn’t new. These tactics have existed in offline abuse for a long time. What has changed is access.
Today, young people can be reached through games, chat apps, and social platforms: any online space where relationships can feel real and consistent, even without ever meeting in person. That can make it harder to recognize when trust is being misused.
Understanding how grooming works is one of the most powerful ways to help protect kids. When adults recognize the signs and create space for open conversation, it becomes easier for young people to speak up when something doesn’t feel right.
Awareness is one of the first steps in prevention.
👇Explore our resource guide to learn how grooming happens, and what you can do to help prevent it.
April is National Child Abuse Prevention Month, a reminder that protecting children often starts with small, everyday conversations.
It’s easy to assume, “That wouldn’t happen at my child’s school.” But for many young people, nude image sharing — and resharing — is already part of their reality.
Helping kids understand the impact of sharing nude images isn’t just about rules. It’s about helping them recognize how quickly harm can spread, and how it can affect someone’s safety, relationships, and sense of control.
These conversations don’t have to be perfect or happen all at once. Small, thoughtful questions can open the door. Look for natural moments to start: a news story, a show, a conversation about friends. What matters most is that your child knows they can talk to you without fear or judgment.
Prevention doesn’t always look like a big moment. Often, it starts with a simple question. Explore our full discussion guide for more ways to navigate this topic with confidence. ⬇️
In our latest Youth Perspectives research, we highlight an interesting insight learned from talking with young people:
🔸 11% of minors reported sharing nude images or videos of themselves
🔸 13% of minors considered doing so
One of the primary reasons the second group didn’t share was fear that the image would be leaked or reshared without their consent. This is a great example of why open, judgment-free conversations about nudes, technology use, and privacy are so important.
During National Child Abuse Prevention Month, this is an important reminder: when young people understand the risks, they’re more likely to pause, question, and protect themselves.
The numbers are high, but there is some positive momentum. Since we started tracking Youth Perspectives data in 2019, the number of minors who think it’s normal to share self-generated nudes has been on a strong downward trend. Seven years ago, nearly 30% of respondents thought it was normal. In our latest report, only 19% of minors felt the same way.
That’s a positive sign that education and awareness could be making a difference. But the work, obviously, isn’t done. As parents and caretakers, it’s important to keep lines of communication open and have honest discussions about privacy, consent, and the sharing of intimate images.
The message is loud and clear: deepfake nudes are here, and they are harmful. Young people are already negotiating the emotional, psychological, and social harms inflicted by these abusive images. In a world where there are already so many avenues for online harm, it is frustrating to see a new technology quickly adopted for abuse and exploitation.
As always, parents and caretakers can have a meaningful impact on preventing and responding to online threats. We’re here to help. There’s a free, expert-backed guide meant to help adults get a handle on this new reality young people face. In it, you’ll find advice on how to:
✅ Start meaningful, age-appropriate conversations about consent and online safety
✅ Recognize the risks and realities of deepfake nudes
✅ Equip your child to navigate digital spaces safely and use technology appropriately
✅ Take actionable steps to prevent the spread of harmful content
Visit our blog to find more information about deepfakes and how to inform and protect your children.
Tomorrow, the legal basis that allowed platforms to detect child sexual abuse material (CSAM) in Europe expires, creating a child safety gap that reaches far beyond the EU. Children worldwide will pay the price for policymakers’ failure to reach an agreement in time.
This gap wasn’t inevitable, and it could have been prevented. To find out what’s happening and why we’re here, read our full explainer, link in bio.