
Deepfake Nudes and Other Trends in Youth Behavior Online in 2023: New Research From Thorn

August 14, 2024

6 Minute Read

 

Kids’ digital lives are flush with positive experiences, but also with very real risks, some of which are emerging so fast that society is racing to grasp them.

Yet it’s essential that we do. By understanding the risks children face online, we can develop systems to protect them against the harms introduced by rapidly advancing technologies.

Since 2019, Thorn’s methodical research initiatives have helped us understand and track youth behaviors, experiences and risks online. A pillar of those initiatives, our annual Youth Monitoring Report now features five years of consistent data collection, providing us with a powerful big-picture view of how children have been navigating the digital landscape over time.

Working directly with youth, our research team investigates young people’s attitudes and experiences with online grooming, sharing nudes, nonconsensual reshares, and whether they disclose their online sexual experiences. With this data, we can build an understanding of the State of the Issue to inform our efforts and those of our partners in the child safety ecosystem.

This year’s report, Youth Perspectives on Online Safety, 2023, reveals that some areas of concern persist at similar rates, while others, such as the growing threat of youth creating deepfake nudes of peers, are emerging as quickly as the technologies that enable them.

What this year’s data reveals about youth online safety:

As we look at 2023’s findings in relation to previous years, we see that:

  • Many of young people’s online attitudes and behaviors, such as how they navigate sexual interactions with adults and self-generated child sexual abuse material (SG-CSAM), remain consistent.
  • At the same time, the powerful capabilities and efficiencies of AI are leading to alarming new behaviors and risks.

 

Young people are using generative AI to create deepfake nudes of peers

Over the last five years, we’ve monitored the rate at which minors shared explicit imagery of their peers without consent. In 2023, 7% of minors reported they’d reshared someone else’s sexual images, yet nearly 20% reported they had seen nonconsensually reshared intimate images of others.

Importantly, the methods being used by youth to produce and share explicit imagery are evolving. With generative AI tools at their fingertips, young people are creating deepfake nudes of their peers. While the majority of minors do not believe their classmates are participating in this behavior, roughly 1 in 10 minors reported they knew of cases where their peers had done so.

The speed and efficiency of AI technologies enable the mass production of these fake nude images and provide a new weapon for bullying.

Young people are continuing to have risky encounters online

In 2023, more than half (59%) of minors reported they’ve had a potentially harmful online experience, and more than 1 in 3 minors reported they’ve had an online sexual interaction. Among preteens (9-12-year-olds), 1 in 5 reported having an online sexual interaction with someone they believed to be an adult.

Consistent with previous years, 1 in 4 minors agree it’s normal for people their age to share nudes with each other, and 1 in 7 minors admit to having shared their own explicit imagery (SG-CSAM).

Among minors who have shared their own SG-CSAM, 1 in 3 reported having done so with an adult.

Children view platforms as key in helping them avoid and defend against threats

Youth continue to rely on in-platform safety tools. Minors who had an online sexual interaction were nearly twice as likely to use online reporting tools as to seek offline support, such as from a caregiver or a friend.

Yet, 1 in 6 minors who experienced an online sexual interaction did not disclose their experience to anyone. This data point has held steady across years and underscores the need for more youth-friendly safety tools, as well as offline education that encourages judgment-free conversations between caregivers and youth.

Minors are navigating financial threats in some online sexual interactions

In 2023, a new question was added to the survey to assess the scale of sextortion among minors. Sextortion is the threat to leak explicit imagery of a victim unless they comply with demands. In a separate, extensive research effort, Thorn focused specifically on this rising trend, uncovering how financial sextortionists primarily target teenage boys.

Our youth surveys confirmed these occurrences: 1 in 17 minors reported having personally experienced sextortion.

Education remains essential to keeping kids safe

The insights from this year’s report illuminate the vulnerabilities of youth online and stress the urgency of education, awareness, and prevention programs that evolve to match the changing landscape.

As kids continue to explore sexual experiences through the technology at their fingertips, natural curiosity will likely continue to drive the sharing of nudes. It’s critical that we understand how new tools, like generative AI, influence young people’s actions. At Thorn, we’re dedicated to ongoing research that keeps our team and those empowered to protect children informed of these present and emerging risks.

How to talk to children about preventing deepfake nudes

Fake images of all kinds have proliferated with the rise of deepfake-generating AI tools, and society is scrambling to respond. Thorn’s new discussion guide on deepfake nudes helps parents understand the issue and offers guidance for talking to children about the dangers of resharing nudes, along with the importance of online and app safety.

Check out Navigating Deepfake Nudes: A Guide to Talking to Your Child About Digital Safety for resources and conversation starters.

Thorn is on a mission to defend children from sexual abuse, and our research plays a key role in that quest. Your support is vital to ensuring this work continues. Please join us in building a safer digital world for our kids.


