Generative AI (GAI)
Overview
Generative Artificial Intelligence (GAI) technology allows users to create new images, videos, audio and text based on requests or prompts. This technology has many benefits, but NCMEC is deeply concerned about the numerous ways it is being used to sexually exploit children. Over the past two years, NCMEC’s CyberTipline has received more than 7,000 reports of child sexual exploitation involving GAI, and that number is expected to grow as we continue to track these trends.
Key Risks
Protecting children from the harms of GAI sexual exploitation requires education and guidance from trusted adults. Understanding the risks is a critical first step toward being able to help.
GAI risks to children include:
- GAI Exploitative Imagery: GAI is being used to create child sexual abuse material (CSAM) depicting children engaged in sexually explicit conduct, as well as nude images of children such as those produced by “nudify” apps. The creation and distribution of this fake imagery, including synthetic media, digital forgeries and nude images of children, can have serious legal consequences and cause severe harm to victims, including harassment, bullying and psychological and emotional harm.
- Online Enticement: Individuals can use GAI tools to create fake social media accounts and communicate with a child with the intent to commit a sexual offense.
- Sextortion: Offenders can use GAI to create explicit images of a child and then use those images to blackmail the child for additional sexual content, coerce the child into sexual activity or obtain money. NCMEC has seen cases in which a child refuses to send a nude image to the offender, and the offender then creates an explicit GAI image of that child to blackmail them for more explicit images.
- GAI Bullying and Peer Victimization: GAI technology may be used to create or spread harmful content, such as fake images or videos. This content is often created by a child’s classmates and can circulate widely in schools.
What NCMEC Is Doing About It
Using GAI to create child sexual abuse imagery or nude or exploitative images of a child should always be taken seriously and reported. NCMEC’s CyberTipline provides the public and electronic service providers with the ability to report multiple forms of suspected child sexual exploitation, including CSAM and online enticement. After NCMEC processes a CyberTipline report, it is made available to the appropriate law enforcement agency. To make a CyberTipline report, please visit report.cybertip.org.
If a sexually explicit image of you or someone you know – whether real or GAI-created – is circulating online, NCMEC’s Take It Down service can help. This tool allows individuals to anonymously request the removal of explicit images from participating platforms.
NCMEC also has resources to help you learn how to report exploitative content to the internet service provider or platform where it is posted, which can help mitigate the spread of the image or video. Visit Is Your Explicit Content Out There?
NCMEC’s digital citizenship and safety program, NetSmartz, uses games, animated videos, classroom-based lesson plans, activities and much more to empower children to make safer choices online.
For families with a missing or sexually exploited child, NCMEC provides support services such as crisis intervention and referrals to appropriate local counseling professionals. Our Team HOPE program connects families with peers who have had similar experiences and can offer coping skills and compassion.