Child Sexual Abuse Material

Overview

United States federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor (a person less than 18 years old). Outside of the legal system, NCMEC chooses to refer to these images as Child Sexual Abuse Material (CSAM) to most accurately reflect what is depicted – the sexual abuse and exploitation of children. Not only do these images and videos document victims’ exploitation and abuse, but when these files are shared across the internet, child victims suffer re-victimization each time the image of their sexual abuse is viewed. In a recent survey led by the Canadian Centre for Child Protection, 67% of CSAM survivors said the distribution of their images impacts them differently than the hands-on abuse they suffered because the distribution never ends and the images are permanent.

It’s important to remember that CSAM is much more than just images and video files. While CSAM is viewed and transmitted on computers and other technology, these images and videos depict actual crimes committed against children. The human element, children at risk, must always be considered when discussing this offense, which is rooted in a high-tech world.

The disturbing reality is that the internet platforms we use every day to connect with each other and share information, including social media, online gaming, and e-mail, are now being used to disseminate and collect CSAM. CSAM can be found in virtually any online realm.

Who Are the Victims?

While research regarding victims of child sexual abuse material is limited, it is a growing field of study aimed at better understanding both the child victims and the offenders.

In March 2018, two studies on this topic were released. The first, Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims, is based on data collected by NCMEC’s Child Victim Identification Program through 2014. The second, Towards a Global Indicator on Unidentified Victims in Child Sexual Exploitation Material,5 is based on data in INTERPOL’s global system.

Below are key findings from these two studies:

  • Girls appear in the overwhelming majority of CSAM.1 
  • Prepubescent children are at the greatest risk to be depicted in CSAM.2
  • When boys are victimized, they are much more likely than girls to be subjected to very explicit or egregious abuse.
  • On average boys depicted in CSAM are younger than girls and more likely to have not yet reached puberty.3
  • 78% of reports regarding online enticement4 involved girls and 15% involved boys (in 8% of reports, the gender of the child could not be determined).

1 - Seto, M. C., Buckman, C., Dwyer, R. G., & Quayle, E. (2018, March 28). Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims (Rep.). Retrieved April 1, 2018, from http://www.missingkids.org/content/dam/pdfs/ncmec-analysis/Production%20and%20Active%20Trading%20of%20CSAM_FullReport_FINAL.pdf

2 – Ibid.

3 – Ibid.

4 - Online enticement is a broad category of online exploitation that includes sextortion. It involves enticing a child to take sexually explicit images, to meet in person for sexual purposes, to engage in a sexual conversation online or, in some instances, to sell or trade the child’s sexual images.

5 - ECPAT International and INTERPOL. (2018). Towards a Global Indicator on Unidentified Victims in Child Sexual Exploitation Material. Retrieved from http://www.ecpat.org/wp-content/uploads/2018/03/TOWARDS-A-GLOBAL-INDICATOR-ON-UNIDENTIFIED-VICTIMS-IN-CHILD-SEXUAL-EXPLOITATION-MATERIAL-Summary-Report.pdf

By the Numbers

The CyberTipline has received more than 195 million reports related to CSAM since its inception in 1998.

NCMEC’s Child Victim Identification Program (CVIP) has reviewed more than 425 million images and videos.

More than 30,000 victims have been identified by law enforcement and their cases submitted to NCMEC.

What NCMEC Is Doing About It

Operating the CyberTipline

In 1998, with the help of a private donation and in response to an increase in reports relating to the online sexual exploitation of children, NCMEC created the CyberTipline. The CyberTipline provides an online mechanism for members of the public and electronic service providers (ESPs) to report incidents of suspected child sexual exploitation, including:

  • online enticement of children for sexual acts
  • extra-familial child sexual molestation
  • child pornography
  • child sex tourism
  • child sex trafficking
  • unsolicited obscene materials sent to children
  • misleading domain names 
  • misleading words or digital images on the internet

For definitions and more information on these reporting categories and/or to make a CyberTipline Report, visit report.cybertip.org.

Proud Partner of INHOPE

As part of our work to prevent the further victimization of children and to discover trends that can help prevent these crimes, NCMEC staff may review content reported to the CyberTipline before the reports are made available to law enforcement for independent review.

Electronic Service Provider (ESP) Reporting

U.S. federal law requires U.S.-based ESPs to report to NCMEC’s CyberTipline instances of apparent child pornography that they become aware of on their systems. NCMEC works closely with ESPs on voluntary initiatives that many companies choose to engage in to deter and prevent the proliferation of online child sexual exploitation images. To date, over 1,400 companies are registered to make reports to NCMEC’s CyberTipline; in addition to making reports, these companies receive notices from NCMEC about suspected CSAM on their servers.

Are you an ESP who would like to register with NCMEC? Visit NCMEC’s website to learn how.

Assisting in Victim Identification Efforts

Special thanks to Videntifier.

The Child Victim Identification Program (CVIP) began in 2002 after NCMEC analysts repeatedly saw images of the same child victims in their reviews and began tracking which victims had been previously identified by law enforcement. So far, more than 19,100 children have been identified.

Today, CVIP operates with a dual mission: to help provide information concerning previously identified child victims, and to help locate unidentified child victims featured in sexually abusive images so that they may be identified and rescued.

Additionally, the Child Victim Identification Program provides training and educational assistance to law enforcement and attorneys on how child victims of sexual exploitation can be identified.

Empowering Survivors

More and more, survivors of CSAM speak to the long-lasting damage and impact of having their images and videos circulated on the internet. The lack of control over both the files’ existence and their circulation leaves survivors struggling in their recovery.

Using new technology and working with like-minded partners, NCMEC collaborates with the ESP industry and with children and their families to identify these images and have them tagged for removal from ESP servers. This empowers law enforcement and child advocates to tell survivors that something CAN be done to limit these files online and remove them when they are flagged.

NCMEC also provides information for survivors who want to take quick action if they are made aware of their images or videos online. Learn how to contact the internet service providers to report files circulating online.

Supporting Victims & Families

NCMEC provides assistance and support to families impacted by child sexual exploitation. We offer crisis intervention to families as well as local referrals to appropriate professionals for longer-term support. Families of exploited children often feel alone in their struggle and overwhelmed by the issues impacting their lives. NCMEC’s Team HOPE is a volunteer program that connects families to others who have experienced the crisis of a sexually exploited child. These trained volunteers offer peer support, coping skills, and compassion.  

In addition, NCMEC is committed to addressing the long-term needs of survivors of CSAM by providing resources and avenues for the continuum of care after the abuse has stopped. NCMEC is creating a network of mental health therapists who specialize in CSAM cases, educating legal professionals on how to seek restitution and represent survivors in court, and raising awareness among law enforcement and other child advocates of the unique and sensitive nature of this crime. Our hope is that this holistic approach will provide the continuing and ever-changing support survivors need in the years following the abuse.

Preventing Abuse Through Education

NCMEC uses the expertise it gains from operating the CyberTipline and CVIP to create and provide prevention and educational programs for parents and guardians, as well as technical assistance and educational programs on child sexual exploitation for the public, law enforcement, and other child-serving professionals. Data from actual CyberTipline reports enables NCMEC to craft outreach messaging that accounts for trends in the sexual exploitation of children and to provide prevention and educational resources that help address these issues. NCMEC’s central education programs include NetSmartz and KidSmartz.

Success Stories

In December 2017, the CyberTipline® received a report from a registered electronic service provider regarding the transmission of apparent child pornography via its social networking service. As part of its report, the ESP provided incident information, including an email address, a screen name, images of the apparent child pornography, and IP addresses associated with the reported files.

An NCMEC analyst assigned to the report viewed multiple images, which appeared to be unfamiliar, and a chat log suggesting the reported user was enticing the child victim to sexually molest her toddler-aged relative. After querying the CyberTipline with the information submitted by the ESP, the analyst found that the reported user appeared to be associated with multiple other reports and had possibly been aggressively enticing multiple children to produce child pornography. The email addresses provided by the ESP linked to a social media profile that included a location and a possible gang affiliation. Based on the uploaded content and chat log, NCMEC staff prioritized the report.

The same day, the reporting ESP sent an escalated report alleging the reported user had been using several accounts to coerce multiple minors to produce child pornography images. On various occasions, the user appeared to have enticed child victims into sexually molesting younger relatives. The analyst added value to this report, connected additional reports to the reported individual, and reclassified the escalated report to the highest priority. The report was made available to an Internet Crimes Against Children (ICAC) Task Force in the Mid-Atlantic states, which alerted a local police department. The ICAC Task Force continued to update NCMEC about the case and request additional information as needed. Because the reported user was a citizen of another country, Homeland Security Investigations assisted in his arrest. Due to the diligent work of law enforcement, 15 victims, including a 2-year-old girl, were rescued.


In January 2018, the CyberTipline received a report from a member of the public alleging the reported individual was addicted to child pornography. It was also alleged the reported user had previously been employed at a school and might continue to have access to children. The reporting person provided a name, address, telephone number, and username for the reported user.

An analyst used the information provided in the report to find a potentially related social media profile associated with the telephone number and username provided by the reporting person. The profile contained the same name listed in the report and suggested the reported user had access to multiple children. Based on that information, the analyst escalated this report and made it available to law enforcement in a southern state. Following a joint investigation by law enforcement agencies in that state, the user was arrested.

The reported user is accused of possessing “material containing a representation of a minor, unknown female, approximate age between 8 and 10 . . . engaged in sexual activity.” The reported user faces charges including one count of Indecent Liberties with a Minor and three counts of 2nd Degree Exploitation of a Minor. The victim appears to be a child from the daycare where the user worked.