
How Thorn Helps Investigators Find Children Faster

July 16, 2024

6 Minute Read

It’s a common scenario in the fight to identify and defend children from sexual abuse:

A popular social media platform discovers child sexual abuse material (CSAM) circulating on its site. Its team reports these files to the National Center for Missing and Exploited Children (NCMEC), which then alerts law enforcement. Officers review the data and, if there’s sufficient evidence for a warrant, they initiate a search — seizing laptops, phones, hard drives, and other devices from the suspected perpetrator.

Now, the officers face a daunting task: They must sift through all that digital evidence — sometimes millions of files — to find clues that could help identify the child victims.

These forensic reviews can take weeks, even months. Meanwhile, children may be enduring active abuse. The faster officers find these clues, the faster they can remove those children from harm.

That’s where Thorn’s CSAM Classifier plays a critical role in speeding up these investigations. Using state-of-the-art machine learning, the classifier automatically identifies which files are likely to be CSAM and categorizes them for officers. By processing files far faster than any human could manually — often within mere hours — the classifier accelerates officers’ ability to solve these cases.

Crucial to these investigations is the ability to identify new CSAM — material that exists but hasn’t yet been reported to NCMEC and categorized as CSAM. This new material often represents children currently being abused and is therefore key to removing them from harm. Our classifier empowers officers to find new CSAM far faster.

Agencies around the world use Thorn’s CSAM Classifier integrated within their forensic processing software. The time it saves matters when children’s lives are on the line.

Speeding victim identification

Tips to law enforcement regarding child sexual abuse might come from the public, NCMEC, or even another agency that’s conducted an investigation and reported that a child victim is located in a particular jurisdiction. 

Today, when reviewing the files on seized devices, officers face vastly larger troves of data, since the storage capacity of the average computer has grown dramatically over the years.

To put that scale into perspective, think of all the photos and videos on your phone. Now add everything in your cloud storage, on your desktop, and so on. Have video games installed? Those contain hundreds of image files too. These gigabytes or even terabytes can add up to tens of millions of files.

Each one must be processed because perpetrators attempt to hide CSAM. For example, they might rename image files with a .txt extension to make them look like harmless text files.
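This is one reason file extensions can’t be trusted during review. As a rough sketch of the general technique (not Thorn’s or any specific forensic vendor’s software), tools typically inspect a file’s leading bytes, its so-called magic number, rather than its name:

```python
# Illustrative only: detect images disguised with a false extension by checking
# the file's leading "magic bytes" instead of trusting its name.
# A generic sketch of the technique, not Thorn's or any vendor's implementation.

IMAGE_SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",          # JPEG files start with FF D8 FF
    b"\x89PNG\r\n\x1a\n": "png",      # PNG files start with this 8-byte signature
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}


def sniff_image_type(path: str) -> str | None:
    """Return the detected image type based on file contents, ignoring the extension."""
    with open(path, "rb") as f:
        header = f.read(16)
    for signature, kind in IMAGE_SIGNATURES.items():
        if header.startswith(signature):
            return kind
    return None


# A file named "notes.txt" that is really a JPEG is still detected:
# sniff_image_type("notes.txt") -> "jpeg"
```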

By using the CSAM Classifier — which reviews 15 to 60 images per second depending on the hardware and deployment — officers can process all those files at impressive speed and scale, changing the game on what used to be a painstakingly manual process. 

Finding CSAM is key to victim identification. Each file potentially holds a missing piece of the puzzle of locating a child: a school logo, a regional concert poster, or other clues about a child’s identity or whereabouts. Just as importantly, CSAM is often located in file folders that also contain other identifying information and clues. The classifier helps officers find these folders — which may hold helpful content about 10, 20, or even 100 victims.

From there, officers can take the next steps to bring the perpetrator to justice and remove the child from harm. 

Taking perpetrators off the streets

Officers often have a limited window in which they can hold a suspect. Fortunately, in many U.S. jurisdictions, they may only need to find 10 or so CSAM files to file charges. Finding those files in the suspect’s possession quickly can mean the difference between maintaining custody of a potential perpetrator and sending someone home to possibly harm again. Thorn’s CSAM Classifier gives officers that speed and efficiency.

Additionally, when it comes to sentencing, the volume of CSAM that a suspect possesses matters. By quickly identifying the full scale of CSAM in a suspect’s possession, officers can put a dangerous abuser behind bars for a substantial amount of time — reducing the time that person is out in the world potentially harming kids.

Improving officers’ wellbeing

The CSAM Classifier’s automated process also has positive downstream effects on officers’ wellbeing. Imagine you’re swiping through photos on another person’s phone. Suddenly, you see a horrible image. The shocking experience sticks with you for some time. Now imagine going through that repeatedly over days or weeks. This kind of exposure is an occupational hazard for many types of first responders and is known as vicarious trauma.

For officers involved in child sexual abuse cases, this repeated exposure is their reality. But the CSAM Classifier helps relieve it by reducing the burden of manual review. The classifier flags which files are likely to be CSAM and categorizes them by degree of confidence. Officers can then choose to review those files when they’re ready.

That degree of control over their own exposure means a lot to investigators who are dealing with this material day in and day out.
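To make that categorization concrete, here is a minimal, hypothetical sketch of how detections might be grouped into review queues by confidence score. The thresholds, names, and data structures are illustrative assumptions, not Thorn’s implementation.

```python
# Hypothetical sketch: grouping classifier output into review queues by confidence.
# The score thresholds and queue names below are illustrative, not Thorn's.
from collections import defaultdict


def bucket_by_confidence(detections: list[tuple[str, float]]) -> dict[str, list[str]]:
    """Group (file_path, score) pairs, where score is the classifier's
    estimated likelihood (0.0 to 1.0) that a file is CSAM."""
    queues: dict[str, list[str]] = defaultdict(list)
    for path, score in detections:
        if score >= 0.9:
            queues["high_confidence"].append(path)   # review first
        elif score >= 0.5:
            queues["needs_review"].append(path)      # likely, but verify
        else:
            queues["low_priority"].append(path)      # probably benign
    return queues


# Officers can open each queue when they are ready, rather than encountering
# harmful material unpredictably in the middle of a manual sweep.
```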

Additionally, as the classifier works through the night, officers can go home to their families, recharge and reground themselves, staving off mental and emotional burnout.

Advantages of Thorn’s CSAM Classifier

Identifies new and previously unreported CSAM

If you’re trying to find CSAM on seized devices, a technology called perceptual hashing and matching is a powerful way to identify known CSAM — material that’s already been reported to NCMEC and verified as CSAM. Many of these files continue to circulate for years, even decades.
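As a rough sketch of how that general technique works, the open-source imagehash library can compute a compact perceptual fingerprint for an image and compare it against a list of known fingerprints; resized or re-compressed copies of a known file still land within a small Hamming distance. The hash list and threshold below are hypothetical, and this is not Thorn’s implementation.

```python
# Illustrative sketch of perceptual hashing and matching in general, using the
# open-source imagehash library (pip install imagehash pillow).
# Not Thorn's implementation; the known-hash list here is hypothetical.

import imagehash
from PIL import Image

# Hypothetical perceptual hashes of already-known, verified material.
KNOWN_HASHES = [imagehash.hex_to_hash("d1c4f0e2b3a59687")]


def matches_known(path: str, max_distance: int = 5) -> bool:
    """Return True if the image's perceptual hash is within `max_distance` bits
    (Hamming distance) of any known hash. Unlike exact-byte hashing, this
    tolerates resizing, re-compression, and other small alterations."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in KNOWN_HASHES)
```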

But new CSAM is being produced all the time — and often represents the active abuse of a child. Finding these files requires a powerful tool like a classifier. Equipped with Thorn’s CSAM Classifier, officers can focus on finding this new material and more swiftly identify children experiencing ongoing, hands-on abuse.

Trained directly on CSAM for accuracy

At Thorn, we train our classifier’s model on real CSAM images and videos, in part using trusted data from the NCMEC CyberTipline. This high-quality training data greatly increases its accuracy. The visual nature of child sexual abuse differs from adult pornography, so classifiers that try to combine face and age estimation with adult-pornography detection don’t offer the same level of detection. In fact, around 50% of CSAM doesn’t contain a face at all.

Used across industry and law enforcement, providing constant improvement

Thorn’s CSAM Classifier has been deployed in our solutions for content-hosting platforms as well as for victim identification since 2020. We work with trusted customers and partners in these groups who provide feedback on incorrect detections, along with material that allows our team to iterate on the model and expand its training data. This ensures higher-quality detections for all users, speeding up the process of correctly identifying CSAM and the children in it.

Law enforcement officers on the front lines of defending children from sexual abuse serve a noble and taxing role within our communities — and are often in a race against time. The faster they can detect CSAM and find clues that help identify a child victim, the faster they can remove that child from harm and put a perpetrator behind bars. We’re proud to build technology, like our CSAM Classifier, that speeds up these life-saving efforts, creating a new chapter and brighter future for the children involved.


