Safer helps publishers spot illegal user-generated content

Thorn wants to eliminate child sexual abuse material from the internet. The non-profit's new content moderation tool makes it easy to hash images and videos and identify the bad ones.


YouTube just got a $170 million fine for invading the privacy of its youngest viewers.

President Trump signed the FOSTA-SESTA bills into law a year ago. These new rules were meant to stop sex trafficking online by carving out an exception to Section 230 of the 1996 Communications Decency Act. 

This year Congress has been making more noise about narrowing the safe harbor created by Section 230, which has protected website publishers from being sued over user-generated content.

Website publishers can simply hope that the next law or fine over content misuse or data-gathering practices doesn't land on them. Thorn is betting that ISPs and content managers would rather take control of their own destiny and address problematic content proactively.

Thorn's platforms are designed to give law enforcement officials, government leaders and advocates 21st century tools to address child sexual abuse and human trafficking.

Founded by Ashton Kutcher and Demi Moore, this tech-centric non-profit's goal is to eliminate child sexual abuse material from the internet. Thorn uses that term instead of "child pornography" to describe what the content conveys and to link the content to the crime. 

Thorn has a product in beta designed for websites that host user-generated images and videos. Safer, a content moderation platform, is designed to identify child sexual abuse material. Julie Cordua, Thorn's CEO, said that Safer helps stop the viral spread of this content and can lead to the identification of children in the images and videos. 

"We offer a module so companies can hash their images and send hashes to our service to bounce against a list of hashes of confirmed child sexual abuse content," she said. "Then they can detect if they are hosting illegal child pornography and take action." 

The moderation platform allows site owners to identify the content in-house without transferring the images or videos themselves to Thorn. Thorn has also published a white paper that highlights policies and best practices for sites that want to address child sexual abuse material.
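The workflow Cordua describes can be illustrated with a simplified sketch. This is not Thorn's actual API: Safer's matching service is proprietary, and production systems use perceptual hashes (PhotoDNA-style) so that re-encoded or slightly altered copies still match. The function names and the cryptographic hash below are illustrative assumptions only, but the shape of the flow is the same: hash uploads locally, then check only the hashes against a list of confirmed hashes.

```python
import hashlib

def hash_bytes(data: bytes) -> str:
    """Return a hex digest for a file's raw bytes.

    Illustrative only: a cryptographic hash catches byte-identical files,
    while real moderation systems use perceptual hashing to survive
    re-encoding, resizing, and small edits.
    """
    return hashlib.sha256(data).hexdigest()

def check_against_known(data: bytes, known_hashes: set[str]) -> bool:
    """True if this content's hash appears in a list of confirmed hashes."""
    return hash_bytes(data) in known_hashes

# Hypothetical in-house flow: the site hashes uploads itself and compares
# hashes -- the media never leaves the publisher's infrastructure.
known = {hash_bytes(b"previously confirmed content")}

print(check_against_known(b"previously confirmed content", known))  # True
print(check_against_known(b"benign photo", known))                  # False
```

The key property for publishers is the one Cordua highlights: only fixed-length hashes cross the wire, so a match can be detected without the service ever receiving the underlying images or videos.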

The amount of this content available online is growing every year. The National Center for Missing and Exploited Children runs the CyberTipline, a tool for individuals and ISPs to use to report instances of suspected child sexual exploitation.

In 2018, the tipline received more than 18.4 million reports covering child sexual abuse images, sextortion, and child sex trafficking.

Getting this content identified and offline is key to Thorn's goal. The organization states that the majority of the content features children younger than 12 and includes extreme acts of sexual violence. This content can live online forever and reappear on new sites even if it is removed from one. 

Earlier this year, Thorn won a chunk of funding from The Audacious Project, a TED effort aiming to make philanthropy more collaborative and more effective. Thorn plans to use the funding to expand Safer as well as its other product: Spotlight.

Spotlight uses language and data analysis to identify patterns and suspicious transactions. The platform uses machine learning algorithms to help police officers prioritize leads, and it connects disparate data sources to help officers understand the history and geography of a person's trafficking situation.

Thorn reports that Spotlight is used by officers in all 50 states and Canada, and that the web-based tool has helped identify 31,197 victims of human trafficking and more than 10,000 traffickers in the last three years.

Detectives with the Bernalillo County Sheriff's Office in New Mexico have used Spotlight to track a victim's movement across the country using the dates and times of social media posts and changes in phone numbers.

Thorn built Spotlight in partnership with Digital Reasoning and with support from McCain Institute, Google Foundation, Hovde Foundation, and AWS.


In 2018, more than 45 million images and videos of suspected child sexual abuse were reported to the National Center for Missing & Exploited Children, double the amount reported in 2017. 

Source: National Center for Missing and Exploited Children