eSafety joins global project combating online child abuse with AI

The eSafety Commissioner has signed on to a pilot with Project Arachnid, an innovative tech platform based at the Canadian Centre for Child Protection, designed to reduce the availability of child sexual abuse material online.

eSafety’s commitment to Project Arachnid means its Cyber Report team will collaborate with investigators and analysts around the world to scale up the project’s capacity and impact in identifying and removing child sexual abuse material from the internet.

Project Arachnid autonomously detects child sexual abuse material far faster than human analysts can; however, human analysis is still required to classify images and confirm the quality of the data.

“Through this work, the Cyber Report team will make a significant impact in restricting the availability of child sexual abuse material to those who are seeking and distributing it,” says eSafety Commissioner Julie Inman Grant.

“We have an important role to play here in Australia, but ultimately, this is a global problem requiring a global solution, and we’re proud to be partnering with likeminded agencies around the world to fight this scourge.”

Analysts from eSafety’s Cyber Report team will help classify images detected by the Arachnid crawler, contributing to the international effort to build a comprehensive central database of child sexual abuse material ‘hashes’, or digital fingerprints.
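To illustrate what a ‘hash’ or digital fingerprint is in this context, the sketch below computes a cryptographic hash of a media file and adds it to a set of known signatures. This is a simplified, hypothetical example only: the file name and function are placeholders, and operational platforms typically also use perceptual hashing so that resized or altered copies of the same image still match.

```python
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Return a SHA-256 hex digest of a media file's bytes.

    A cryptographic hash acts as a simple 'digital fingerprint':
    identical files always produce the same digest, so a database of
    digests can identify known material without storing the images.
    """
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical workflow: once an analyst classifies an image, its
# fingerprint is added to the shared database of known hashes.
known_hashes: set[str] = set()
known_hashes.add(fingerprint(Path("confirmed_image.jpg")))
```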

Cyber Report will also gain access to the Arachnid Hash List of known child sexual abuse material, reducing investigators’ exposure to harmful content and improving the welfare of staff.

“An additional benefit of this pilot is that, through information sharing, our investigators will be exposed to less harmful content, reducing the impact of the incredibly important work they do,” says Toby Dagg, Manager, Cyber Report.

Background

Project Arachnid is a technological tool designed to reduce the availability of child sexual abuse images online and help break the cycle of abuse experienced by survivors.

Project Arachnid discovers child sexual abuse material (CSAM) by crawling URLs across the clear web known to have previously hosted CSAM. The platform determines that a particular URL contains CSAM by comparing the media displayed on the URL to a database of known signatures that have been assessed by analysts as CSAM. If CSAM is detected, a notice is sent to the hosting provider requesting its removal.
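The detection loop described above can be sketched as: fetch the media at a URL, compute its signature, compare it against the database of known signatures, and flag any match so a removal notice can be sent. The code below is an illustrative approximation under those assumptions; the URLs, function names and the notice step are placeholders, and the real platform’s crawling, perceptual matching and notification workflow are considerably more involved.

```python
import hashlib
import urllib.request


def sha256_of(data: bytes) -> str:
    """Compute a simple signature for a piece of media."""
    return hashlib.sha256(data).hexdigest()


def url_matches_known(url: str, known_hashes: set[str]) -> bool:
    """Fetch the media at a URL and report whether its signature
    matches the database of known signatures."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        media = resp.read()
    return sha256_of(media) in known_hashes


def crawl(urls: list[str], known_hashes: set[str]) -> list[str]:
    """Return the URLs whose media matches known signatures; in a real
    system each match would trigger a removal notice to the host."""
    matches = []
    for url in urls:
        try:
            if url_matches_known(url, known_hashes):
                matches.append(url)  # a removal notice would be queued here
        except OSError:
            continue  # unreachable host or fetch error: skip and move on
    return matches
```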

Estimates of the scale of child sexual abuse material online vary; the United Nations estimates that approximately 750,000 people are accessing such material at any given moment.

Every month, Project Arachnid detects more than 500,000 unique images of suspected child sexual abuse material requiring analyst assessment. To date, Project Arachnid has sent more than 1.6 million notices for removal of child sexual abuse material to online providers.

For more information or to request an interview, please contact: