Standards submission publication: statement

eSafety thanks all those who participated in public consultations on its draft Industry Standards for Designated Internet Services (DIS) and Relevant Electronic Services (RES).

eSafety is currently considering feedback received during this consultation process, including submissions we have published today on our website.

The Standards will require some online services to implement additional measures aimed at reducing the spread of child sexual abuse material and pro-terror content on their platforms.

They will not require industry to break or weaken end-to-end encryption, monitor the text of private communications, or indiscriminately scan large amounts of personal data.

A number of stakeholders indicated their preference for further clarity on these and related issues. eSafety is closely considering this feedback, along with any potential amendments to the standards that could provide greater clarity and certainty to industry participants and end-users.

However, it is worth reiterating that eSafety moved to draft these standards because the codes industry submitted for registration were found not to provide appropriate community safeguards. 

We recognise these standards will apply to broad industry categories covering a range of services, and that they will require differing approaches to detecting and removing illegal content such as child sexual abuse material. 

To that end, the draft standards proposed a technical feasibility exemption for certain obligations. Where exemptions are applied, the draft standards would require providers to take appropriate alternative action. 

No specific technologies or methods were prescribed; the choice of approach would be left to the companies themselves.

Fundamentally, eSafety does not believe industry should be exempt from responsibility for tackling illegal content being hosted and shared freely on their platforms.  

eSafety notes some large end-to-end encrypted messaging services are already taking steps to detect this harmful content. These include scanning the non-encrypted parts of their services – including profile and group chat names and pictures that might indicate accounts are providing or sharing child sexual abuse material – as well as considering behavioural signals to help identify perpetrators.

These and other instances of industry best practice – such as making readily accessible in-app options for users to report directly to the platform – are practical examples of measures companies with end-to-end encrypted services can take to reduce the risk of their services being misused to perpetrate child abuse and other abhorrent crimes.

Clear and intuitive in-app reporting of illegal content directly to the platform – especially where child sexual abuse material detection is not taking place – is particularly important in ensuring platforms are informed and can remove content and prevent its viral spread.

Final versions of the standards will be tabled for consideration by Parliament later this year.

The submissions received during the public consultation process can be found here.

For more information or to request an interview, please contact: