Today I appeared with my Head of Investigations, Mr Toby Dagg, at the Parliamentary Joint Committee’s hearing on Intelligence and Security: Inquiry into extremist movements and radicalism in Australia.
The Committee heard about my role as Australia’s eSafety Commissioner in helping safeguard Australians at risk from online harms and promoting safer, more positive online experiences. This blog focuses on some of the remarks I made to the Committee.
When it comes to the online harms I see every day, words and symbols matter.
They not only reflect societal ills, but can also surface real-world intent and ultimately lead to real-world harms.
Hate speech and the aftermath of Christchurch
Our research shows that around 1 in 7 Australian adults have been the victim of online hate speech, and for younger adults it’s closer to 1 in 5.
Hate speech is designed to dehumanise and disenfranchise and causes a fraying at the edges of our social cohesion, driving already marginalised groups into the open arms of extremist groups who offer acceptance and purpose.
And these groups are very good at harnessing the power of social media to spread their beliefs to new audiences, and at broadcasting and weaponising real-world harms to promote their ideologies and recruit new followers.
We saw this play out to devastating effect in the 2019 Christchurch attacks, in which a lone gunman took 51 innocent lives.
As noted in the Christchurch Royal Commission report, the first link in the devastating chain of events that led to this terrible act involved the perpetrator being radicalised online.
Drawn to both mainstream video sharing platforms and darker corners of the internet, the perpetrator’s deformed world view was validated through increasingly extreme and hateful content.
By the time his hatred spilled out into the real world, social media again provided the means for him to live-broadcast mass murder and spread his twisted manifesto.
And it was following Christchurch that eSafety was granted a new function and powers to protect Australians from exposure to terrorist and violent extremist material.
This included the power to issue notices to websites and hosting services providing access to abhorrent violent material (AVM), and the power to direct ISPs to impose a short-term block on websites providing access to this type of material during an online crisis event.
To date, my office has issued 23 notices concerning content depicting beheadings, shootings and other murders, with the content taken down or restricted for Australian users in 93% of these cases.
But while these powers are an important tool to protect Australians from exposure to this harmful online content, our AVM scheme is limited in scope, is perpetrator and accomplice focused, and does not grant us investigative powers.
We work with the tools we have. This tool kit is also bolstered by a holistic approach to the full spectrum of online harms: protection through our reporting schemes, research and evidence-based prevention and education programs, and the promotion of proactive and systemic change from within the technology industry.
We are also observing some tectonic shifts since Christchurch that we need to heed and keep pace with.
As we saw with the 2019 Halle attack that was livestreamed over gaming platform Twitch, smaller platforms are at risk of being exploited by those seeking to disseminate hateful propaganda.
This is a logical displacement of the threat: more mainstream companies have shored up their capacity to detect bad actors attempting to weaponise their platforms, and increasingly share crucial intelligence through groups such as the Global Internet Forum to Counter Terrorism.
Safety by Design
Safety by Design is therefore a critical mindset for start-up and mid-tier companies to adopt. It encourages services to assess and anticipate online risks, embedding safety protections into the full product design life cycle.
In the coming month, eSafety will be releasing our Safety by Design interactive assessment tools for start-ups, mid-tier and enterprise companies so that they can better identify their safety shortcomings and address gaps through targeted advice about industry best practice.
It may not come as a surprise that many of the sites trafficking in pro-terror content and abhorrent violent material share hosting and other infrastructure services with sites hosting child sexual abuse material.
The entities enabling the distribution of illegal online content – and other appalling material such as bestiality and sickening violence – often operate in full view.
Many of these entities operate murky, labyrinthine concerns that appear engineered to frustrate regulatory action. Others, however, are enterprise-grade companies seeking to capitalise on the commercial advantage that comes from protecting the worst of the worst.
This distributed technology ecosystem survives through the use of various tools. These include reverse-proxy services to shield a site’s hosting location and techniques for obscuring DNS records.
Other features of this ecosystem include the use of decentralised networks and encrypted messaging services such as Signal and Telegram.
Such an ecosystem – with no concept of intermediary liability or central point of accountability for compliance – raises serious questions about how to preserve responsibility and care for user safety.
eSafety is thinking deeply about what this means for the traditional notice and take down model and the future safety of the online world more generally. We will release a tech trends and challenges paper on this issue in the coming months.
Prevention and education: pillars for change
In addition to the work we do through our regulatory schemes and investigations, prevention through evidence-based education and resources remains one of eSafety’s core priorities. Our youth and social cohesion research revealed that 53% of teens had encountered hate speech online, with those identifying as Muslim receiving the greatest share.
The research also revealed 1 in 3 young people had seen videos or images promoting terrorism online.
And it’s this age group that we need to reach to prevent them from feeling further disenfranchised and marginalised in society, which makes them more vulnerable to influence by extremist groups.
To address these issues, we developed the Young & eSafe program and the interactive game The Lost Summer which is used in secondary schools across Australia.
eSafety has programs to help protect Australians from cradle to grave, but for young people, parents remain the front line of defence in the battle to protect their children from online extremism.
Schools also have a big role to play here, by reinforcing the four Rs of the digital age - respect, resilience, responsibility and (critical) reasoning in their curriculum and programs.
We’re giving parents and educators the tools to start having these conversations early because we know when children reach their teens it can be much harder to reach them and ratchet things back.
But it’s not only parents, educators and society generally that need to help.
Technology companies have a role to play too. I earlier touched on the world-leading Safety by Design initiative, which aims to ensure user safety and rights are at the centre of the design, development and release of online products and services.
These principles and assessment tools were developed with and for the industry to help change the cultural ethos around product development so that safety is less an afterthought and more an integrated and mindful imperative.
Proposed Online Safety Bill
And while we do not currently have powers relating to adult cyber abuse material, the proposed new Online Safety Bill could allow eSafety to consider hateful and dehumanising commentary and symbols where they are used to cause an individual Australian adult serious harm online.
We know that the vast majority of cyber abuse is “intersectional” with those from diverse cultural, religious and linguistic communities and other at-risk cohorts being disproportionately targeted with serious cyber abuse.
So, we have made public our statement on how eSafety Protects At-Risk Voices Online, and have built a diversity and inclusion team that is working with diverse community groups to develop tailored resources that meet their unique audience needs.
The draft Bill also establishes Basic Online Safety Expectations (the Expectations) and mandatory reporting requirements that will allow the eSafety Commissioner to require online services to provide specific information about their compliance.
This could include requiring services to explain how they are working to minimise terrorist and abhorrent violent material on their service.
eSafety has a unique position in the online safety architecture that the Australian Government has developed.
This allows us to play an important role in the complex area of counter terrorism and countering violent extremism.
eSafety will continue to work closely with our government, industry and non-government partners to protect all Australians online.