Wed, 11 Dec 2019 - 12:56

Speech to the National Press Club: Keeping Australians Safe Online

E&OE.....
 

Introduction

It is good to be here at the National Press Club to speak about how we keep Australians safe online.

This has been a priority for our Liberal National Government.

We set up the world’s first eSafety Commissioner, tasked with keeping all Australians safe online.

The Commissioner has strong tools to protect children against cyberbullying and to protect Australians – particularly women – against image-based abuse.

But we need to do more.

The internet has brought extraordinary economic and social benefits.  As Minister for Communications I promote the benefits of connectivity every day.

But sadly we also need to recognise that while most interactions online are positive, some will bring danger.

We need tools to recognise and guard against that danger – and as Minister for Cyber Safety, my task is to help build those tools. 

Keeping Australians safe of course is the first duty of government.

But by doing so we also support the continued growth of an internet we want to be a part of - an internet that embraces and enables the best part of humanity and not the worst.

Our Government’s approach then is a plan to keep Australians safe online – in tandem with all of our other work to leverage the online transformation of our economy.

Our Government – and our community – has clear expectations about internet safety.

Serious online abuse of an Australian is not acceptable – no matter that person’s age.

Harmful material must be taken down faster.

Attempts to send terrorist attacks viral must be stopped in their tracks.

Industry needs to step up and take more responsibility.

We need smart new approaches to getting harmful content taken down, even when fringe gore sites are determined to glorify it.

We are putting the pressure on and keeping the pressure on.

Today I am opening public consultation on proposals for a new Online Safety Act – intended to bring these principles into law.

This Act will put pressure on industry to prevent online harms and will introduce important new protections for Australians.

I will start today by talking about what the community expects when it comes to online safety.

Next I want to speak about the evolving role of government in helping keep people safe online - and how Australia has helped lead the way globally. 

And thirdly I want to explain our plans for a new Online Safety Act.

 

What the Community Expects

Let me turn first then to what the community expects.

When people interact in the physical town square, they take it for granted that the rule of law applies. If they are assaulted, or defrauded, or otherwise harmed, they can go to the police and seek assistance, or they can go to court and seek redress.

People expect the same thing when they interact in the digital town square.  After all, this is the environment in which each Australian spends on average 1,144 hours of their life every year.[1]

Unfortunately the risks continue to evolve, and some sectors of the internet industry have been slow to meet the community’s expectations when it comes to online safety.

I saw this in 2014 when I was working on the legislation to establish the eSafety Commissioner.

The peak body for companies including Google, Facebook and Microsoft was very resistant. In a submission to government, they said they had:

serious practical concerns with the proposed policy: a rapid take down scheme will at best take five days (much longer than industry’s own processes), the possibility that the policy will push children to undertake risky behaviour onto platforms with less highly developed self-regulatory standards and significant likelihood that the laws will be unable to keep pace with technological change.[2]

I am pleased to say we were not deterred – we implemented the policy and it has worked.  In fairness I will acknowledge that some of these same companies have since developed an excellent partnership with the eSafety Commissioner with material being taken down in some cases within 30 minutes.

But there continues to be a significant disconnect between the expectations of Australians and what is delivered by the internet industry today. A key manifestation of that disconnect is that many of today’s most popular digital products and services have not been designed with user safety in mind.

That needs to change. We need to get to a point where our online highways benefit from the same rigorous approach to safety we see in the global automotive market – where international standards, enforced by legislation made by sovereign nations, are met by global manufacturers as they supply their vehicles to global markets.

Our Government expects digital platforms and large tech firms to play their part. The eSafety Commissioner has pioneered a world-leading Safety by Design initiative, working with industry on a best practice approach to taking responsibility for the impacts of the products and services they are creating.

We want to go further. That is why today I am also releasing the Government’s Online Safety Charter – a document that sets out the Government’s expectations, on behalf of the Australian community, of social media services, content hosts and other technology companies.

The Charter endorses and expands on the Safety by Design principles. It is based on the premise that behaviour that is unacceptable offline should not be tolerated or enabled online, and that technology companies have a responsibility to mitigate and address any adverse impacts that are directly or indirectly associated with their products and services.

It outlines the Government’s expectations of service providers and the steps we expect them to take to prevent their platforms from facilitating online harm.

Of course, while we are very clear about the responsibilities of internet companies, it is also critical to equip Australians with the knowledge and tools to engage safely online.

The eSafety Commissioner has a strong focus on online safety education – providing training resources in 22 languages to teachers, parents and frontline workers.

That is why the Safety by Design principles and the Charter indicate platforms should provide tools that empower users to manage their own safety.

The Charter sends a clear message to industry – and they have the opportunity to step up and meet Australia’s expectations when it comes to preventing online harms.

My strong message to companies in the industry is to read it, refer back to it, and most importantly – integrate it into your daily practices.

I also encourage companies to continue to work with the eSafety Commissioner on the implementation of the Safety by Design initiative.

 

The proper role of government – and how Australia has helped lead the way

Having spoken about community expectations, let me turn to the role of government. 

During the very early years of the internet, when it was essentially a specialised resource for scientists and academics, many argued that it should be beyond the reach of governments.  In 1996 John Perry Barlow issued the ‘Declaration of the Independence of Cyberspace’, which began:

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.[3]

This non-interventionist approach may have been tenable in the 90s, when very few people were online – in Australia, less than one per cent of the population.[4] But it was certainly untenable once the internet became a mass market consumer phenomenon – and by 2001 in Australia, half of the country was using the internet.[5] 

The Australian Government was an early mover amongst governments. In 1999, the Broadcasting Services Act was amended with the addition of a scheme to regulate online content in Australia. At the time, this legislation was world leading.

But in these early actions, there was a certain lack of confidence. The online content scheme provided for differential treatment of content hosted in Australia and content hosted overseas, with no capacity for the regulator to take firm action against the latter. This was based on the view at the time that attempting to impose Australian law on websites hosted overseas was a futile exercise. 

Since that time the Australian Government has taken an increasingly assertive approach. In 2015 we established the world’s first Children’s eSafety Commissioner – a dedicated statutory position with a mandate to keep Australians safe online.

We established a legislated cyberbullying take-down scheme, which has assisted almost 1,600 children.

We have expanded the eSafety Commissioner’s role to encompass online safety for all Australians.

We have given the Commissioner powers to address image-based abuse, and the Commissioner has since received over 1,800 reports concerning over 2,500 URLs, and has successfully removed images in around 90 per cent of cases.

The Commissioner has handled increasingly large numbers of illegal and harmful content complaints.  It is on track to complete 12,000 investigations into child sexual abuse material this year alone, an increase of 50% on the previous year.

And Australia has been a strong voice globally for online safety.

In 2013, Australia chaired the discussions that led to the UN General Assembly agreeing for the first time that the law applies online as it does offline.

In 2017, Australia advocated for the inclusion of the online dimensions of tackling violent extremism in the G20 Leaders’ Statement on Countering Terrorism.

And this year, Prime Minister Morrison played a key role in securing a G20 Leaders’ statement calling on the tech industry to do more to prevent the misuse of their platforms by terrorists.

Over the last twenty years, then, governments in Australia and globally have become steadily more engaged in regulating the online environment.  This has meant facing up to several challenges.

The first challenge is that the internet has made it much cheaper, easier and quicker to make the kinds of statements that threaten public safety and public order. People have always been able to make statements which are personally abusive; which are designed to dupe or mislead those who read them; or which provide instructions about how to build a bomb or hijack an aircraft or kill yourself.

But today such a statement made online – at zero cost to the person making it – can be visible to potentially millions, tens of millions, even hundreds of millions or billions of people.  In other words, these harms have always been possible – but the internet greatly magnifies the reach of such harms.

The second challenge is that the internet enables whole classes of new conduct which can harm or endanger others.  Today almost all of us carry smartphones – internet connected devices which can record sound, take pictures, take videos – and communicate that instantaneously to large numbers of people.

The appalling Christchurch Mosque attacks in March this year, for example, were live streamed using Facebook, meaning the devastation and the violence was not limited to the victims and the witnesses, but spread to an enormous audience of viewers across the world. 

This simply could not have happened a few years ago – when bandwidth on mobile networks was not sufficient, and sophisticated live streaming applications did not exist. 

For decades we have had rules about the kind of content people can see on television.  There are carefully designed codes about what may be shown at particular times – and, for example, whether violence may be shown and at what level of intensity.  The idea that live footage of people being murdered could be made available to a viewing audience around the world would have been beyond the comprehension of citizens and governments when the laws which regulate free to air television were devised. 

Our Government acted quickly to address the emergence of this new harm by introducing legislation and working with industry through a dedicated Taskforce to reduce the likelihood of it happening again.

As I mentioned just before, another ‘new harm’ we face is the unauthorised sharing of intimate images – colloquially referred to as ‘revenge porn’, although I think ‘image-based abuse’ is a better term. 

Again this is something unimaginable even twenty years ago – but today one in ten Australians has had intimate images shared online without their consent. I recently met with Noelle Martin, WA’s Young Australian of the Year, a survivor turned advocate on the issue of image-based abuse.  

It was the work of people like Noelle which led to the Australian Government responding with the image-based abuse scheme that has since helped so many Australians.[6] 

Newer behaviours such as cyber-flashing and online incitement of suicide indicate that the harm landscape will only continue to evolve. 

There is a third challenge which the internet presents when it comes to keeping Australians safe. The harms I have described occur on the websites, social media platforms and other online services used by millions of Australians every day. 

In the main these services are based in other countries. So unlike traditional forms of harm, an online perpetrator does not need to be within Australia to impact Australians. This creates significant complexities in establishing and enforcing regulatory frameworks to protect Australians.

If this is a challenge when these services are delivered by legitimate, regulated businesses, it is an even bigger challenge if they are delivered by criminals. Nearly all of the illegal content reported to the eSafety Commissioner originates from platforms hosted overseas.  

In fact, the reason that Julie Inman Grant – who is doing an exceptional job as our eSafety Commissioner – is not here with us today is that she is representing Australia in Ethiopia at the Global Summit of WeProtect – a significant international forum for cooperation against online child exploitation – of which she is a board member.

This is just one example of how we are working internationally to make the internet a safer place.  At the same time, we have an extensive program of work across many different portfolios within the Morrison Government, to establish clear rules and expectations about online conduct. 

The ACCC’s Digital Platforms Inquiry, for example, looked at the significant competition and consumer issues raised by the dominance of players like Facebook and Google. Before the end of the year we will announce the Government’s response to all 23 recommendations.

The Attorney General is working with his state and territory colleagues to address the responsibilities and liability of digital platforms for defamatory content published online. 

I am working with the Minister for Families and Social Services in taking action to protect Australians against illegal offshore gambling websites, empowering ACMA to work with ISPs on a website blocking scheme - with civil penalties and disruption measures.

 

A new Online Safety Act

But the centrepiece of my work is the new Online Safety Act. Today, I am releasing a consultation paper – designed to gather input as we develop the new Act.

Our plans for the new Act draw in part on the practical day-to-day experience of the dangers Australians are facing today – and what they report to the eSafety Commissioner. 

It might be a female journalist who has written a story about gender equity in Australian sport – and is subject to extreme levels of harassment, abuse and vitriol from a particular individual across multiple online services. Despite contacting the services, the material remains online.

Today, the journalist has no avenues by which to have the seriously harmful material removed, other than contacting the police. Under the proposed Act, the eSafety Commissioner would have the power to have this content removed from the social media services, websites and apps.

It might be a fourteen year old boy being cyberbullied.  The bully creates a video using Twitch, where he says awful, humiliating things about the victim, and is sharing the link with all of his classmates. Today, the young boy would have to endure the bullying for up to 48 hours. Under the proposed Act, the eSafety Commissioner can cut this time in half.

It might be a young woman whose former partner has posted intimate images of her on a revenge pornography site hosted overseas.  Today, while the eSafety Commissioner can issue a notice to the overseas host to remove the content, if the website ignores the notice there is little more the Commissioner can do. Under the proposed new Act, the eSafety Commissioner could issue a notice to Google and Bing to request that the link to the offending page be de-ranked in search results.

Let me describe then the key features of the proposed new Act.

Basic Online Safety Expectations

The new Act will set out what we call the ‘Basic Online Safety Expectations.’ These will draw on the Safety by Design principles, the Online Safety Charter, and the feedback we receive from the consultation process I am kicking off today.

Examples of such expectations include providing tools and processes to empower users to manage their own safety, actively enforcing terms of use, and improving the transparency of online safety efforts.

The eSafety Commissioner will be able to request that internet companies report regularly on what they are doing to meet these expectations. If a company then fails to report, it will attract a penalty; and if it fails to meet our expectations you can expect the eSafety Commissioner to have something to say about that.

As we committed at this year’s election, the Government will also work with industry to develop additional protections for children. We will be asking industry to see that services marketed to children default to the most restrictive privacy and security settings. We will also work with industry on providing information about parental controls and online safety features at all points in the supply chain for products and services marketed to children.

Cyberbullying Directed Against Children

In this Act we propose to further improve the existing cyberbullying scheme.  While protections are currently limited to social media sites, the scheme will be extended to apply to all of the platforms, games and apps that our children are using online. 

Currently, industry cooperation with our scheme has been very good, with action taken to delete material promptly, following a request by the eSafety Commissioner.  There is scope to do even better – so we want to reduce to 24 hours the timeframe within which platforms must take down material following a request from the eSafety Commissioner.

Stronger Protections Against Image-based abuse

We also intend to strengthen our image-based abuse scheme.

As with the cyberbullying scheme, we plan to shorten, from 48 to 24 hours, the time period within which platforms must remove image-based abuse material following a request from the eSafety Commissioner.

Serious Cyber Abuse Directed Against Adults

Today we have cyberbullying protections for children.  But we are all aware – and I don’t need to convince any of the journalists here today – adults are also too often the targets of serious online abuse.

In June 2017, the eSafety Commissioner’s role was expanded to promote online safety for all Australians. Since then, over 1,500 adults have sought assistance from the eSafety Commissioner in response to serious cyber abuse.[7]

Last year, Amnesty International found that three in ten Australian women had experienced online abuse or harassment. More than half of these victims said the abuse or harassment came from complete strangers. Of concern, 37 per cent of women who had experienced online abuse or harassment said that on at least one occasion these experiences made them feel their physical safety was threatened.[8]

The eSafety Commissioner has no current legislative power to investigate adult cyber abuse. Instead, the eSafety Commissioner has only been able to provide practical advice and to rely on the goodwill of social media platforms to help remove material in the most serious cases.

To be clear – this will not mean intervening in everyday personal disputes.

However, there is a strong case for a take-down scheme targeted at the most seriously harmful online conduct, already criminalised in the Criminal Code.

The Government therefore proposes to introduce a new cyber abuse scheme for Australian adults, with its own take-down regime and appropriate civil penalties.

This will help to minimise the harm experienced by victims of online abuse who may not wish to go through criminal proceedings.

However, for those who do, the Government has also committed to strengthening criminal penalties for online abuse and harassment, and to addressing inconsistencies between approaches to criminal cyberbullying across Australia.

Restricting access to harmful online content

The Act will also deal with several weaknesses in our current online content scheme.

Today, the actions that the eSafety Commissioner can take to address prohibited content, such as child exploitation material, vary depending on where the material is hosted.

If prohibited content is hosted in Australia, the eSafety Commissioner can direct the hosting provider to take it down.  However, if the exact same prohibited content is hosted overseas, the eSafety Commissioner’s powers are more limited.

There are also administrative difficulties. The use of the classification system is a clunky and resource intensive way of determining whether content is prohibited or not. It is also ill-suited to assess live-streamed content, which can require a very rapid response.

So we propose to give the eSafety Commissioner new powers to focus on having the most harmful types of content removed, regardless of where it is hosted.

This includes child exploitation material, abhorrent violent material, material that incites terrorism or violence, and other extreme material as determined by a legislative instrument when necessary.

We also intend to seek a stronger role for the Australian internet industry in protecting Australians from access to harmful online content through a reinvigorated arrangement for industry codes.

We intend to maintain the current restrictions on prohibited online content, which currently includes X18+ content, as well as R18+ and some MA15+ material that is not subject to age restrictions.

However, through new industry code arrangements, there would be a stronger requirement for industry measures to protect users from exposure to this type of content and to expedite remedial action.

The Government remains concerned with the ease with which children can access pornography and other types of harmful online content. The codes will require industry to provide their customers with optional products that limit exposure to prohibited online content in their homes.

The eSafety Commissioner will have a central role in these arrangements.  The Commissioner would need to approve the codes before they came into effect, and would investigate code breaches.

If sectors of the industry cannot agree a code or codes, or if the codes prove to be ineffective, the eSafety Commissioner would have the power to develop industry standards.

To make sure the eSafety Commissioner can move quickly, the Commissioner will have the power to assess content without having to engage the Classification Board.

Ancillary Service Provider Scheme

An important aspect of the new Act will be to give the eSafety Commissioner new powers to work with a wider range of players to get harmful online content down quickly. 

Let me give an example. Last year, an investigation by TechCrunch found that several third-party apps that enabled users to find groups dedicated to sharing child exploitation material were available on Google Play. [9]

Going to the operators of these apps is one way for the eSafety Commissioner to get the material down.  But another way is to ask Google Play to remove the apps.  In last year’s case, Google Play removed at least six of these apps in the wake of the report. 

Google Play is an example of what we are calling in the new Act an ‘ancillary service provider’. Although it does not host the content itself, it provides, or facilitates access to, services that in some cases may host harmful online content.

While the eSafety Commissioner has an extremely high success rate in having intimate images taken down at the source, there are rogue websites that do not comply with these requests.

Thankfully Google has excluded ‘revenge porn’ from its internet searches, due to the harmful nature of this type of content.[10] However this approach is not yet universal – with concerning reports in January that Microsoft’s Bing search engine could be used to find, and suggest, child exploitation material.[11]

These developments illustrate the potential for third-parties, or ancillary service providers, to play a greater role in tackling seriously harmful online content hosted offshore.

Under the new Act, we propose that the eSafety Commissioner would be able to request assistance from search aggregator services and digital distribution platforms, to prevent access to seriously harmful material.

For example, imagine a website that, despite requests to remove material, systematically and repeatedly allows the posting of cyberbullying, cyber abuse, image-based abuse or illegal material. The eSafety Commissioner might then ask a search engine to de-list or de-rank that website.

Blocking arrangements

Finally, we propose to give the eSafety Commissioner the power to direct internet service providers to quickly block domains that contain terrorist or extreme violent material during an online crisis event such as the Christchurch attacks.

This was a recommendation of the Australian Taskforce to Combat Terrorist and Extreme Violent Material Online which was agreed to by industry and the Government. Such an arrangement would contribute to efforts to deny terrorists the ability and incentive to spread their propaganda and incite further violence and acts of hate.

To facilitate the blocks, the eSafety Commissioner would be able to issue voluntary notices to ISPs. This would allow the eSafety Commissioner to act quickly in responding to an online crisis event.

The voluntary notice scheme would be backed up with power for the eSafety Commissioner to require action by ISPs. Any such mandatory notices would be subject to appeals and transparency mechanisms to provide appropriate oversight of the exercise of this power by the eSafety Commissioner.

Of course, the use of the power would be strictly limited to dealing with online crisis events such as terrorist attacks, or extreme violent material.

 

Conclusion

I have been working in this industry long enough to know that some of what I have just outlined will make some industry representatives uncomfortable.

But what I have outlined is the next phase of the collaboration between Government and industry to maintain Australians’ confidence in the online world - confidence that the internet is a remarkable resource for good that they and their children can safely embrace into every part of their life.

To make sure we succeed in this effort, I strongly encourage those with an interest in online safety to contribute to our consultation process.

I am confident that through this process we can continue to make the internet a safer place for Australians.

 

--------------------

[1] http://www.roymorgan.com/findings/7665-time-spent-working-and-media-march-2018-201807200811

[2] https://www.communications.gov.au/sites/default/files/submissions/Australian_Interactive_Media_Industry_Association.pdf

[3] Quoted in http://techliberation.com/2009/08/12/cyber-libertarianism-the-case-for-real-internet-freedom/, downloaded 22/4/14

[4] https://ourworldindata.org/internet

[5] https://ourworldindata.org/internet

[6] https://www.rmit.edu.au/news/all-news/2017/may/not-just-_revenge-porn--image-based-abuse-hits-1-in-5-australian

[7] Oct 2019 eSafety Supplementary Budget Estimates Brief

[8] https://www.amnesty.org.au/australia-poll-reveals-alarming-impact-online-abuse-women/

[9] https://techcrunch.com/2018/12/27/funding-filth/

[10] https://www.theguardian.com/technology/2015/jun/20/google-excludes-revenge-porn-internet-searches

[11] https://techcrunch.com/2019/01/10/unsafe-search/