Facebook is on notice — keep our kids safe

Recent revelations in the United States regarding Facebook’s failure to protect young users of its platforms aroused worldwide concern.

But here in Australia, where we have been consistently ahead of the international curve on online safety, they were hardly surprising.

Facebook is a global giant, used by billions of people, with enormous influence and power. But the company still has far to go before it accepts and lives up to the responsibility that goes with its scale.

Every day there are people bullied on Facebook. Every day there are people choosing not to get vaccinated because they have been misled by misinformation on Facebook.

Every day there are people defrauded by financial scammers advertising on Facebook. Every day there are teenagers driven to despair by a flood of unrealistic images of bodily perfection on Instagram.

Take just one example of Facebook’s colossal failure of responsibility: the 2019 Christchurch, New Zealand, mosque attack, in which the murder of more than 50 people was livestreamed on Facebook.

For a traditional television station to have done this would have been an egregious breach of the rules set by broadcasting regulators in most countries. Yet Facebook carried this horrendous act with virtually no sanction.

In Australia, over the past eight years we have steadily raised the regulatory bar on social media platforms.

At every stage we have faced resistance from these companies, which publicly say they agree with the objectives, but then tell us the technical requirements are too onerous, coming too quickly, and too expensive to implement.

They have repeatedly warned that Australia would be reduced to a technological backwater if we tried to regulate big tech (it has not been).

Or they have told us that they are doing a perfectly good job of keeping users safe on their platforms without the need for regulation (they are not).

In 2015, the Liberals and Nationals government established the world’s first office of the eSafety Commissioner – an independent government agency whose purpose is to help safeguard Australians against online harm. This office was granted statutory powers to deal with cyber-bullying material directed at children.

We took this step because there was no practical remedy available to child victims of cyber-bullying and the platforms were not responsive to the needs of their users.

So we made it the law: if you are a social media platform available to Australians, and you receive a take-down notice from the eSafety Commissioner about cyber-bullying content, you must remove the material within 48 hours.

Since then, we have extended eSafety’s powers to cover image-based abuse (colloquially known as “revenge porn”) and, following Christchurch, to cover abhorrent violent material.

And just this year, parliament passed a new Online Safety Act that further strengthens the powers of the eSafety Commissioner.

Once the Act takes effect in January, eSafety will have a new power to order the take-down of serious cyber abuse directed against an adult: the worst and most damaging kinds of trolling.

The House Standing Inquiry we announced last week will be an opportunity to put the safety practices of big tech under the microscope.

Among other issues, the inquiry will examine what the industry is doing to keep children safe, with particular reference to how algorithms influence what children see – and whether they can end up in a spiral, viewing more and more harmful content.

Big tech must explain to Australian parents how it will keep children safe online, and how it will empower parents to make choices about what their children see and when.

And Australian parents must be given a voice to tell big tech they expect their children to be safe online.

This inquiry will help give them that voice.

This article appeared in The Australian on 6 December 2021