For as long as playing ball is optional, horrific violence will remain on social media

Alice Dawkins
21.04.2024

As the horrific violence of last week spilled online and worked its way through too many people’s social media feeds, attention has turned towards the need to refine our online safety legislation.

The nation’s eSafety Commissioner is still negotiating with X and Meta to pull graphic images from Bondi Junction and Wakeley from the internet. Meta is said to be co-operating, but X owner Elon Musk called the commissioner a “censorship commissar” after she issued a take-down order on violent content relating to the Bondi Junction and Wakeley attacks. In a statement, X labelled the order “unlawful and dangerous”, saying the content does not breach its user guidelines.

X owner Elon Musk. Credit: Reuters

Beyond the violent footage of the two Sydney stabbings, X is awash with pornography, Facebook is filled with AI “trash”, and TikTok has repeatedly been found to “algorithmically supercharge” anxiety. It feels like social media is deteriorating at speed. But this content is a symptom of a broader problem.

Social media is getting worse in large part because companies are stripping back precautionary investments and shifting resources away from user safety and protections. Without serious, legally enforceable incentives, the trend toward safety minimalism will only continue, and disturbing footage like that seen this week will keep circulating.

The eSafety Commissioner’s focus on content take-downs is like catnip for free-speech champions such as Musk. But any critique he or like-minded users may have of these orders would be better directed at the Online Safety Act and the government’s proposed misinformation bill, which was shelved last year but now looks set to be revived.

Both instruments are positive steps forward, but neither is enough to confront the issue at its root. The act and the bill share elements of an increasingly outdated approach to digital platform regulation, in which well-meaning policymakers have carried principles across from traditional broadcasting to digital media distribution, principles that cannot scale, that burden the wrong players, and that may inadvertently stoke institutional mistrust.

As it currently stands, tech accountability amounts to regulators tailing global multinationals and issuing letters or threats of hefty fines once the harm has already happened. But it can be so much more than this.

Social media companies have deep knowledge of how their platforms work and access to real-time, granular data on operating conditions. And yet, despite this information asymmetry and capability gap between the tech giants and government, the industry still enjoys self-regulation under an industry-crafted voluntary code, the Code of Practice on Disinformation and Misinformation.

© The Sydney Morning Herald