I was honoured to join a recent study tour led by the Minister of Internal Affairs, Jan Tinetti, as part of New Zealand's review of content regulation.
The Helen Clark Foundation has worked on issues related to online harm since we launched in 2019. One of our first major projects looked at the New Zealand landscape for social media regulation, and then we expanded on this for an international audience at the 2019 Paris Peace Forum with our Christchurch Principles Project.
The global picture has evolved significantly in the meantime. Both Britain and the European Union are poised to pass major legislation regulating social media platforms. As New Zealand works out its own approach, one thing is clear: the more aligned we are with the rest of the world, the better our chances of enforcing the new rules. Social media platforms are goliaths (many would argue they operate effectively as global monopolies), and New Zealand has a unique influence on this debate, through both the strong diplomacy led by PM Jacinda Ardern and the moral authority arising from the response to the Christchurch terror attacks. Everyone agrees that the platforms got too big, too fast, and accepted too little responsibility. The question now is: so what? How can we practically set regulations for companies that operate at this scale? An estimated 1.6 billion people, for example, use Facebook each day.
It was in this spirit that the study tour was launched: so that New Zealand regulators, government decision makers, and civil society could jointly see the front line of this issue at a critical time, while the legislation that will shape the global landscape is being written. We visited Ireland, the UK, France, and Finland on a fast-paced ten-day trip.
Go raibh míle maith agaibh / Ngā mihi nui ki a koe [a thousand thanks / many thanks] to our New Zealand colleagues for joining us today, it was a pleasure having you. Mā te wā [until next time]! @IrelaninNZ @BSA_NZ @TohatohaNZ @HelenClarkFound @netsafeNZ @NZOFLC @BradBurgessNZ @jantinetti @BSA_NZ pic.twitter.com/jEtxyNPtVl
— Coimisiún na Meán (@CNaM_ie) October 3, 2022
There are too many takeaways to cover in depth in this article, but a few things stood out to me.
CONTENT NOTE: this section discusses suicide and may be upsetting. If you need help, please contact:
Need to talk? (1737, free text or call)
The Depression Helpline (0800 111 757)
Lifeline (0800 543 354)
Samaritans (0800 726 666)
Youthline (0800 376 633)
Firstly, the Molly Russell case in the United Kingdom has led, for the first time, to a coroner finding that social media played a material role in the suicide of a young person. Platform algorithms recommend users more of whatever they seem interested in, and the platforms accept no liability for what they recommend. In Molly Russell's case, a young woman suffering from anxiety and depression was recommended endless content that encouraged despair and normalised suicide. Her family has released much of her social media feed to the media so that the public can better understand what life on the internet can be like for depressed young people. This story from the BBC contains many of the key facts. Please note that it can be a deeply upsetting read; even though this is an important issue to understand, you do not need to read it if you do not feel safe doing so.
This case has helped educate the UK public about what is meant by 'lawful but awful' content. Much of what Molly viewed was not strictly illegal, but it should have been removed under Instagram's terms of service. Yet those terms are unevenly enforced, and regulators have limited ability even to know how effective enforcement action is. There is good reason to believe that terms of service are more theoretical goals than actual consumer protections.
Similarly, what content is deemed removable is currently decided by the platforms themselves, rather than by more democratic and representative bodies. The British Online Safety Bill seeks to shift this defining power to Parliament. Many of the changes the bill brings will be rolled out globally, because it is difficult for social media companies to know exactly where their users are.
Secondly, the EU is also moving ahead with its Digital Services Act (DSA). Under the DSA, providers of intermediary services, including social media, online marketplaces, very large online platforms, and very large online search engines, will be forced into greater transparency and will be held accountable for their role in disseminating illegal and harmful content online.
Combined, these two pieces of legislation will reshape the playing field for content regulation globally. New Zealand has played a major part in driving these debates forward through the Christchurch Call and other diplomacy led by PM Ardern. The soft power that Jacinda Ardern has successfully generated for our small country will be one of my lasting impressions from this visit.