In the wake of the March 2019 Christchurch terrorist attack, which was live-streamed in an explicit attempt to foster support for white supremacist beliefs, it is clear that there is a problem with regulating and moderating abhorrent content on social media. Both governments and social media companies could do more.
This paper discusses what can be done to address this problem in a New Zealand context, touching on: what content contributes to terrorist attacks; the legal status of that content; the moderation or policing of the communities that give rise to it; the technical capacity of companies and police to identify and prevent the spread of that content; and where the responsibilities for all of this fall, whether with government, police, social media companies or individuals.
Key recommendations for the NZ Government:
Direct the New Zealand Law Commission to review the regulation of social media. The current legislative landscape is a patchwork of legislation, much of which predates social media.
Establish an independent regulatory body to oversee social media companies in New Zealand. The New Zealand Media Council and the Broadcasting Standards Authority provide a basis for how such an agency could be structured.
Impose a statutory duty of care on social media companies. Social media companies would need to invest in and take reasonable measures to prevent harm by, for example, improving their technology-based responses or making changes to their terms of service; otherwise they would face penalties from a regulatory body mandated to oversee and monitor online harms.
Carefully consider how New Zealand law currently protects against and prosecutes hate speech and hate crimes.
Meet with social media companies operating in New Zealand to agree on an interim plan of action, similar to the EU's Code of Conduct on Countering Illegal Hate Speech Online or the UK's Digital Charter.
Direct New Zealand’s intelligence services to develop a high-level strategy outlining their commitments to combatting white supremacist and far-right extremism and what steps they will take to prioritise this issue, and make this document public.
Continue to champion the issue of social media governance at the global level, for example through the 'Christchurch Call' Summit in May 2019, to ensure that a multi-jurisdictional approach to addressing the spread of terrorist and harmful content online is prioritised.