Tuesday, 22 June 2021

Oral Submission on the FVPCA Amendment Bill (2021)

Below is the prepared part of my oral submission to the Governance and Administration Select Committee on a bill to amend the Films, Videos, and Publications Classification Act 1993. The amendments stem from the Christchurch mosque attacks in 2019, where the terrorist livestreamed his actions, leading to debate about our censorship laws and the need to update them for the digital era. Cabinet subsequently decided to remove the internet filter aspects of the bill after near-unanimous opposition.

Tēnā koutou katoa, ko Andrew Chen tōku ingoa, he Paewai Rangahau ahau i Koi Tū i Te Whare Wānanga o Tāmaki Makaurau, nō reira, tēnā koutou katoa. Hello, my name is Dr Andrew Chen, and I am a Research Fellow with Koi Tū: The Centre for Informed Futures at The University of Auckland. My research area is in digital technology and ethics, particularly with regard to surveillance and privacy, as well as public sector use of algorithms and other automated technologies. Thank you for the opportunity to submit on this Bill.

This Bill is being introduced in the context of the Christchurch mosque attack livestream, which was a reprehensible and harmful piece of content, and I agree that we need to do something to mitigate the harms of such content in the future. However, I think any legislation we introduce needs to be suitable for other, broader contexts as well. I will briefly provide my view on each of the main aspects of the proposed legislation, and then I am happy to answer any questions from the Committee. 

Firstly, on making livestreaming of objectionable content a criminal offence (124AB). In my opinion, this is already effectively covered in the existing legislation, but I don’t see much harm in including this as proposed. I do endorse the recommendation from Brainbox in their written submission, that it be emphasised that the offence is restricted to parties that know or have reasonable cause to believe that the livestream is objectionable, rather than bystanders who may not know what is going to happen and won’t have that intent. 

Secondly, on interim classification assessments (22A-22D). I think this is generally okay, but it still requires some justification to provide context for the decision and to avoid creating a void that can be filled by mis- and disinformation within the 20 working days it takes for a full decision.

Thirdly, the take-down notices, civil pecuniary penalties, and removing safe harbour (Part 7A, 119C-119K) generally make sense, and I am comfortable with the proposals from a harm perspective. However, I do note that as written, the legislation will have difficulty covering derivative material, where enough change has been made to the material that it can be argued to be a new piece of content, even though it may still carry similar harm to the original material. I believe that this is a challenge that the classification office already faces, and increasingly so with modern digital technologies that make it easy for people to edit content. I also note challenges around enforceability with overseas content hosts.

Lastly, the future mechanism for blocking or filtering objectionable online content, otherwise known as the internet filter (Part 7A, 119L-119O). I think everyone agrees that reducing accessibility to extremist online content is a good thing. But I don’t think we can have a debate about whether this internet filter is good or bad, because we don’t know how the filter would work. The mechanism is insufficiently described in this Bill to give us confidence that it will be effective or appropriate. The explanatory notes in the legislation do provide some hints around the intent, but the legislation itself, as written, does not provide me with confidence that the filter would be effective at achieving the stated goals.

Furthermore, social licence issues have not been sufficiently mitigated. Power is being taken away from Parliament and the people, and given to the Department of Internal Affairs to design, develop, approve, implement, and operationalise the internet filter. The legislation proposes review and challenge processes, but doesn't actually define them, leaving them delegated to regulations, so we have no confidence right now that a poorly designed filter could be stopped by the public either. DIA should be designing and consulting on a proposed filter, and then, if the decision is to implement it, that filter should be fully described in legislation and should go through a full legislative process.

In my previous submission to DIA when they consulted on these changes in late 2019, I actually suggested that they take out the internet filter part of the Bill because it was significantly different in impact and scale to the other proposed changes. I still stand by that, and I think the Committee will find that many of the submissions are about the internet filter rather than the other parts of the legislation. It would be unfortunate if the other positive policy proposals in this Bill failed to be implemented simply because they were introduced in the same piece of legislation as the internet filter. I agree with the overall intent and sentiment of this legislation, but an internet filter requires far more consideration, development, and consultation to allow the system to be designed in a way that best suits the needs of all New Zealanders. Ngā mihi nui mō te akoako, thank you for your time.