Thursday 18 February 2021

Written Submission on the FVPCA Amendment Bill (2021)

The Government has proposed a bill to amend the Films, Videos, and Publications Classification Act 1993. The amendments stem from the Christchurch mosque attacks in 2019, where the terrorist livestreamed his actions, subsequently prompting debate about our censorship laws and the need to update them for the digital era. Below is my submission to the Governance and Administration Select Committee. It reflects my views as at 18 February 2021; I may learn more afterwards that makes it worth amending, or contributing to another submission (such as one from Koi Tū as a research centre). This submission is based in part on my previous submission directly to the DIA when they were consulting on these proposed changes.


1. Thank you for the opportunity to provide a submission. I am currently a Research Fellow with Koi Tū: The Centre for Informed Futures at The University of Auckland. My research area is in digital transformation and its impacts on society, with a focus on government use of digital technologies and privacy. I also have a technical background in Computer Systems Engineering, with research in artificial intelligence, computer vision, and robotics. As a part of the civil society that is interested in digital technology issues, I have had many conversations with colleagues about the sorts of challenges that this Bill seeks to address, and am also informed by the international academic literature. The views in this submission are my own and may not reflect those of my employers.

2. While I am glad that this Bill has reached the Select Committee stage, I am disappointed that it has taken this long to reach this stage of the legislative agenda. The events of March 15, 2019 in Christchurch were evil and deplorable, and it is unacceptable that it has taken this long to take even minor regulatory action in this space. As technology continues to advance at a rapid pace, the gap created between new harms and legislative or regulatory mitigations is growing. There is a general weakness in the New Zealand approach to digital oversight, and an adaptive approach to fast-moving technology is needed. While this point is outside of the scope of this particular Bill, I urge members of the Select Committee to consider how Parliament can be better placed to address these types of issues and mitigate the harms that are appearing, or already present, in our society.

3. I attended a workshop in Wellington held by DIA in November 2019, where the proposals manifested in this Bill were consulted upon. Subsequently, I provided a written submission to their consultation process, and this submission is, in part, based on that previous submission. I also noted in that submission that the work done up to that point should have been a stepping stone to further consultation, and encouraged DIA to broaden the conversation and to take the opportunity to educate more people on the role of the Censor to help build social licence for the changes proposed. I believe that the Select Committee should likewise be encouraged to actively consult and hear perspectives from a variety of communities, not just those who have the capacity to engage with the traditional Select Committee submission process.

Summaries for Interim Classification Decisions

4. The proposal for an interim classification decision is a practical solution to allow the Chief Censor to act quickly and reduce harm. A timeframe of 20 working days should be sufficient for the Chief Censor and their office to compose a formal decision that can be used as legal precedent and stand up to legal tests in court. However, I note a concern that the Bill, as drafted, only requires that a written notice be provided that an interim decision has been made, with no justification or detail required at all. In the fast-moving media landscape, the absence of official messaging can create a void to be filled by misinformation and disinformation about the interim decision. If no justification is provided for up to 20 working days, there is ample time for those opposing the interim decision to create their own narrative, which may make subsequent enforcement more challenging.

5. I recommend that the Select Committee consider adding to the proposed section 22B a requirement that an interim classification assessment be accompanied by a summary giving a short description of the harms that the Chief Censor is trying to mitigate and how the content may lead to those harms. It should be made clear that this summary is not final, and does not create a legal precedent until the full written decision is released (as is implied in the proposed 22A(6)). Making this clear in the legislation would also help provide confidence that the power will not be misused, as even interim decisions would need to be justified in some way, even if the justification is only temporary and the interim decision is not subsequently converted to a full classification decision.

Take-down Orders: Complex Businesses, Derivative Material, and International Enforcement

6. The proposed Part 7A deals with take-down orders to online content hosts for content that has been deemed objectionable, with penalties for non-compliance. These proposals generally make sense, and I am pleased to see that the Bill as drafted uses a definition for “online content host” that is largely consistent with the existing definition provided in the Harmful Digital Communications Act 2015. Members of the Select Committee should be aware of the broader use of this term in other legislation before seeking to amend the definition to include or exclude particular entities. However, it should be noted that there are complex businesses in New Zealand that may play multiple roles in dealing with data. For example, it can be difficult to separate out the part of the business that is a “host” from the part of the business that is responsible for the “transmission” of data between computers. It can also be challenging to distinguish “hosts” who themselves are also “publishers”. I would generally be in favour of amending the definition in all relevant legislation to better reflect the reality of these businesses.

7. I would also like to raise a concern around derivative material. If a publication is declared to be objectionable, it may be easy for those with malicious intent to subtly change the content such that it could be argued to be materially different from the objectionable publication, and therefore legal. For example, a video with a replaced audio track could be argued to be sufficiently different from the original video that it would need to be considered a new piece of content. Another example could be a video of a real-life incident recreated as a cartoon animation, which is effectively a different medium. Still images from a video should also be included in the same classification decision. At a technical level, it is relatively trivial to modify content in a way that makes it difficult for filters or search engines to detect automatically. The Chief Censor should not have to issue decisions on each derivative piece of content, or they could quickly become overwhelmed. The legislation could be amended to cover derivative content that would contravene the spirit of a decision from the Censor, subject to appeals. However, this is an issue that should be very carefully considered, as there may be freedom of speech considerations as well, particularly in satire or media reporting. This may need to be considered in a longer-term review of the media and censorship landscape.
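To illustrate the technical point above, here is a minimal Python sketch (illustrative only, not a description of any particular filtering system): many simple automated filters compare exact cryptographic fingerprints of files, and flipping even a single bit in a file produces a completely different fingerprint, so a trivially altered copy of objectionable content would no longer match.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Exact-match filters typically compare cryptographic digests like this one.
    return hashlib.sha256(data).hexdigest()

original = bytes(range(256)) * 4   # stand-in for a media file's raw bytes
modified = bytearray(original)
modified[0] ^= 1                   # flip a single bit of the "file"

# The two files are 99.9% identical, but the digests share nothing in common,
# so a filter matching the original's digest will miss the modified copy.
print(fingerprint(original) == fingerprint(bytes(modified)))  # prints False
```

More robust approaches (such as perceptual hashing) tolerate small changes, but they too can be defeated by deliberate adversarial modification, which is why relying on per-item classification decisions does not scale against derivative content.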

8. There is also a concern about overseas or multinational online content hosts and the enforceability of the proposed legislation on these entities. 119B(c) specifies that Part 7A applies to “online content hosts both in New Zealand and overseas that provide services to the public.” As has been seen in the operation of the Harmful Digital Communications Act 2015, it can be very challenging to deal with cross-border infringement. Some online content hosts overseas are likely to ignore take-down requests (especially if they are knowingly hosting the objectionable material), and it may be practically impossible to enforce a penalty against them. I do not believe that this is an issue that can be resolved in this legislation alone, but Members of Parliament should be aware that even though the legislation expresses extraterritorial application, that may not eventuate in reality.

Incomplete Future Mechanism for Blocking or Filtering Objectionable Online Content

9. The Bill also contains steps towards the establishment of a web filter at the ISP level for objectionable online content. This is frequently compared to the existing voluntary filter for child exploitation material. However, the notion of a hard web filter is controversial and sets up a completely different debate to the other proposed amendments to the Act. Fears of scope or mission creep would be well justified in this case – the idea that (primarily violent) objectionable content could be added to a child exploitation filter is in itself scope creep. The Bill leaves significant questions unanswered about who will have control over the filter and how content will enter the filter, as well as technical concerns around blacklisting specific web pages when the objectionable content itself is likely to replicate across many web pages. It is not possible for experts and members of the public to give informed opinions about whether or not the proposed filter is a good idea, because the proposal lacks the necessary detail. This detail is necessary before we can properly contextualise and consider broader debates about the impact of a web filter on freedom of expression and mitigating harm.

10. It is therefore extremely concerning that the Bill, as drafted, allows the Secretary for Internal Affairs to consult on the design of the filter, and then allows them to give approval and operationalise the system without further legislation or review by Parliament. The filter could be introduced without the broad and active input necessary to achieve social licence for this controversial system. For this reason alone, this Bill needs to be significantly amended, or it should not pass through the House. It is not acceptable that the design and development of this system could happen without further oversight or direction from Members of Parliament and the associated public submission processes, or that the public would likely have to rely on the Official Information Act to get the information needed to understand and critically analyse the system. This process has further highlighted the problematic nature of our current consultation methods when applied to complex techno-social issues. If this Bill passes in its current form and the filtering system is established, it is likely that the proposed 149(ah) would immediately be called upon by civil society to review and challenge that system, yet there is no guarantee that the necessary regulations would be in place by the time the system is operational. That these review and appeal processes are delegated to regulations that have not been made available is injurious to the democratic process.

11. In my previous submission to the DIA, I actually suggested that no legislative changes be made associated with the web filter proposal precisely because it was significantly different in impact and scale to the other proposed changes. A web filter is a significant policy proposal in the digital age that requires much more consideration, development, and consultation with a clear timeline that gives people at least a few years to react and respond rationally, to allow the system to be designed in a way that best suits the needs of all New Zealanders. There is no quick fix available to make the proposed legislation more palatable. This is not a minor missing detail; it is the entire framework for the system that has been left undefined. I recommend that the sections relating to the proposed “electronic system” be removed by the Select Committee, so that the other, more reasonable, amendments can still be supported and implemented. It would be unfortunate if positive policy proposals like the interim classification decision were to fail to be introduced simply because they were grouped together with a web filter proposal.

12. Thank you again for the opportunity to make a submission to this Bill. I would be happy to make an oral submission, and understand that all submissions will be available publicly.

Warm Regards,
Dr. Andrew Chen
Research Fellow, Koi Tū: The Centre for Informed Futures
The University of Auckland
