Legal Beagle by Graeme Edgeler

A submission on the Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill

Below, for those interested, I copy my submission on the Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill.

This is the government bill aiming to create a mandatory Internet filter. The bill is largely unnecessary, but in parts not as bad as people fear, although this is only because the current law already has many of the problems people are ascribing to the bill, for example in respect of the new offence of livestreaming objectionable content.

The Governance and Administration Committee

Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill

Submission of Graeme Edgeler

Introduction

My name is Graeme Edgeler. I am a Wellington barrister with a strong interest in censorship and issues of freedom of speech.

I thank the committee for the opportunity to make a written submission on the Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill, and look forward to appearing in person to supplement it.

I consider parts of the bill unnecessary, and the bill somewhat of a missed opportunity: there are a number of issues with New Zealand’s censorship laws that are in need of major reform, which this bill does not seek to address.

I am aware of other criticisms of the bill, some of which are misplaced. Many of these criticisms are really criticisms of the current law. I have not addressed these matters (for example, the livestreaming of non-objectionable material that becomes objectionable) because the Bill does not change the law around this. The Committee should nevertheless consider making recommendations around fixing these concerns.

I address the following issues in my submission:

  • Livestreaming objectionable content.
  • The disapplication of the Harmful Digital Communications Act.
  • The creation of a mandatory Internet filter.
  • A few relatively minor comments on statutory language.

New section 124AB offence to livestream objectionable content

New section 124AB would create an offence to livestream objectionable content.

This new offence is unnecessary. It is already an offence to livestream objectionable content. Anyone livestreaming an objectionable video commits the offence of making it, as well as supplying it, and distributing it.

Section 123(4) of the Films, Videos, and Publications Classification Act was amended in 2005 to make clear that electronic transmission was covered. If there is any serious doubt that the words used are sufficiently clear (and I do not believe there is), then:

(i) Ministry advisers will presumably be able to point to the court case in which a person was acquitted on this technicality, despite Parliament’s 2005 amendment; and

(ii) The appropriate legislative response is keeping the new definition of livestreaming and adding the word “livestream” as an alternative in section 123.

I note also that the livestreaming offence appears intended to cover only publishers, and not those watching a livestream (either way, this should be made clear). People watching a livestream can currently be charged; the creation of this offence in this way may suggest to the Courts that Parliament intends that those watching, for example, livestreamed images of child sexual abuse ought not to be charged.

Disapplication of the Harmful Digital Communications Act safe harbour provisions

When Parliament adopted the Harmful Digital Communications Act (“HDCA”) it included a deliberately wide immunity provision, providing not just a defence, but a broad immunity that prevented online content hosts from even being proceeded against.

This was an appropriate response: online content hosts should not be expected to screen every picture or video uploaded, or email sent: you should prosecute the person who wrote the death threat, not the mailman who delivered it.

Given the desirability of this approach, it is not clear why this legislation needs to be excluded from the protections in the HDCA, which will continue to apply, for example, to privacy claims, defamation, and prosecutions for things like death threats and the uploading of intimate visual recordings.

While the removal of objectionable content is more serious than unwanted intimate visual recordings, the differences between them are not such that one would expect different approaches to be taken with blameless content hosts of either sort of illegal content.

Given that responsibility for enforcing the laws against objectionable material is shared (between DIA, the Police, and Customs), it is difficult to see why different immunities should apply depending on who brings the charge: the defence in new section 119G when Police lay a charge under section 124 of the Films, Videos, and Publications Classification Act, or the immunity in sections 23-25 of the HDCA when Customs charges fundamentally identical behaviour under section 390 of the Customs Act.

In addition, the statutory immunity is insufficiently broad. The HDCA immunity applies whenever an online content host receives notice of a complaint about content they host, which does not need to be in a particular form. The section 119G defence only applies when there is a take-down notice. If no take-down notice is issued, but instead a prosecution of the online content host is commenced, the defence would not apply. Whether a separate immunity/defence is needed or not, the defence that online content hosts can rely on should arise from the fact they are not expected to censor all the content they host, not because some action has or has not been taken by an enforcement agency. This concern also arises with respect to the section 22D defence in respect of interim classification assessments.

If there is a problem with overbreadth of the immunity provision in the HDCA, then it should be fixed for all content covered by that immunity, not just objectionable or restricted content.

And if there are concerns that the liability shield in sections 23-25 of the HDCA would prevent enforcement of new takedown orders, then an amendment to section 25(5) of the HDCA is the better way to meet that aim.

The Mandatory Internet Filter

The explanatory note of the bill records that “The Bill facilitates the establishment of a government-backed (either mandatory or voluntary) web filter if one is desired in the future. It provides the Government with explicit statutory authority to explore and implement such mechanisms through regulations, following consultation.”

This is backwards. Parliament is being asked to approve a government-backed web filter, despite one “not being desired” at this time. It should refuse.

Parliament should not abandon its role in deciding so important a public policy question to the Secretary for Internal Affairs. If Parliament wishes there to be a mandatory Internet filter, it should recommend to the Government that it bring forth a proposal for one, with much more detail than is contained here included in the primary legislation. It should then decide whether the safeguards are sufficient, and the costs worth it, and either approve it or reject it.

If the Government has yet to make up its mind over whether it wants a filter, or what type of filter it wants if it does, then it should go away and do the work before seeking any necessary legislative change. If there are concerns that the Government could not do the preparatory work on a possible filter without a law change, the law change could be limited to permitting that.

Ultimately, however, the final decision over whether to have a mandatory Internet filter should be made by Parliament. The balancing exercise is one it should weigh.

If you are determined to press ahead with a proposal to allow the Secretary at some future time to impose a mandatory Internet filter, at the very least, the legislative instrument creating it should be a confirmable instrument, included in schedule 2 of the Legislation Act 2012, and subject to confirmation by the House of Representatives.

The decision of whether a mandatory web filter should be imposed is one for Parliament. It should not abandon its role in favour of the Secretary for Internal Affairs. If the Government decides in the future that it wishes to have one, it should bring its rationale to Parliament.

Proposed Changes to the Legislative Language

I have a few relatively small suggestions about some of the current wording choices adopted in the legislation, which I invite the Committee to seek advice over.

New Section 22B

New Section 22B(5) provides that the provisions of the Act (including specified offence provisions) apply to publications subject to interim classifications. This section is unnecessary. Prosecution under the Act does not turn on when a publication is classified, but on the nature of the publication itself. It also appears to imply that a prosecution could be commenced for, for example, possession of a publication subject to an interim classification that is ultimately determined not to be objectionable. This should not be the case. The subsection should be removed.

New Section 119A

The definition of “livestream” includes the words “transmit or stream”. “Stream” can describe both the actions of the person publishing the images and those of anyone receiving them. Whichever meaning is intended, some additional clarity would be welcome.

New Section 119K

I see no reason why the Secretary should not also be required to make publicly available a list of take-down notices issued that have not been complied with.

Conclusion

I thank the committee for the opportunity of presenting a submission. If the Committee considers that the Bill should pass, it should recommend a number of amendments, in particular removing the legislative authority it grants the Secretary for Internal Affairs to create a mandatory Internet filter, and bolstering the statutory defences applying to online content hosts.

                                                                                                     Graeme Edgeler
