Hard News by Russell Brown


Media3: Where harm might fall

The Law Commission cannot be accused of failing to consult on its issues paper, Review of Regulatory Gaps and the New Media. Indeed, earlier this year it proactively sought comment from the audience of this site through guest posts entitled Who are the media?, Who guards the guardians? and Censorship is not the only enemy of free speech. So why do most commentators seem to think its new paper and proposed Bill get it wrong, perhaps dangerously so?

It's the last of those guest posts, dealing with harmful speech by citizens on the internet, that flows through into the new Ministerial Briefing Paper, Harmful Digital Communications: The adequacy of the current sanctions and remedies. It's worth re-reading that post's opening sentences:

Robust communication has been a hallmark of the internet since its inception. Free speech values and an abhorrence of censorship are central to its culture.

However, censorship is not the only enemy of free speech. Those who exercise their free speech to intimidate, bully, denigrate and harass others on the internet lessen the credibility of free speech arguments. In effect, those who exercise their free speech rights to cause harm may inhibit others from participating freely in this vital new public domain. The practical anonymity afforded abusers, and the lack of real-life consequences can create an environment where such abusive behaviour can thrive.

The new ministerial briefing paper, brought forward at the request of Minister Judith Collins, does similarly avow free speech principles, a few pages in. But its introduction emphasises "a widespread desire that something be done" about "the use of new communication technologies to cause harm" and notes "examples of the most disturbing and damaging communications" involving both young people and adults.

I am not of the view that there's simply nothing to be done about personally harmful material on the internet. During this year's NetHui session on dealing with bad behaviour online, moderated by Judge David Harvey, a series of people averred that once something was online, it was there for good and there was nothing to be gained by removing it.

This is nonsense. If I feel the need to remove something from the Public Address forums because it's defamatory or needlessly offensive (fortunately, this doesn't happen often -- love you all lots), then of course I'm limiting the harm it can cause. On the odd occasion where an older comment, by me or a reader, has been drawn to my attention as untrue or harmful, I've chosen to leave it in place, clearly annotate it as incorrect and trust our Google ranking to impart prominence to the correction.

So it's not the case that nothing can be done. But me taking action on my own website is quite different to the state, or a state-ordained body, taking action -- and from there being "a new communications offence tailored for digital communication". A new criminal offence, that is.

As barrister John Edwards put it in his blog, "critics are right to be wary of such a vague intrusion into online discourse" as "prohibiting and punishing speech which causes emotional distress". TechLiberty spokesman Thomas Beagle put it more strongly, criticising the Commission's apparent belief ...

... that the law should forbid offensive speech that has only got as far as causing someone "significant emotional distress", a rather low bar when adolescents or other excitable people are involved. (The Commission acknowledges that this goes beyond the current bounds of NZ criminal and civil law.)

I don't think it's necessary to set about characterising victims of bullying as "adolescents or other excitable people". The Commission has clearly taken seriously the submission from NetSafe (which says it deals with an average 75 complaints of internet bullying a month) and advice from the police, who say they are dealing with more complaints of internet and other communications bullying.

It only takes a little empathy to substitute the word "excitable" with the word "vulnerable" and consider how traumatic the circulation of private material, true or false, or other forms of media (photoshopped porn, say) might be for some people. I try to bear in mind that while I have fielded some grossly offensive communications in my time, I have the standing and skill to respond if need be. I'm not that vulnerable.

There is also, I think, a difference between offence in the public world of well-known blogs and the closer, more personal communications environments of Facebook and text messaging, where actual threats of violence take on a rather different tone. (Although it's worth noting that, to take the most obvious and egregious example, Cameron "Whaleoil" Slater has published vile, untrue sexual innuendo and created photoshopped porn of people with whom he disagrees.)

I'm much more open to the general idea of the proposed Communications Tribunal, especially in the area of findings of fact. A suitably SEO-enhanced finding seems a more practical and accessible solution than expecting individuals to try to clear their name via a defamation action. Indeed, as Edwards points out in his blog, the Tribunal would have access to a range of orders, which can be made against the defendant, an ISP, a "website host", or any other person:

(a) an order requiring that material specified in the order be taken down from any electronic media:

(b) an order to cease publishing the same, or substantially similar, communications in the future:

(c) an order not to encourage any other person to engage in similar communications with the complainant:

(d) a declaration that a communication breaches a communication principle:

(e) an order requiring that a factually incorrect statement in a communication be corrected:

(f) an order that the complainant be given a right of reply:

(g) an order to apologise to the complainant:

(h) an order requiring that the author of a particular communication be identified.

There are some obvious practical difficulties in enforcing these orders -- and we're relying on the Tribunal, which would be chaired by a District Court judge, to sensibly apply the proposed 10 Communications Principles:

Principle 1

A communication should not disclose sensitive personal facts about an individual.

Principle 2

A communication should not be threatening, intimidating, or menacing.

Principle 3

A communication should not be grossly offensive to a reasonable person in the complainant’s position.

Principle 4

A communication should not be indecent or obscene.

Principle 5

A communication should not be part of a pattern of conduct that constitutes harassment.

Principle 6

A communication should not make a false allegation.

Principle 7

A communication should not contain a matter that is published in breach of confidence.

Principle 8

A communication should not incite or encourage anyone to send a message to a person with the intention of causing that person harm.

Principle 9

A communication should not incite or encourage another person to commit suicide.

Principle 10

A communication should not denigrate a person by reason of his or her colour, race, ethnic or national origins, religion, ethical belief, gender, sexual orientation, or disability.

It's hard not to look at those principles and think that haters be sayin' much of that stuff every day -- and that perceptions of what is "grossly offensive" might vary very widely. It's easy to see how the proceedings of the Tribunal could be used to merely shut down unwanted criticism. It's fortunate, then, that the Commission recommends an intermediary stage, where an "approved agency" (Netsafe, everyone seems to think) can undertake mediation. David Farrar welcomes that detail by observing that "mediation is preferable to arbitration."

Beagle also makes the following observation:

We are also concerned when it is proposed to make something illegal on the internet that wouldn't be illegal if it was published in some other way. Does it really make sense that the same message might be legal on a billboard in the middle of Auckland but illegal if it was then posted to the Trademe Forums? As we say in our founding principles, "We believe that our civil liberties don't just disappear when using the internet."

There are actually some clear practical differences. It's vastly easier to go to an outdoor advertising company and complain you're being defamed on a billboard than to remonstrate with a vindictive internet commenter. And a billboard doesn't replicate all over town the way an offensive communication might replicate across the internet.

But as Judge Harvey memorably and somewhat fatefully observed: "The problem is not technology, the problem is behaviour. We have met the enemy and he is us."

In the end, we are dealing with people. That's what makes it difficult. And that's what should make us very cautious in proposing solutions.

We're discussing the issue further on this week's Media3, where John Edwards and Thomas Beagle will be joining me. There'll also be a chat with lawyer Tracey Walker about her new book Reputation Matters: A Practical Legal Guide to Managing Reputation Risk, and a look at a fascinating new take on last year's English riots.

If you'd like to join us for tomorrow's recording, we'll need you to come to the Villa Dalmacija ballroom, 10 New North Road, Auckland (it's just down from the corner with Mt Eden Road) around 5.30pm.
