Science has a bad habit of asking – demanding, even – to be placed in a position of power. To be referred to as an authority on all things. To be trusted by the public.
Not that this last is a bad thing in itself – indeed, I think it is rather important. But if science is to be trusted by the public, then we scientists need to take that trust seriously. What does it mean for us to insist on a place of privilege for scientific knowledge?
In the last few months, several different occurrences have focused my thinking on this topic.
First there were allegations of misconduct by CRI (Crown Research Institute) scientists at NIWA with respect to the Ruataniwha irrigation scheme. When asked to comment, I was at pains to highlight the different circumstances of scientists employed at our universities, who have the statutory privilege of academic freedom, and those of our CRI scientists, who work in an environment in which commercial and governmental financial pressures have a much more direct impact. Not that this necessarily affects scientific outcomes directly – but the uneasy coexistence of public good and commercial research in our CRIs leaves their staff in a situation that is not always straightforwardly navigated. It doesn’t exactly lend itself to the transparency that might assist public understanding, either.
Secondly, the NZ Association of Scientists ran a survey of NZ scientists who were willing to share their experience with the National Science Challenges. The results were far more pointy-ended than I had expected, based on a year of discussions where everyone publicly seemed to agree on the need to make the best of a bad job. It was a lesson in the power of anonymity in giving people a voice – a lesson reinforced by the emails I then received, in particular from CRI scientists who are, as one correspondent reported, “gagged from talking to the media on topics that might seem critical of government policy from 2 months out from the election”.
A third moment of reflection was prompted by the release of the plan for the Science in Society project: A Nation of Curious Minds. This is a really positive initiative, aimed at “developing stronger connections between science and society” and putting “special emphasis on our young people and science education”: a laudable outcome of the process behind the National Science Challenges, and I don’t want to come across as critical in the least. Except for just one small thing. It may even be nothing.
One of the actions recommended in this report is that the Royal Society of New Zealand develop a new code of practice for public engagement for scientists. In the fine print, we are given additional clarification that this will pertain to the “social responsibility of science organisations and scientists to engage with the public and policy makers based on their expert knowledge”. Again – this sounds fine. Except – from what I can tell, it seems that we already have this.
The Royal Society of New Zealand has a code of ethics, which has quite a lot to say about the responsibilities of scientists. This became very clear to me in discussions that followed the original Radio NZ story on NIWA and the Ruataniwha matter – so much so, in fact, that in the NZ Association of Scientists submission on the recent National Statement of Science Investment, we recommended that the government should “amend the CRI Act to require that the boards of CRIs support the RSNZ code of ethics”. This seemed a sensible way to avoid creating new pressures on scientists who have – very fairly – a duty to their employer, while addressing issues of public confidence in science. If the RSNZ code of ethics is to be effective, it needs to be part of the scientific consciousness. The issue may only be that of public perception, but that makes it no less serious an issue: public trust in science matters.
The code of ethics has a few things to say on matters of public engagement and the communication of science. In fact, it is explicitly based on the need to maintain public support for “work in the areas of science, technology, and the humanities”. As such, a member is required to:
- strive to be fair and unbiased in all aspects of their research and in their application of their knowledge in science, technology, or the humanities (Rule 2.1(2)b)
- not present themselves as experts outside their areas of expertise (Rule 2.1(2)k)
- only represent themselves as experts in their fields of competence as defined by their formal qualifications or other demonstrable experience (Rule 4.1(2)a)
Our responsibilities to the public who fund our work are made explicit. A member must:
- endeavour to make the results of their work as widely available to the public as possible and to present those results in an honest, straightforward and unbiased manner (Rule 6.1(1))
- accept that researchers working on different approaches to a problem may reach different but supportable conclusions within the context of their own research (Rule 6.2(2)d)
- avoid attempting to influence public policy in situations where the available evidence is contradictory or inconclusive without making the state of that evidence clear (Rule 6.2(2)f)
These rules are there for good reason. They may not be perfect.[i] But do they have failings in their description of the “social responsibility of science organisations and scientists to engage with the public and policy makers based on their expert knowledge”? A case may yet be made, but for the moment, I do not think so.
So why might we need a new code of practice? Where might this be coming from? In the background to all this, we have a Chief Science Adviser to the Prime Minister who warns of the dangers of scientists acting as advocates. While his concerns around the importance of being an honest broker are genuine, they are not without their issues: "we need to confront the tensions between being objective and deploying our judgment."
Perhaps more to the point, the ‘honest broker’ approach does not appear to have made any impact on the PM’s willingness to listen.
In the absence of voices from the CRI community, we may need to look to university academics for examples of where science, engaging with the public and policy makers, might run into problems. The most obvious example is Dr Mike Joy, of Massey University, who was memorably called a traitor to NZ for his comments on our declining water quality. Even more significant were the Prime Minister’s comments to the BBC, where he stated that he could always provide another academic to give a counterview.
This week has provided several reminders of all these events. In fact, all in one day, I came across evidence that the work of Dr Joy is still attracting attention, this blog post by Dr Jarrod Gilbert, a sociologist at the University of Canterbury, and the online attacks on University of Otago health researcher Dr Lisa Te Morenga, by Jordan Williams and Carrick Graham of Dirty Politics fame.
What does this all add up to? I’m not sure that I know. Is it a coordinated anti-science campaign, or do we scientists take for granted a public faith in science that is simply not there?
What I do believe is that any privilege we have as scientists is a privilege based on public trust in scientific activities. Such trust should not be based on myths about scientific objectivity, nor on nonsense about us being best placed to make policy decisions: it should be based on a culture of honesty and integrity, and of open criticism and discussion of the facts without fear or favour. These are the values that make science work, and we (scientists) need to stick up for them. And we (the public) need to stick up for them too.
A lot has been said recently about the importance of evidence-based policy. Science has an important role to play in informing – but not dictating – policy that is responsive to the needs of the real world. But as the dust settles in the aftermath of our most recent election, I am left thinking of the importance of evidence-based policy making – the open discussion of data, with due acknowledgement (as required by the Royal Society of NZ’s Code of Ethics) of the different values that may impinge on a research problem, and the ways in which researchers may reach different but supportable conclusions on the basis of a different research approach.
Is this, after all, what democracy might look like between elections?
[i] I am particularly intrigued by rule 5, under which members are required to be unbiased in their evaluation of colleagues’ work – which rings a little hollow in the context of studies that demonstrate the pernicious nature of (unconscious) gender bias in the scientific community. A requirement that members take account of the latest offerings in unconscious bias training might well be in order.
Nicola Gaston is Senior Lecturer in chemistry at Victoria University of Wellington, Principal Investigator at the MacDiarmid Institute and President of the New Zealand Association of Scientists.