The Future of Science Communication?
Today I read an article in Nature Climate Change that I found rather thought-provoking. It reports the findings of a social psychology study into the effects of science cognition on perceived climate change risk, and it had some very interesting results. The premise is that people are unwilling to accept sacrifices to combat climate change because they do not understand the science. The study was designed to test this premise, since if it held, it would be evidence that the way to convince people is to adopt a strategy of communicating sound science.
What the study actually showed was the reverse: that is, that people who are more educated about science are more likely to be climate skeptics, and that those who are less educated are more likely to be worried about it. I haven't given much thought to the validity of the methodology, but the authors make no attempt to deny the link, and it appears to be robust. Let me predict at this point that if you are a climate skeptic, you will probably take this as unequivocal evidence that, at the very least, the case for climate change is not as close to consensus as some would like. This is not the conclusion the authors draw.
The authors' explanation of the results is, and I'm paraphrasing, that a greater comprehension of the science will lead a certain type of individual to a wrong conclusion. Specifically, that their "form of reasoning can have a highly negative impact on collective decision making". Again, to the climate skeptic this will sound like an utterly counterintuitive conclusion, but the theory goes that individuals do not actually form their opinions from the facts in front of them but rather from their "cultural commitments", i.e. what they and their friends, family, coworkers etc. already believe.
Whilst at first I was incredulous that the scientists could completely fail to see the possibility that climate skepticism may have real grounding in scientific evidence, I have to admit that I have observed a phenomenon similar to the one they describe many times before. In fact it is a thought that has been bouncing round my head for some time: that ultimately people very rarely change their opinions on almost anything, and that debate is largely ineffective. When was the last time you had a debate in which someone listened to the argument of a person with the opposite belief, and subsequently changed their mind? Indeed, studies on the ethics of genetic testing have shown that providing risk information alone is hugely unsuccessful in changing behaviour: something easily exemplified by the abysmal results of using information about future health problems to encourage people to make diet and lifestyle changes.
What I have theorised is that two things in particular drive this "self-reinforcing" effect of information on opinion formation:
- Ascertainment bias: that is, if you have doubts about NHS reform, you are more likely to read an article examining its downsides. So you end up reading only the things that reaffirm your existing position.
- Confirmation bias: that is, when you read the results of a scientific study (for example), you subconsciously choose to interpret it in a way that fits your world view. So if it contradicts your previous position, you hunt for weaknesses in the methodology that allow you to discredit the findings, and if it fits, you hold it up as clear evidence of how right you are.
In short, either because we hate being wrong or because we hate going against our local consensus, we will often use bad logic to save ourselves from having to change our opinion. To use the example above of finding faults in a study's methodology, we will happily make the leap from "some part of this may not be true" to "none of this can be true" if it is in line with our existing beliefs. Or, we choose between different possible explanations for a result. A classic example of the latter is seen in the discord between the scriptural age of the Earth and the existence of fossils. That is, some conclude that the existence of 65-million-year-old fossils disproves the notion that God created the world a few thousand years ago, whereas others conclude that God made the fossils appear old as a test of faith. Within their respective contexts, both are logically possible [this is how the "who created God" argument works too].
So what does all this mean for the communication of science? The authors of the study are clearly operating on the assumption that the "right" thing to do is to convince people that climate change must be addressed, so they do not consider any other conclusions from their results. Within this context, they argue that governments should abandon policies of simply communicating sound science, and replace them with investment in a new scientific discipline, the "science of science communication". They argue for the adoption of "information framing" techniques, i.e. contorting [I would prefer a less evocative word] the facts in order to convey a pre-decided message. I have long believed that what we need in politics is more focus on the facts and less on the partisan elements: good governance informed by sound evidence. In fact it had not really occurred to me that scientists would disagree, and yet here are scientists arguing for more emphasis on (and research into) how science communication in general can be mediated through some form of manipulation in order to bring about a specific goal.
To be clear, what is being proposed is the development of scientifically robust methods for misrepresenting science in order to manipulate public opinion. This sounds to me like abusing science to capitalise on its goodwill. If this were a government department, it would be called the Ministry of Truth. Given the arguably already weak influence of rational science on opinion and policy (especially in the wake of Climategate), I can't imagine that tarnishing the reputation of, and trust afforded to, scientists will do anyone any good, science and climate change campaigners alike.