
Why people turn a blind eye to evidence
Author: Hugo Fleming
Editor: Jennifer Marx
Artist: Sophie Maho Chan
At the beginning of this year, as the seriousness of the coronavirus crisis was just becoming evident, a number of theories about its origins also started spreading online. One claimed that the virus had been manufactured in a lab in Wuhan, another that the cause of the disease was not a virus at all, but radio signals from new 5G phone networks. However, these ideas were quickly debunked by scientists, and it has been tempting for many people to dismiss the believers as simply ridiculous. Look for example at the mockery Eamonn Holmes received when he appeared to give credence to the 5G conspiracy on This Morning. Yet, as comical as they sometimes seem, unscientific beliefs can have serious consequences, as evinced by the spate of arson attacks on phone masts in recent months.
Unorthodox, even anti-scientific views were already on the rise before coronavirus, aided by the internet and the growth of social media. Our collective fascination with the ‘flat earth movement’ was only the most bizarre example on a list that includes, more seriously, climate change deniers and anti-vaxxers. What drives our curiosity is not simply that these people hold and promote unscientific views – religion, for example, isn’t nearly so interesting. The people who really fascinate us are those who engage with science on its own terms, who claim to follow the evidence, yet still reach positions at odds with the scientific consensus. This speaks to our sense, and possibly our worry, that other people’s experience of the world may be completely different to our own: how can we trust our own beliefs, when others can see the same evidence and reach an entirely different conclusion?
Fortunately, there is a long scientific tradition of studying the processes of human reasoning, and these older psychological theories are now being joined by newer ideas from the computational sciences. Together, these approaches are beginning to explain how different people can process the same evidence in very different ways, and why we can nevertheless all feel we are thinking rationally.
What kinds of people engage in unscientific thinking?
One of the most obvious features of ‘science deniers’ is that they are utterly unmoved by contradictory evidence. Nothing, it seems, could make them change their mind. For example, the fact that the Earth is (roughly) spherical has been known for millennia – certainly, it was known to the Ancient Greeks and possibly even the Egyptians before them. Since 1968, when the famous photo ‘Earthrise’ was taken, we have also had readily accessible evidence of the shape of the Earth in the form of photos taken from space. To continue to maintain that the Earth is flat in the 21st century therefore requires ignoring or discounting huge volumes of evidence.
Earthrise: One of the first photos of the Earth from space. Image Credit: NASA
Particularly on issues that seem so settled and obvious to us, it is easy to write off people who disagree with the scientific consensus as being somehow ‘fundamentally different’, at best irrational, or even simply stupid. But the science suggests otherwise.
A recent meta-analysis has found no evidence of a link between the tendency to believe in conspiracy theories and personality traits (specifically the ‘Big 5’: openness, conscientiousness, extraversion, agreeableness and neuroticism). The same meta-analysis does note a link between conspiracy beliefs and level of education, as well as a construct termed crystallised intelligence, which refers to how much concrete knowledge one has. But the idea that people who believe conspiracy theories are stupid, in the sense of having poor cognitive ability (fluid intelligence), is not supported.
In fact, such beliefs are common – one study in the US estimated that around half of the population endorses at least one conspiracy theory – and cut right across political and demographic lines. Although these studies were looking at conspiracy theories specifically, it’s reasonable to think that the results apply to unscientific thinking more generally.
In all, it seems there is very little in terms of stable traits and personality features to distinguish the science deniers from those whose beliefs are aligned with the scientific consensus. But in that case, what causes people to arrive at unscientific beliefs and, more to the point, to then become so resistant to changing their minds in the face of new evidence?
Unscientific beliefs don’t arise by accident
One of the essential functions of the brain is to make predictions – neuroscientists describe the brain as trying to model the world, in the sense that it literally constructs an internal representation of its environment in order to simulate and predict how the real world will behave. If we encounter something that we didn’t predict, our brain changes its model to better fit the data, in order to predict this event next time.
This is why science denial often clusters around new, unexpected events (like the current coronavirus pandemic) or the fringes of existing scientific knowledge (such as the causes of autism). Uncertainty is uncomfortable, and people seek to resolve this however they can. Often, this means that a possible but false explanation (such as ‘vaccines cause autism’) is more satisfying than the true answer: ‘we simply don’t know yet what causes autism’.
One way to understand this process is through what psychologists call inductive reasoning – the way that people observe correlations between events and then infer the underlying causes. Human inductive reasoning is not completely neutral or rational, however, and involves a number of biases and mental shortcuts (called ‘heuristics’) which have been extensively studied.
Particularly important, according to cognitive scientist Micah Goldwater, is the fact that we often find it easier to ‘add to’ our mental models rather than take away – so instead of rejecting a false belief when contradictory evidence comes to light, we invent secondary hypotheses to explain away the new data while allowing us to maintain the old belief. For example, one recent conspiracy theory has involved the drug chloroquine, which has been suggested as a potential treatment for COVID-19. If you initially believed that chloroquine was effective, you might respond to negative trial results not by re-evaluating your existing belief, but by invoking a new explanation – ‘Big Pharma is suppressing the results of positive trials’.
Other heuristics mentioned by Goldwater include the well-known confirmation bias and the tendency to give greater weight to anecdotal evidence than to statistical information. There is also something called the illusion of control: let’s say I get ill and decide to take a herbal medicine; if I get better, I might decide it’s because of the medicine I took, without acknowledging the chance that I would have got better anyway.
It was precisely to counteract this kind of faulty reasoning that the scientific method was developed. The Randomised Controlled Trial, for example, is regarded as the gold standard for experiments because it allows scientists to overcome the illusion of control. Unfortunately, we find it very hard to apply this kind of thinking in our everyday lives, and as a result it can be all too easy for unscientific beliefs to take hold.
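To make the illusion of control concrete, here is a minimal simulation sketch in Python (the 70% spontaneous recovery rate and the patient numbers are illustrative assumptions, not figures from any study). Judged only by the people who took the remedy, it looks as though it works; the randomised control group reveals that recovery happens at the same rate without it.

```python
import random

random.seed(1)

SPONTANEOUS_RECOVERY = 0.7   # assumed chance of getting better with no treatment at all
N = 10_000                   # simulated patients per group

def recovers() -> bool:
    """Recovery here is driven purely by chance; the 'remedy' does nothing."""
    return random.random() < SPONTANEOUS_RECOVERY

# Everyone in the 'anecdote' group takes the herbal remedy and judges it by outcome alone.
treated = [recovers() for _ in range(N)]
print(f"Took remedy and got better: {sum(treated) / N:.0%}")   # ~70% -> feels like it works

# A randomised controlled trial adds an untreated (or placebo) comparison group.
control = [recovers() for _ in range(N)]
print(f"Untreated and got better:   {sum(control) / N:.0%}")   # ~70% -> no real effect
```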
Resistance to evidence – the key feature of science denial?
Cognitive biases and heuristics – the mainstays of traditional psychology – go some way to explaining how people first arrive at unscientific conclusions. But they don’t account for perhaps the most interesting aspect of science denial: how do these beliefs become so entrenched that people become blind to opposing evidence?
One possible answer can be found in an idea called Bayesian reasoning. First proposed by Thomas Bayes in the 18th century, this approach offers a mathematically principled way to reason about uncertain beliefs, but the heavy computation involved meant it only really became popular in the past 20 years, as more powerful computers became available. Neuroscientists and psychologists are now very interested in the idea that our brains may, at least to some extent, be described as Bayesian machines.
The essence of Bayesian reasoning is that we start with certain prior beliefs, in which we also express a degree of confidence (for example, someone might have started to believe, with moderate confidence, that the Earth is flat). With new information (e.g. a picture of the curvature of the Earth taken from space) we update both our belief and our confidence in it (so this person might now be slightly more confident that the Earth is round after all). But the really crucial idea is that we also specify our confidence in the new information – if we doubt its reliability, our beliefs won’t shift as much.
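As a rough sketch of how that weighting works (the numbers are illustrative assumptions, not taken from the article), Bayes’ rule can be applied to the flat-Earth example, with our trust in the photo determining how far the belief moves:

```python
def update_belief(p_flat: float, trust_in_photo: float) -> float:
    """Posterior probability that the Earth is flat after seeing a photo of a round Earth.

    trust_in_photo is the assumed probability that the photo faithfully shows the
    Earth's true shape; with probability (1 - trust) it is treated as uninformative.
    """
    # Likelihood of seeing a 'round Earth' photo under each hypothesis
    p_photo_given_round = trust_in_photo + (1 - trust_in_photo) * 0.5
    p_photo_given_flat = (1 - trust_in_photo) * 0.5
    # Bayes' rule
    numerator = p_photo_given_flat * p_flat
    evidence = numerator + p_photo_given_round * (1 - p_flat)
    return numerator / evidence

print(update_belief(p_flat=0.6, trust_in_photo=0.9))  # ~0.07: the belief shifts a long way
print(update_belief(p_flat=0.6, trust_in_photo=0.1))  # ~0.55: the same photo barely moves it
```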
In the ‘Bayesian Brain’ theory, all these beliefs and confidences are things we have to learn. In particular, if we’re already near-certain about a belief and then receive contradictory evidence, we may end up reducing the confidence we place in that source of evidence rather than changing the belief. This means that, returning to the flat earth believer above, if they are already sure of their belief, then seeing a photo of the Earth from space becomes evidence not that the Earth is round, but that photos from NASA are not to be trusted.
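The same arithmetic can be extended, again as a sketch with made-up numbers, so that trust in the source is itself a belief to be updated. With a moderate prior, the photo mostly changes the belief about the Earth’s shape; with a near-certain prior, the very same photo mostly destroys trust in NASA instead:

```python
def joint_update(p_flat: float, p_trust: float) -> tuple[float, float]:
    """Jointly update belief in a flat Earth and trust in the photo source,
    after seeing a photo of a round Earth. An untrusted photo is assumed to be
    uninformative (it shows either shape with probability 0.5)."""
    # Unnormalised posterior over the four (shape, trust) combinations
    post = {
        ("flat", "trusted"):    p_flat * p_trust * 0.0,          # a trusted photo can't show round if flat
        ("flat", "untrusted"):  p_flat * (1 - p_trust) * 0.5,
        ("round", "trusted"):   (1 - p_flat) * p_trust * 1.0,
        ("round", "untrusted"): (1 - p_flat) * (1 - p_trust) * 0.5,
    }
    z = sum(post.values())
    p_flat_new = (post[("flat", "trusted")] + post[("flat", "untrusted")]) / z
    p_trust_new = (post[("flat", "trusted")] + post[("round", "trusted")]) / z
    return p_flat_new, p_trust_new

# Moderate prior: the photo mostly changes the belief about the Earth
print(joint_update(p_flat=0.6, p_trust=0.8))   # flat ~0.14, trust ~0.76
# Near-certain prior: the same photo mostly destroys trust in the source
print(joint_update(p_flat=0.99, p_trust=0.8))  # flat ~0.92, trust ~0.07
```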
The same process is thought to underlie all sorts of false beliefs. Once we are certain about a particular belief, we don’t just turn a blind eye to evidence – we use this very evidence as justification for not trusting that source of information in the future.
So, in the case of science deniers, it seems that entrenched and unscientific beliefs can come about through the normal processes of learning and reasoning. Particularly in a world where personalised news and social media echo chambers are more prevalent than ever, it seems all too easy for someone to develop a set of beliefs in which they are excessively confident, and these can then prove very difficult to revise.
In a sense, contrary to the view that science deniers are merely ‘irrational’ or ‘stupid’, we might instead view them as unlucky – unlucky to have been exposed to a sequence of experiences and information that means they no longer trust contradictory evidence. For the rest of us, this is perhaps a timely reminder to be cautious in our beliefs and mindful of the question – what would it take to change my mind?