No one likes to change their mind, not even on climate, by Ullrich Ecker and John Cook

[Image: the Mona Lisa, a small portion of which resembles Mr Bean]
When you look at only a tiny bit of the picture, that's cherry picking. Image: The Conversation

The recent ABC documentary I Can Change Your Mind About Climate was about two people — conservative former politician Nick Minchin and youth activist Anna Rose — exposing themselves to information that ran counter to their deeply held beliefs. We know from both research and experience that people cling to information that is in line with their beliefs and worldviews, even when they suspect or even know the information to be false. In other words, people will defend their beliefs. To do so they engage in “motivated reasoning”.

There can be various reasons for motivated reasoning. People might be defending their beliefs in an attempt to protect their feelings of identity and self-worth. Your deepest beliefs about the world define who you are, and hence you need to defend them to defend yourself.

On the other hand, people sometimes publicly defend their beliefs even though they know they are wrong. It could be an attempt to rationalise irrational behaviour, or justify decisions that are in actual fact driven by vested interest or a hidden agenda.

Using the documentary, let’s take a closer look at the strategies people use to defend their beliefs while maintaining an appearance of rationality.

Denial

A brute strategy, of course, is outright denial. One can either deny the evidence (for example, Minchin’s claims that “there is no empirical evidence” and that science is “just opinion”) or one can deny the possibility of change (Minchin’s “even if it were true we couldn’t change it” attitude).

These are the more basic tools to defend one’s worldview, and they are usually quite easy to counteract by — you guessed it — presenting the evidence or the necessary course of action. Only a few people then continue with outright denial; in the face of clear evidence it can only be upheld by reverting to conspiracy theories, for example that the world’s climate scientists have conspired to “conceal the evidence” (former carbon modeller David Evans, who appeared in the show) in an attempt to somehow take over the world. (Remember, it was scientists who brought you vaccinations, the internet, weather forecasts, clean water, antibiotics …)

Counter-arguing

A more sophisticated approach to defend one’s views is counter-arguing. Rational counter-arguing is a great tool, and it’s basically what scientists do all the time.

The problem arises when people start counter-arguing established facts. You can argue all day that you have found a special apple that won’t drop to the floor when you release it, but that won’t change the laws of gravity.

Evans’ counter-arguing of temperature measurements as inaccurate (because they could be influenced strongly by local factors such as airport traffic) might seem reasonable, but it was exposed by Professor Richard Muller as flawed.

Cherry-picking

As pointedly noted by Yale scientist Anthony Leiserowitz, one of the main players in motivated reasoning is the confirmation bias. That is, we tend to pay more attention to information that reinforces our attitudes than to information that is at odds with our beliefs and decisions.

This is a normal tendency — if you just bought a new car you will eagerly read a positive review, and pat yourself on the back about the great purchasing decision you have made. You will ignore the negative aspects in the review (who needs a full-size spare anyway?).

However, the confirmation bias becomes a real worry when people consciously take this bias to another level, and cherry pick data to purposely mislead. When you cherry pick data, you are not really lying — for example, by saying the image above shows Mr Bean — but you are selectively focusing on what fits your cause, and ignoring the rest. So while the image does show Mr Bean (have you found him?), most people would agree that that’s not an accurate description of the full picture.

In the documentary, a graph presented by Evans was a prime example of cherry picking. It showed eight years of relatively constant upper-ocean temperatures while:

  1. not showing the dramatic rise in upper-ocean temperatures in preceding decades, and
  2. ignoring all the heat building up in deeper waters. (When you consider the full ocean, we see a steady build-up of heat.)

All of blogger Marc Morano’s arguments (dropping sea levels, growing Arctic sea ice, falling temperatures) were just that, cherry picking. One ice shelf growing is not evidence for a cooling trend if there are 100 other ice shelves shrinking at the same time.

Misuse and misinterpretation of uncertainty

One of Minchin’s main points was that climate science does not provide sufficient evidence to make us act. This could be a purely strategic move to inflate the perceived uncertainty in order to avoid or at least delay the socio-economic implications of climate change. (Professor Naomi Oreskes makes this point in an out-take of the documentary, which you can watch here.) But if we assume that Minchin really believes what he is saying, it demonstrates the fundamental flaws people show when dealing with uncertainty.

Absolute certainty is a rare thing. We like it, but it is rare. Yes, we will all eventually die; yes, the sun will rise tomorrow (it will, right?) and yes, if you hit the freeway on a Monday at 5pm you will see other cars there (unless perhaps the sun didn’t rise).

But really, when we speak about the predicted behaviour of any complex system, there is no such thing as absolute certainty. Waiting for 100% certainty before you act is hence a fallacy, but if you want to defend your beliefs you can play the uncertainty card in many contexts. You can smoke in your car with your kids in the back seat, arguing that the link between passive smoking and cancer is not 100% certain. That may even be true, but what is certain is that you wouldn’t be doing your kids a favour.

The argument that there is not enough certainty to act carries another fallacy: our tendency to assume that uncertainty is uni-directional. Minchin obviously takes the probability that the climate models’ predictions could be wrong to mean that there’s a good chance that it will all be fine.

Of course, stating there’s a 90% chance that Earth’s temperature will increase by 3±1.5 degrees Celsius over the next 70 years does not mean there’s a 10% chance that everything will stay the same — because things could also be worse than predicted, a risk that is commonly neglected when defending one’s beliefs.

Together, the misuse and misinterpretation of uncertainty often combine to justify doing nothing. But of course, doing nothing is also an active choice. Delaying action until there is “sufficient” evidence is as much a decision and an action as the arguably more rational choice to act according to the best available estimates.

Ironically, people will take politicians’ inaction to imply that there is no problem — if they’re not doing anything, then it can’t be serious. This flawed backward reasoning is known as inferred justification and often contributes to the perpetuation of false beliefs.

Discrediting the information source

While Rose’s refusal to engage with a condescending Morano is perhaps understandable, calling him a “Republican attack dog” rather than publicly exposing some of his aggressively made claims is an example of an ad hominem argument. Typically, people use this when they cannot (or, in this case, choose not to) address the opponent’s arguments: ignore the arguments and attack the person.

Another discrediting strategy is ridiculing your opposition. Examples included blogger Joanne Nova laughing loudly when Rose suggested climate change may have severe consequences, and Minchin cracking a joke when a farmer expressed existential fears about the consequences of rising CO₂ for him and his fellow farmers (“it’s plant food, mate”).

Perhaps the most creative way to discredit your opponent is to turn things on their head and accuse your opposition of doing what you are doing yourself. In psychology, such behaviour is called projection. Morano telling Rose to “re-examine [her] conscience” was probably the funniest moment of the documentary (in a disturbing way). Minchin denying the existence of empirical evidence while accusing Rose of “ostrich behaviour” when she refused to engage with Morano was another example.

Deflection

Finally, changing the topic when you fear you are losing the argument is a common strategy to defend your beliefs. In a passionate outburst, Minchin accused Rose of deflection after their conversation with Professor Richard Lindzen — during which Rose brought up Lindzen’s denial of the health effects of tobacco.

It was a clear case of deflection. Still, knowing that Lindzen, one of the very few climate scientists doubting man-made climate change, has also doubted the link between tobacco smoke and lung cancer does seem relevant. And the deflection was not the only reason Minchin was getting so worked up: he, too, is on the record questioning the adverse effects of tobacco smoke.

I can’t change your mind about climate, can I?

What this tells us is that hardcore believers have many ways to cling to their beliefs and rationalise this behaviour even in the face of overwhelming evidence that their beliefs are wrong. So once the science is settled – and in the case of climate change it really has been for quite a while now – it is actually a really good idea to move on and discuss and implement a course of action (as also suggested by UK conservative Zac Goldsmith), rather than trying to convince the unconvincible.

Ullrich Ecker is an Australian Postdoctoral Fellow at University of Western Australia and John Cook is a Climate Communication Fellow at University of Queensland.
