August 3, 2018   6 mins

Humans are not fully rational creatures. We see the world through a collection of biases and rules of thumb which, in systematic and predictable ways, make us believe things that are simply wrong. They incite us to cling to arguments that support our beliefs and to reject those that don’t, even in the face of contradictory evidence.

This is the motivating force behind large parts of our toxic discourse: it’s why left-wingers not only don’t share opinions with right-wingers, they often don’t even share access to the same facts. You can see it, right now, in the furious war over antisemitism in Labour. If you want to believe that Jeremy Corbyn’s Labour is antisemitic, it’s easy to find examples. If you want to believe that Corbynite antisemitism is a Tory smear, it’s easy to find Tories doing that smearing. Our minds are geared to pick up the things that support our arguments and ignore the ones that don’t.


In the last 40 years or so, psychological science has uncovered a large number of the biases behind this tendency. For instance, there’s the availability heuristic. It means that we judge the likelihood of something happening not by any sort of statistical process, but by how easily we can think of an example. And that means that we tend to think of more dramatic, memorable, or widely reported things – plane crashes, shark attacks – as being more likely to occur than they really are.

This leads us to make bad decisions: in the year after 9/11, around 1,500 more people died on American roads than usual, because the terrifying images made the idea of hijackings and plane crashes more available to people’s memories, so they drove instead of flying. But, in fact, flying is far safer than driving. The availability heuristic killed half as many people as died in the two towers themselves.

Then there’s scope insensitivity, which makes us blind to numbers. For instance, in one study, three groups of people were asked how much they would spend to save X seabirds from an oil spill. The first group was told that X was 2,000; the second, 20,000; the third, 200,000. The three groups’ answers were, respectively, $80, $78, and $88. Apparently we don’t think about the numbers; we just picture a sad bird covered in oil, and put a dollar value on how sad that picture makes us feel.

This has obvious implications for policy: when we read that the NHS is denying a child an expensive cancer drug, say, we become appalled at the image of the child dying because we wouldn’t spend money – but don’t think about whether that same money could save several children with less dramatic but more tractable diseases.

The affect heuristic is our tendency to assume that if something is good in some respect, it’s good in all respects: so if we think nuclear power has lots of benefits, we also think it has few risks, and vice versa; if someone is attractive we tend to assume they’re clever and kind, etc.

If we go back to our antisemitism-in-Labour example, this is pretty obvious. If you think that Jeremy Corbyn’s Labour is good in some ways – say, that it will be good for poor British people, or will be less likely to engage in wars overseas – it becomes very difficult to, at the same time, believe that it might be systematically racist against Jews. The two claims might be completely unconnected, but we find them hard to hold in our heads at the same time.


And, probably most relevant to the sort of argument that goes on on the internet, is the bias called motivated scepticism. It is closely related to confirmation bias, which you’ll definitely have heard of. In his book, The Righteous Mind, social psychologist Jonathan Haidt describes it like this: when we want to believe something, “we ask ourselves, ‘Can I believe it?’ Then … we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe.” Whereas if we don’t want to believe something, “we ask ourselves, ‘Must I believe it?’ Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it.”

That’s exactly what’s going on, again, in the antisemitism row. If you are instinctively opposed to Corbyn’s Labour, it is amazingly easy to find examples of him and the people around him behaving in extremely dodgy ways, giving you permission to believe it. If you’re instinctively in favour, then you can find examples of right-wingers using it as a stick to beat Labour with, and there’s your permission to dismiss it as a smear.1

It’s usually extremely obvious when your political opponents are employing motivated scepticism, and you will find it extremely easy to tear their argument down, especially since you are clever, and sophisticated, and aware of all of these biases.

Which is why the last one I’m going to mention is probably the most important: the sophistication effect, whereby the most knowledgeable and politically engaged people, “because they possess greater ammunition with which to counterargue incongruent facts, figures, and arguments, will be more susceptible to motivated bias than will unsophisticates”.

Let’s go over that again: the better-informed and cleverer you are, the more vulnerable you are to certain biases, such as motivated scepticism, because you are more able to destroy the arguments that you don’t like, but still feel no particular desire to examine the ones that you do. If you’re a politically well-informed and intelligent Corbynite, it will be amazingly simple to find examples of Tories using antisemitism as a smear, and vice versa.

So it becomes easy to tear down silly arguments by your opponents, and you become ever more convinced of your own brilliance and their idiocy or malignity.

But it is hard to apply these skills to yourself and your own deeply held beliefs. For instance, I’m liberal and centre-leftish, and like many liberal leftish people, I tend to think of Islamic terrorism as an overrated threat. I can easily point out that the risk of terrorism in terms of deaths per capita is lower than that of drowning in the bath, say, and I can probably accuse people who think it’s a more deadly threat of both scope insensitivity and the availability heuristic.

But if someone points out that mass shootings in the US, which My Political Tribe is much more scared and angry about, are also very rare, and that I am probably guilty of both scope insensitivity and the availability heuristic, then I am much more likely to push back.2


This manifests itself as a feeling of fun, or at least as a satisfying activity. For it is immensely pleasing to tear down the arguments of your opponents. By contrast, it hurts to take your motivated scepticism and apply it to your own tribe, where it does not want to go.

But criticising your own tribe is more valuable. For one thing, your real-life and internet circles largely consist of your own tribe – if I, a liberal atheist, write something criticising Christianity, it’ll end up on my Twitter feed where it’ll be read by a bunch of liberal atheists who already agree with it.

For another, criticism is harder to ignore, and more likely to change minds, if it comes from someone whose bona fides you trust. Owen Jones saying that Labour has an antisemitism problem is both more praiseworthy and more effective than Steve Bannon’s mate Boris Johnson saying the same thing.

As mentioned, I’m an atheist; I don’t tend to turn to Christianity for wisdom. But there are a couple of quotes we all know, and which are sort of embedded in the culture, which are relevant here. They are Matthew 7:5 – “Thou hypocrite, first cast out the beam out of thine own eye; and then shalt thou see clearly to cast out the mote out of thy brother’s eye” – and Matthew 5:44 – ”Love your enemies, bless them that curse you, do good to them that hate you, and pray for them which despitefully use you, and persecute you.”


That seems to me to be a partial way out of the echo chambers and filter bubbles that we seem to find ourselves in: look to the beam in your own eye, and love your enemy; apply the painful kind of scepticism to the things we like, rather than the satisfying kind to the stuff we don’t.

Obviously, though, that’s easy to say – it’s not easy to do. I’m dreadful at it: I still share those satisfying digs at Brexiters or Corbynites, even though I know about motivated scepticism and all these things. That’s why GK Chesterton, the brilliant and grumpy Christian essayist, grumbled that “The Christian ideal has not been tried and found wanting. It has been found difficult; and left untried.” Insofar as the Christian ideal is loving thy neighbour and casting out the beam in thine own eye, he’s still right.

FOOTNOTES
  1.  For the record, I’m not neutral on this: I do think Corbyn’s Labour has a problem with antisemitism. That doesn’t mean people aren’t using it for political ends, but it is real.
  2. But it’s still true – your risk of death from one, if you live in the States, is about 1 in 15,000, according to the US National Center for Health Statistics, which is about three times the risk of being killed by a “foreign-born terrorist”.

Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.
