Have you also noticed the teeth gnashing in intellectual circles? Just check out serious magazines like The New Yorker, Scientific American and The Economist. They’re all talking about how we’re not half as logical as we think we are. And with good reason. The pile of research about the subject is getting bigger by the day.
First, there is cognitive dissonance. This says that if we hold two contradictory beliefs, we'll distort one of them until they get along again. Often – as I discussed in my article 3 Big Mental Mistakes – we prefer distorting the evidence rather than our dearly held beliefs.
Then you’ve got the confirmation bias. It makes us disregard evidence that contradicts our position while paying extra attention to the stuff that confirms it. Heck, we even forget events that contradict our held beliefs!
As if that wasn’t enough, there is the backfire effect. Here, when you show people evidence that their core beliefs hold no water, they don’t change their minds. Instead, they come to believe even more strongly in their original positions.
There are plenty of books about the topics as well. Like The Knowledge Illusion, which shows we believe we know far more than we do. Then there is Democracy for Realists. In this (rather dry) tome the authors show how bad we are at democracy and how our held beliefs on the subject turn out to be all wrong. The Enigma of Reason, in the meantime, attacks our very beliefs about how reason works and what it is for.
It is unsurprising how alarmed people are by all this. After all, we’ve built our societal models on the idea that we’re rational actors who will be convinced by good arguments when we hear them. If that turns out not to be true, then we’re in trouble. We have to consider some important questions, like:
- If people aren’t as rational as we believe them to be, can we still expect them to make objective decisions?
- If they do need guidance, how much hand-holding is reasonable and when do we go too far?
- And the biggest question of all, if we can’t trust people to make the right decisions, is democracy still viable?
In fact, numerous people have already taken to shouting that we should stop asking and instead start telling people what to do. (No doubt the shouters imagine that they’ll get to do the telling.)
But is it really as dire as all that? After all, it’s not like opinions don’t shift over time. Sometimes they even shift quite quickly, as they did for gay rights in the US. As a graph in a Washington Post article shows, in the space of six years there was a 20% shift in favor of gay marriage.
Even though for many this was a strongly held belief, they still changed their minds. And that’s not so surprising. People change their minds all the time. They join religions or leave them, start at one end of the political spectrum and end up at the other, and pledge eternal love only to get divorced soon after.
So which is right?
Well, you can’t dismiss the research. There is too much of it. Sure, individual findings might not be as robust as the researchers claim. But overall it holds up, and it all points in the same direction: when you attack an individual’s core beliefs using facts and statistics, they’ll rarely change their minds.
At the same time, it’s equally true that society changes over time. Things that weren’t okay before now are, and things that were fine aren’t anymore. These changes are quite profound.
- Who can vote has changed, with minorities and women being empowered.
- Who can sleep with whom has been dramatically revised. Not so long ago even interracial relationships were considered wrong. Now gay marriage is acceptable to people in most modern democracies.
- Our views on drugs are shifting. Fewer and fewer societies advocate jail time for offenses, and some drugs are even becoming acceptable.
And so on and so forth.
So we have to go for door number three: both must be true. But then there has to be some difference between our individual reluctance to change our minds and the way society changes anyway.
I believe that though these two notions seem incompatible, they aren’t. Instead, we’re comparing apples and oranges. Or perhaps I should describe it as comparing things at the wrong level. What if we struggle to change our minds as individuals, but find it a lot easier to do so as groups?
And sure, at first blush that does sound counter-intuitive. After all, groups are made up of individuals. So if individuals struggle to change their minds, then why would it be any easier for groups?
For several reasons, actually.
Reasoning towards cooperation
First of all, we might completely misunderstand reasoning. We think it’s for thinking abstractly about problems, for playing chess or for doing mathematics. But reasoning did not evolve for those skills. In our past there was no chess, we didn’t do mathematics, and we very rarely reasoned abstractly. Instead, those skills are more like how we now use our nose to hold up glasses: a side effect.
So what did reasoning originally evolve for? Well, a good candidate is group bonding. Yes, that sounds completely illogical. But hear me out.
Cooperation is one of the biggest problems our species had to solve. How do we get big groups to cooperate? The reason it’s such a hard puzzle is that we couldn’t use the same solution as other species, like ants and bees. Our families aren’t extensive enough to rely on kinship.
So we needed something more. One candidate for cooperation is our belief systems. If we all agree that the same things are important to us, then it becomes far easier to work together. What’s more, the surer we can be that other people stick to our belief system, the better we can cooperate.
The most interesting thing about this is that it turns the problems I mentioned above from bugs into features. Suddenly, it makes sense for us to have the confirmation bias and the backfire effect: they’re there to defend cooperation. Completely logical reasoning, by contrast, would be problematic. In the short term it would have us betraying each other far more quickly, and that would undermine long-term cooperation. And so, mechanisms evolved to keep our belief systems in line with those of the people around us.
Interestingly, this would mean we don’t only have systems in place to make us stick to our original beliefs. We’d also have mechanisms that make us sensitive to how our immediate peers are changing their opinions. I don’t know what the research says about that, though.
Change happens at the edges
My second point is related to the concept of core beliefs. You might have noticed me using that phrase above, when I explained how people won’t change their minds when you attack the beliefs that form part of their identity.
But of course, not every belief is a core belief. And if we hear arguments that undermine something we don’t believe as strongly, we have far less incentive to protect it. And so, it is possible to change people’s minds using evidence.
What it depends on is how important those beliefs are to them.
That’s where my third point comes in: peer pressure. If the majority of my peers hold a belief that is contrary to my own, I’ll find it difficult to hold onto it.
Sure, I can. But it isn’t easy.
This was demonstrated back in 1951 by Solomon Asch in his famous line experiment. In case you don’t know it: a group of people is asked to compare three lines to a fourth one and select the one that is the same length. That seems like an easy enough task, right? And when participants were asked alone, it was. They almost always picked the right line.
When they were the last person in a group to decide, though, and everybody else chose the same wrong line, it was no longer so easy. Suddenly a third of participants caved to peer pressure and conformed to the group.
So imagine how powerful it would be when the answer isn’t as obvious.
When everybody around you tells you you’re wrong, you’re more likely to think you are too. And when you change your mind that means there is even more pressure on those who remain loyal to the original idea. And so sometimes belief systems change rapidly, like falling dominoes.
And sure, there will be those who refuse. But time gets rid of them soon enough, while the next generation is more likely to accept the majority’s opinion. Until finally it’s no longer a question at all. That’s why nowadays few people question whether women should be allowed to vote. Instead, they accept it as a given.
In this way, society changes its mind even while individuals don’t.
The wisdom of the crowds?
Of course, this raises the question: will society choose wisely? In his book The Wisdom of Crowds, James Surowiecki argues that it will. The entire premise of his book is, in fact, that crowds are wiser than individuals.
He uses a much-quoted example to show this. People were asked to guess how many peas there were in a glass jar. It turned out the average of the crowd’s guesses was closer to the real number than any individual’s guess. This, he argues, demonstrates that crowds are wiser than individuals. There are a few problems with this, though.
- It’s pretty straightforward to guess how many peas there are in a glass jar, unlike – say – knowing which healthcare plan is the best for your country.
- Nobody has an innate belief system that compels them to defend a certain number of peas. Nor is anybody raising their children to see more or fewer of them.
- People aren’t running some campaign or trying to influence government to change how many peas we think there are.
- There are no innate biases which will make people on average guess too high or too low.
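That last point is worth dwelling on, because it is a statistical fact, not just an intuition. A toy simulation (a hypothetical setup of my own, not from Surowiecki’s book) shows why: when individual errors are independent, averaging cancels them out, but a bias that everyone shares survives averaging no matter how large the crowd gets.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

TRUE_COUNT = 500  # the actual number of peas in the jar (made up for illustration)
CROWD_SIZE = 1000

# Unbiased crowd: each guess scatters widely but independently around the truth.
unbiased = [TRUE_COUNT + random.gauss(0, 100) for _ in range(CROWD_SIZE)]

# Biased crowd: same scatter, but everyone shares a tendency to guess 150 too low.
biased = [TRUE_COUNT - 150 + random.gauss(0, 100) for _ in range(CROWD_SIZE)]

def crowd_error(guesses):
    """How far the crowd's average guess lands from the true count."""
    average = sum(guesses) / len(guesses)
    return abs(average - TRUE_COUNT)

print(crowd_error(unbiased))  # small: independent errors cancel out
print(crowd_error(biased))    # stays near 150: averaging can't remove a shared bias
```

Individual guesses in both crowds are off by about 100 on average; only the unbiased crowd’s mean homes in on the truth. Adding more people shrinks the random noise but never the shared bias.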
All these things are plenty common everywhere else. And when they are, the crowd becomes a whole lot less wise. That means you can’t expect the wisdom of the crowds to be a poultice for the ignorance of individuals.
There are plenty of examples of this. Take how much time we spend on terrorism versus how dangerous it actually is, our beliefs about foreigners (like that they commit more crimes), or our war on drugs, to name but a few.
That said, there is still hope here. Society as a whole can change its mind. It can even learn new practices and ideas to help compensate for our inner demons.
For societies build up cultural antibodies.
What’s a cultural antibody? Well, in short, it’s something that helps us deal with a societal problem. There are plenty of examples, like how society is fighting the obesity epidemic. Another good one is how people are becoming ever more skeptical of advertising. Even the current rejection of authority can be described in this way.
But I’m going to turn to another example: smartphone use.
Ever more often there are unwritten rules telling us where we should and should not use phones. We shouldn’t use them at the dinner table. We should avoid them in the cinema. It’s rude to look at your phone while somebody is talking to you.
And when you do so people give you dirty looks. Friends tell you off. Places have even sprung up that ban screens. In other words, while smartphones are addictive, society as a whole is developing antibodies to resist that addiction.
What’s more, though these rules have been around for a while, their influence is accelerating. This is because more people feel this way, which in turn increases the social pressure (recognize that from somewhere?).
In this way, society learns to overcome its individuals’ baser impulses.
Will this help us deal with all our problems? Of course not. Some problems can’t be overcome through social control alone. Other problems – like global warming – are too immediate for us to be able to wait for cultural antibodies to kick in.
But at least it gives us hope. And that’s something when there is this much teeth gnashing going on.