What happens when what you know turns out to be wrong?
From the August 2015 issue of The Rotarian
One evening, sitting in the back seat of the car, our two girls, ages six and eight, were discussing the show we were on our way to attend. Called The Illusionists, it featured seven of the world's top magicians. The debate was over whether there would be real magic involved, or just tricks.
"When they cut the man in half," our younger daughter asked, "how do they keep the blood in?" She was convinced there was true magic. Her older sister, a little wiser, wasn't buying it.
"Easy," she said. "R-o-b-o-t." She rolled her eyes at how obvious this was.
During the show, sure enough, we came to the part where a man – standing up, no less – was sawed in half. His torso fell onto a table, while his legs walked offstage. His top half was wheeled around before us, perfectly animate, perfectly alive.
It was clearly not a robot. Yet what it was, none of us could imagine. And even if we could have found out how it worked, I'd almost rather not. Because in a sense, both girls were right: There was real magic and there were tricks. The magic is in wondering how you were tricked. That's why we go to see performances like the Illusionists'.
Humans are not hard to deceive. If we were, most political careers would be much shorter. Our gullibility has even played an important role in our own evolution. In his book Sapiens: A Brief History of Humankind, historian Yuval Noah Harari argues that the agricultural revolution was a trick played on humans – by plants. Wheat, rice, and potatoes, with all their delicious flavors, enticed us to spread their DNA around the planet by trading our hunter-gatherer lifestyle for a much tougher farming one. Wheat went from being a wild grass in Turkey to covering an area of the earth 10 times the size of Great Britain. We like to tell ourselves that we conquered nature with agriculture. But, Harari says, the opposite is true: "These plants domesticated Homo sapiens, rather than vice versa."
Deception may not always be a bad thing: There's some evidence, for example, that depression is the result not of a distorted view of reality but of an all-too-clear one – a phenomenon known as "depressive realism." Do hope and optimism actually require a certain amount of self-deception? Psychologist Joanna Starek found that athletes who scored higher on measures of self-deception also performed better, suggesting that sometimes you may have to trick yourself into believing you can do something before you can do it. And if what you're trying to do has never been done before, you must first deceive yourself into believing it can be done.
There is, of course, a dark side to self-deception: Our beliefs and biases blind us to things we need to see. As the philosopher Friedrich Nietzsche once observed, "Convictions are more dangerous enemies of truth than lies."
Political scientists Brendan Nyhan and Jason Reifler conducted a series of experiments that show how true this is. We all have biases and ideologies, and those can lead us to become badly misinformed, even about basic facts. In various studies, Nyhan and Reifler gave people empirically correct information in the form of a news article meant to correct a misperception – that Barack Obama is Muslim, that Mitt Romney shipped jobs overseas, that cutting taxes increases revenue to the treasury, that the measles-mumps-rubella vaccine causes autism. All these statements are false, so correcting them should be a simple matter of presenting people with the correct information. But as with the Illusionists, things are not quite as they seem. What Nyhan and Reifler found was that presenting the correct information often had the opposite effect: It caused people to believe the wrong information even more strongly.
They call this the "backfire effect," and the reasons behind it are complicated. One is the echo chambers people create for themselves in the friends they associate with and the media they follow. Another has to do with "motivated reasoning."
"In any given situation," Nyhan says, "you have some level of motivation to determine the correct answer. That's called accuracy motivation. And you also have some motivation to find the answer that you'd like to be true."
We each have an ideology – an idea about how the world works – and we want information to fit into that scheme. When we're presented with facts that don't fit, or worse, that contradict our beliefs, we often choose to dismiss them rather than to reconsider our assumptions.
After all, who has the time and energy to come up with a new ideology every time we're wrong about something? There is a particular kind of panic that sets in when our belief system begins to crumble. It's easy to have a clear system that tells you what is right and what is wrong. It is much harder to ask, at every juncture, whether you could be wrong.
More recently, Nyhan and Reifler have begun researching ways to counter the backfire effect. One strategy that seems to work is to present correct information in graphic form, rather than as text. Another is what they call "affirmation" – having subjects write an essay about a value that has been important to them, or a time when they felt really good about themselves.
The latter approach increases the rate of correction among some (though not all) subjects. The underlying idea is that reaffirming our values and self-worth reduces the "identity threat" we feel from information that runs counter to our beliefs, leaving us more open to considering it.
There is no easy answer to this dilemma, no magic formula for how much clarity and how much cloudiness we should expect in life. But none of us will ever see the world with perfect accuracy all the time – nor would we want to. In a sense, each of us is the illusionist in our own show. The secret is to know when to be brave and pull back the veil on our own tricks, and when to leave it drawn and simply wonder at the magic.