Why optimism bias is good for you

Editor’s Note: Tali Sharot is the author of the new book, “The Influential Mind: What the Brain Reveals About Our Power to Change Others.” An associate professor of cognitive neuroscience, she is the director of the Affective Brain Lab at University College London. The opinions expressed in this commentary are hers.

Story highlights

Tali Sharot: To many of us who study the human mind, the diminishing influence of evidence is less a puzzle than an example of how the mind forms beliefs

We should take our biases into account and use them when trying to convey our truth, she writes

CNN  — 

At a time when polarization is at its peak and truth is a four-letter word, the question on everyone’s minds is: Why does evidence seem to have little influence on people’s beliefs, and what can be done?

How is it that citing clear evidence of a human role in global warming fails to persuade climate skeptics to change their minds? Why do people spend money on so-called bio-frequency healing stickers, despite the absence of any scientific basis for their effectiveness? And how do individuals on opposite sides of partisan divides reach very different estimates of the number of people who attended the 2017 presidential inauguration, despite photographic documentation of the event?

To many of us who study the human mind, the diminishing influence of evidence is less a puzzle than a prototypical example of how the mind forms beliefs. And the very idea that simply providing people with data is sufficient to alter their beliefs is doomed to fail.

The very first thing we need to realize is that beliefs are like fast cars, designer shoes, chocolate cupcakes and exotic holidays: they affect our well-being and happiness. So just as we aspire to fill our fridge with fresh fare and our wardrobe with nice attire, we try to fill our minds with information that makes us feel strong and right, and to avoid information that makes us confused or insecure.

In the words of Harper Lee, “people generally see what they look for and hear what they listen for.”

It’s not only in the domain of politics that people cherry-pick news; it is apparent when it comes to our health, wealth and relationships. Many individuals avoid medical screenings in an attempt to evade alarming information. And a survey conducted in 2009 found that people were more likely to check their investment accounts when they suspected their balance had risen than when they thought it had dropped.

My colleague Caroline Charpentier and I found in our research that people are even willing to pay to remain ignorant. In one study, we gave volunteers an opportunity to invest in a simulated stock market. We later asked them whether they wanted to find out how well their investment was faring or to remain oblivious. Whichever they chose – knowledge or ignorance – they had to pay for us to comply with their request. When volunteers feared they were losing money, they were more likely to pay so that we would not reveal their balance to them.

This may seem puzzling. But with non-invasive brain imaging techniques, my colleagues and I have recently gathered evidence that suggests our brain reacts to desirable information as it does to rewarding stimuli like food, and reacts to undesirable information as it does to aversive stimuli like electric shocks. So, just as we are motivated to seek food and avoid shocks, we are also motivated either to seek or avoid incoming information.

Of course, we do not always turn away from uncomfortable data. We do undergo medical tests, face our debts and occasionally read columns written by people who hold different political views than ours. But on average we are more likely to seek confirmation of what we believe (or want to believe).

Unfortunately, the solution is not as simple as providing people with full and accurate information. When you provide someone with new data, they quickly accept evidence that confirms their preconceived notions and assess counterevidence with a critical eye.

For example, my colleagues and I also conducted a study in which we presented information to people who believed that climate change is man-made as well as to people who were skeptical. We found that both groups strengthened their pre-existing beliefs when the new data confirmed their original position, but ignored data that challenged their views.

Such effects are examples of confirmation bias. The bias itself is not new. But today, when information is more readily accessible and people are frequently exposed to differing opinions and data points, it is likely to play an even greater role in shaping people’s beliefs – moving ideological groups to extremes.

And while you may assume such biases are a trait of the less intelligent, the opposite is true. Scientists discovered that those with stronger quantitative abilities are more likely to twist data at will. When volunteers in that study were given data about the effectiveness of gun control laws that did not support their views, they used their math skills not to draw more accurate conclusions, but to find fault with the numbers they were given.

Why have human beings’ brains evolved to discard perfectly valid information when it does not fit their preferred view? This seems like bad engineering, so why hasn’t this glitch been corrected?

Cognitive scientists have proposed an intriguing answer: our brain assesses new information in light of the knowledge it has already stored, because in most cases that is, in fact, the optimal approach. More likely than not, when you encounter a piece of data that contradicts what you believe with confidence, that piece of data is in fact wrong. For example, if I told you I had observed a pink elephant flying in the sky, you would assume I was either lying or delusional, as you should. It is a reasonable strategy, but it also means that confidently held opinions are difficult to change.

They are even more difficult to change once people act on them. Research has shown that immediately after making an overt choice (think voting for Donald Trump or Hillary Clinton), our conviction strengthens as we tend to rationalize our choices to ourselves and others.

So while data is important for uncovering the truth, it is not enough to convince people of that truth.

We should not, however, be discouraged. The solution, I believe, is not to fight the way our brain works, but to go along with it. We should take our biases into account and use them when trying to convey our truth.

How exactly can we do that?

This is the first in a two-part op-ed by Tali Sharot. You can read part two here.