Is fake news creeping into your belief system?

If you’re like most people, you make thousands of decisions every day.

Sure, most of these decisions are subconscious and inconsequential – will you put on your left shoe first, or your right?

Of course, many decisions carry real consequences. Of all the products and services on the market, which one will you buy? Which relationships get your focused attention? Of all the priorities in your life, where do you invest the most time and energy? And when it comes to your core philosophy – your views on faith, social and political issues – how do you navigate the sometimes bewildering maze of conflicting perspectives?

This voting season, we are reminded that such navigation is often complicated by political candidates and pontificators who insist on playing fast and loose with the facts.

This conundrum is the subject of a thoughtful article in the MIT Sloan Management Review entitled “The Cognitive Shortcut That Clouds Decision-Making.” It is based on research by an international team of scientists who specialize in how people reach conclusions – conclusions that shape how they live, and enjoy, their lives.

One of these researchers is Dr. Nadia M. Brashier, a Purdue University professor who focuses on how people come to believe things that aren’t true, from fake news to common superstitions.

Rodger Dean Duncan: What is it about repeated misinformation that allows it to appear to be true – even months later?

Nadia Brashier: As marketers seem to realize, information feels easier to process every time we come across it. We misinterpret that feeling of ease as evidence of truth. Even a single prior exposure can have an effect, and perceived accuracy continues to increase with additional exposures. This is a powerful illusion: it occurs among intelligent people, persists despite contradictions from trusted sources, and happens even when we “know better.” Repeating a claim like “China has the largest economy in the world” might convince you, even though you know that the United States, not China, has the largest GDP.

In a recent study, I found that even paying participants for correct answers doesn’t stop them from relying on the feeling of ease.

Duncan: In explaining how the mind takes shortcuts, you use the term “processing fluency.” Tell us how this works.

Brashier: Our brains are sensitive to how easily we process information. Repetition creates a feeling of fluency that inflates all kinds of judgments. In the so-called false fame effect, merely having seen non-famous names before leads people to mistakenly assume that those names belong to celebrities.

Another example: if you keep encountering a product, it becomes fluent, and people like it more as a result. Likewise, any strategy that makes a claim easy to process, whether by repeating it or presenting it in a high-contrast typeface, also makes it seem more true.

Duncan: We live in an age where one person’s “fake news” is another person’s infallible truth. What causes (or enables) people to be so adamant about their beliefs, even in the face of conflicting data?

Brashier: People sometimes rely on misinformation even after receiving explicit corrections. This continued influence can occur because people initially fail to accept the new information. But another key issue is that our brains don’t “overwrite” false beliefs, even when we accept a correction. We store both the original myth and its correction, and the correction often fades from memory first. So it might help at first to give people a detailed explanation of why antibiotics don’t cure COVID – but just a few weeks later those same people may be asking their doctor for a prescription.

Duncan: You say avoiding the “bias blind spot” is a good way to guard against the “illusory truth effect.” Please give us an example of this.

Brashier: The bias blind spot refers to the fact that we recognize biases in others more easily than in ourselves. For example, people might assume that colleagues work hard for external incentives such as money, while claiming that they personally are motivated by internal incentives such as pride in a job well done. Almost everyone believes they are “less biased than average.” This can have negative consequences in the workplace, as people with a large bias blind spot are less likely to take useful advice.

Duncan: An “accuracy mindset,” you say, can help short-circuit the illusory truth effect. How does this work?

Brashier: People don’t always spontaneously consider whether incoming information is true or false. They may instead focus on how interesting or entertaining a claim is. Fortunately, we can draw people’s attention to accuracy. Asking people to “behave like fact-checkers” when they first encounter a falsehood protects them from illusory truth later. They draw on their own knowledge rather than relying on the ease they feel when reading repeated misinformation.

Duncan: Some people live in an echo chamber where they are only exposed to views they already hold. What can leaders do to encourage good cross-fertilization of perspectives among team members?

Brashier: Diverse teams generate a greater number of original and useful ideas, but this brainstorming advantage doesn’t always translate to execution.

We all assume that our own beliefs and attitudes are typical, so we may see consensus where none exists. For example, hearing a colleague repeatedly suggest a competitive advertising campaign can give the false impression that everyone agrees with the strategy.

In addition, engaging with opposing viewpoints can unfortunately increase polarization by emphasizing the differences between groups. Strategies for minimizing this bias include ensuring that teammates have equal opportunities to speak and asking employees to take one another’s perspectives.

Duncan: Is illusory truth always a bad thing?

Brashier: We learn from experience that there is a link between repetition and truth. On average, we encounter the true version of a statement more often than any of its many possible distortions. For example, you are more likely to hear the correct claim “Jeff Bezos is the founder of Amazon” than the false claims that Elon Musk, Bill Gates or Mark Zuckerberg founded the company.

Relying on fluency sometimes misleads us, but in general it’s an adaptive rule of thumb. In addition, we can use this cognitive shortcut to make true information feel fluent – repeating verified facts about the coronavirus, for example, increases belief in them.
