The Cognitive Shortcut That Clouds Decision-Making

Merely repeating false claims increases their believability, leaving business leaders vulnerable to basing decisions on misinformation. Here are four strategies to prevent this.


Meetings are as effective over Zoom as they are face-to-face. A four-day workweek makes employees more productive. Few complaints mean customers are happy. Innovation requires disruption.

Business leaders regularly confront these and similar claims. But what makes people believe that they are true? And, more critically, how do such claims affect strategic decisions?

We live in a time of unprecedented access to information that’s available anytime and anywhere. Even when we don’t actively seek out opinions, reviews, and social media posts, we are constantly subjected to them. Simply processing all of this information is difficult enough, but there’s another, more serious problem: Not all of it is accurate, and some is outright false. Even more worrying, when such information is repeated, an illusion of truth takes hold: People believe repeated information to be true — even when it is not.

Misinformation and disinformation are hardly new. They arguably have been impairing decision-making for as long as organizations have existed. However, managers today contend with incorrect and unreliable information at an unparalleled scale. The problem is particularly acute for Big Tech companies like Facebook, Google, and Twitter because of the broad societal effects of misinformation on the platforms. A recent study of the most-viewed YouTube videos on the COVID-19 pandemic found that 19 out of 69 contained nonfactual information, and the videos that included misinformation had been viewed more than 62 million times.1

In the world of corporate decision-making, misinformation proliferates in many forms that hurt organizations, including public-relations spin, fake reviews, employee “bullshitting,” and rumormongering among current and prospective employees. Executives can find themselves on the receiving end of falsified data, facts, and figures — information too flawed to base critical decisions upon. Misinformation, regardless of whether it was mistakenly passed along or shared with ill intent, obstructs good decision-making.

Everyone in an organization, from the CEO to front-line employees, consistently faces the challenge of deciding whether a piece of information is true. This is not always an easy task — and it gets complicated by a strikingly banal but powerful bias in how we make sense of information. It’s a glitch of the human mind: We tend to perceive repeated information as more believable than information we hear for the first time, regardless of whether it is in fact true. Our judgments about the truth of a statement are shaped not only by its actual content but also by our tendency to believe repeated information more than novel information.

Why Does Repeated Misinformation Ring True?

In 1939, U.S. President Franklin D. Roosevelt famously warned, “Repetition does not transform a lie into a truth.” Unfortunately, decades of research have shown that repeating false information can create at least an illusion of truth. Psychologists refer to this phenomenon as the repetition-based truth effect or, in short, the illusory truth effect.2 Information we hear again and again acquires a “stickiness” that affects our judgment, and the effects are neither fleeting nor shallow. The illusory truth effect is in fact extremely robust, and its power has been shown repeatedly across a range of domains, from political speeches to product claims. One unsettling finding is that people fall prey to the effect not only when the original information is still fresh in their memories but also months after exposure. In addition, people judge repeated information as truer even when they cannot recall having previously encountered the information.

Repeated information can be persuasive even when we “know better” — when false claims contradict well-known facts or come from an untrustworthy source. For example, a manager might know that an employee is a notorious gossip but could still be influenced by a rumor the employee spreads. This is because over time, content (the rumor) often becomes disconnected in memory from its source (the untrustworthy employee). It’s the well-known feeling of being sure that you have heard or read a piece of information before but being unable to recall where it came from. Some of our own research suggests that the illusory truth effect even persists when people are offered monetary incentives to make accurate judgments, such as when rewards are offered to employees. Enlisting a trusted expert to counter false information doesn’t help either; studies show that people believe repeated misinformation even when a reliable source argues that the information is incorrect.

While important organizational actions are typically based on a rigorous assessment of the available facts, the illusory truth effect can still influence people when they are gathering information and discussing the decision. For example, a team member might repeatedly but incorrectly argue that moving production to another country won’t hurt the company’s image. This alone might not be the force that drives the decision, but it could be one of the many pieces of information that contributes to the choice. Not every fact can be verified, especially in situations with some degree of uncertainty.

Why do people tend to believe repeated information more than new information? The following scenario shows how this commonly happens.

  • A manager looking to hire a new member of the sales team has short-listed two candidates with comparable profiles, Jane and Susan, and interviews are scheduled for Friday.
  • On Monday, a team member remarks that Jane is very knowledgeable about the company’s product lines. This is new information, and the manager’s mind makes a connection between the candidate “Jane” and the concept “knowledgeable.”
  • On Wednesday, the same team member again mentions that Jane is knowledgeable about the company’s products, reinforcing the existing connection between “Jane” and “knowledgeable.”
  • On Friday, both Jane and Susan say they know a lot about the company’s products. Because the information about Susan is new and the information about Jane is not, the connection between “Susan” and the concept “knowledgeable” is not as strong as the connection between “Jane” and “knowledgeable.” And because the repeated information about Jane feels more familiar than the new information about Susan, the manager processes it more easily. This ease with which we digest repeated information is termed processing fluency, and the mind uses that ease as evidence of truth. The manager is more likely to hire Jane.
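
To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python. The names, claims, and counting rule are hypothetical simplifications for this article’s scenario, not a model drawn from the research: each repetition strengthens an association in memory, and the resulting familiarity is then misread as evidence of truth.

```python
from collections import defaultdict

# Toy illustration only: treat each mention of a claim as strengthening an
# association in memory, and treat greater familiarity as (unwarranted)
# evidence of truth. Names and counts are hypothetical.

familiarity = defaultdict(int)  # claim -> number of prior exposures

def hear(claim: str) -> None:
    """Each repetition strengthens the association, true or not."""
    familiarity[claim] += 1

def perceived_credibility(claim: str) -> int:
    """Fluency heuristic: the more familiar a claim feels, the more believable it seems."""
    return familiarity[claim]

# Monday and Wednesday: the same remark about Jane is repeated.
hear("Jane is knowledgeable about our products")
hear("Jane is knowledgeable about our products")

# Friday: both candidates make the same claim in their interviews.
hear("Jane is knowledgeable about our products")
hear("Susan is knowledgeable about our products")

# Jane's claim now "feels" more credible purely through repetition.
print(perceived_credibility("Jane is knowledgeable about our products"))   # 3
print(perceived_credibility("Susan is knowledgeable about our products"))  # 1
```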

Given the capabilities of the human mind, it seems remarkable that people so easily accept information simply because they have been repeatedly exposed to it. Yet consider: How do most people know that Tim Cook is the CEO of Apple? Or that bitcoin is volatile? It’s because they have encountered this information multiple times in the past. Repetition is central to how people learn and acquire knowledge, so it makes sense to treat information encountered repeatedly as more credible. Experience teaches us that most of the information we are exposed to every day is probably factually correct, especially if we encounter it more than once. Accepting repeated information as true helps us navigate an increasingly complex information landscape. The mind has learned a functional shortcut that uses processing fluency as a sign that information is valid and accurate.

Because this shortcut is so efficient, we tend to overrely on it. But in today’s complex and uncertain information ecosystem, ease of processing and repetition are not reliable guides to what is true. Leaders must be able to distinguish facts from falsehoods and, critically, must guard themselves, their teams, and their organizations against being misled, whether intentionally or not.

Four Strategies to Combat the Illusory Truth Effect

The illusory truth effect occurs effortlessly, but effort is necessary to combat it. While its negative effects can never be fully avoided, its influence can be limited through diligence and a focus on accuracy. This is especially critical for managerial decision-making, where believing inaccurate or false information can lead to biased hiring, firing, and promotion decisions; missed opportunities for growth; or flawed choices when venturing into new markets.

How can managers guard against these negative consequences? We’ve identified four strategies that can improve the likelihood that leaders and their teams will base their decisions on credible information.

Strategy 1: Avoid the bias blind spot.

Recent research shows that people high in intelligence or with superior analytical skills are just as prone to the illusory truth effect as everyone else.3 So the first thing managers can do is accept that they aren’t immune.

Too many decision makers fall prey to the bias blind spot, a well-documented psychological phenomenon where people believe that biases cloud other people’s actions but not their own.4 One of the authors of this article regularly teaches courses on leadership and decision biases to MBA students and experienced managers, and every year several attendees openly claim that they would never fall into such decision traps. However, when put to the test in a simulation designed to illustrate biases and their impact on critical leadership decisions, the same individuals exhibit the same biases to which they claimed to be immune.

Nobel laureate Daniel Kahneman has said that if he had a magic wand to remove a single judgmental bias, he would eliminate overconfidence, since it is the one that makes people particularly resistant to the idea that they will fall prey to other biases. Understanding the illusory truth effect, and accepting that you are as vulnerable to it as anyone else, is a productive first step toward minimizing its dangers.

Strategy 2: Avoid epistemic bubbles.

Strategic decisions often depend on information shared by team members and other key stakeholders. It is of vital importance that this network does not constitute an epistemic bubble, where members encounter only similar opinions and don’t consider alternative points of view. Studies show that teams reflecting diverse perspectives outperform more homogeneous groups even when the latter have members with higher individual abilities on average.5

Consider this: The director of human resources hears from a team member that research shows that employees are more productive with a four-day workweek, and a week later, another member of the team says the same thing. Even though the decision to adopt a shorter workweek will require extensive analysis, research, and strategic meetings, the notion that such a change would improve productivity has been lodged in the director’s head and will become one part of their calculation in making the decision.

The HR director needs to ask whether the sources of the information are truly independent and not merely repeating each other’s arguments. It is well established that team members spend too much time discussing information that most of the team already holds instead of surfacing novel information held by only a few. This has negative consequences when the novel information is critical for reaching an optimal solution.6

To avoid epistemic bubbles, managers should foster an environment in which opposing and differing perspectives can be generated and where they are actively and openly discussed. Is the team positioned to critically consider opposing ideas and counterproposals? When someone voices an opinion, would others dare to speak up if they disagreed? It’s worth remembering that the repetition of an argument by several like-minded people might make it psychologically more convincing but not necessarily more accurate or valid.

Strategy 3: Question facts and assumptions.

Much of the time, we accept new information that we would reject if only we thought a little harder. An accuracy mindset, in which the default is to evaluate whether new information fits with what one already knows, can help short-circuit the illusory truth effect and promote a culture of questioning the truthfulness of information as it arises.

This strategy might sound straightforward, but in fact few people routinely subject the information they are presented with to critical evaluation — and many will share information without considering whether it is actually true. Developing an internal fact-checking capability can be a powerful antidote to the illusory truth effect. Recent research has shown that when someone is presented with new information, simply asking them whether it is true makes it less likely that the person will believe the false information when it is repeated.7

Managers can develop an accuracy mindset through a few simple practices. One is to explicitly inform their teams about the prevalence of the illusory truth effect and to highlight the importance of approaching new information critically. They should also encourage their team to always ask “Is this true?” or “Does this fit with what I know?” when faced with information, suggestions, and opinions. Further, managers can ask team members to justify their decisions and to provide details about the information on which they’re based.

While this practice is vital, there are many instances when teams do not have sufficient expertise about a topic and internal fact-checking isn’t enough. Managers also need to promote external fact-checking to ensure that information used in decision-making has been verified by a trustworthy source.

To foster external fact-checking, managers should ask whether there are objective data, facts, and figures to support the information under consideration, and whether there is information that contradicts it. It’s also important to question whether sources are reliable. Do they have hidden motivations underlying the information they provide? How has the information been gathered, and is this a satisfactory methodology?

While external fact-checking is a highly effective strategy, it takes time and can delay decisions. It’s also impractical to verify every piece of information. Managers should weigh the potential consequences of acting on false information against the costs incurred by validating it. When the stakes are high, it pays to go to greater lengths in checking facts.
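
One rough way to frame that trade-off, offered here as a back-of-the-envelope heuristic rather than a rule from the research cited above, is to verify a claim whenever the expected loss from acting on it if it is false exceeds the cost of checking it:

\[
\text{Verify before acting when}\qquad p_{\text{false}} \times D \;>\; C
\]

where \(p_{\text{false}}\) is the estimated chance the claim is false, \(D\) is the potential damage if a decision rests on it, and \(C\) is the cost of verification. All three are rough managerial estimates, but even a crude comparison makes clear that high-stakes decisions justify far more checking than routine ones.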

Strategy 4: Nudge the truth.

Because information gains credibility with repetition, managers can nudge the truth by repeating true and relevant information.8 This tactic is especially relevant given that the ongoing COVID-19 pandemic has contributed to a high degree of uncertainty around business conditions and workplace policies. This has created an environment with a greater risk of repeated misinformation and disinformation.

To counter this, managers should be prepared to respond with facts and repetition, repetition, repetition. Decades of scientific research within psychology, rhetoric, and philosophy have shown the value of repeating an argument. It is especially powerful to restate claims verbatim rather than using different phrasing.

This strategy does hold a risk, though: The manager might be inadvertently repeating incorrect information. Before reinforcing a message, it is critical to evaluate the information being conveyed. Be aware of bias blind spots, maintain an accuracy mindset, perform external fact-checking, and don’t get stuck in an epistemic bubble.

In the digital age, information is power; it gives managers a competitive edge. Yet inaccurate or false information, if repeated often enough, can acquire an illusion of truth, placing executive decision-making at risk. One of the most important challenges leaders will face for years to come is preventing inaccurate data, false information, and pseudo-facts from compromising their companies’ integrity and success. The four strategies outlined above will help them meet that challenge and create lasting value for their companies and clients.


References

1. H.O. Li, A. Bailey, D. Huynh, et al., “YouTube as a Source of Information on COVID-19: A Pandemic of Misinformation?” BMJ Global Health 5, no. 5 (May 2020): 1-6.

2. C. Unkelbach, A. Koch, R.R. Silva, et al., “Truth by Repetition: Explanations and Implications,” Current Directions in Psychological Science 28, no. 3 (June 2019): 247-253.

3. J. De keersmaecker, D. Dunning, G. Pennycook, et al., “Investigating the Robustness of the Illusory Truth Effect Across Individual Differences in Cognitive Ability, Need for Cognitive Closure, and Cognitive Style,” Personality and Social Psychology Bulletin 46, no. 2 (February 2020): 204-215.

4. E. Pronin, D.Y. Lin, and L. Ross, “The Bias Blind Spot: Perceptions of Bias in Self Versus Others,” Personality and Social Psychology Bulletin 28, no. 3 (March 2002): 369-381.

5. L. Hong and S.E. Page, “Groups of Diverse Problem Solvers Can Outperform Groups of High-Ability Problem Solvers,” Proceedings of the National Academy of Sciences 101, no. 46 (Nov. 16, 2004): 16385-16389.

6. L. Lu, Y.C. Yuan, and P.L. McLeod, “Twenty-Five Years of Hidden Profiles in Group Decision Making: A Meta-Analysis,” Personality and Social Psychology Review 16, no. 1 (February 2012): 54-75.

7. N.M. Brashier, E.D. Eliseev, and E.J. Marsh, “An Initial Accuracy Focus Prevents Illusory Truth,” Cognition 194 (January 2020): 1-6.

8. C. Unkelbach and F. Speckmann, “Mere Repetition Increases Belief in Factually True COVID-19-Related Information,” Journal of Applied Research in Memory and Cognition 10, no. 2 (June 2021): 241-247.

