How Assumptions of Consensus Undermine Decision Making

In the early 1990s, a Fortune 100 company contemplated making a sizable investment to manufacture and distribute a core product in Asia. Although the project’s champion knew little about Asia, he was convinced he could succeed there just as he had in the United States. In making his judgment, he overlooked financial, operational and strategic information that contradicted his views. Senior executives, relying on the company’s U.S. experience, gave the go-ahead. After the resulting debacle and much soul-searching, managers realized that they had let themselves be misled by their untested assumptions.

Such problems are not new, but in today’s world, they can be fatal. Rapid advances in information and production technologies have combined with global expansion and competition to create a business environment in which change is the norm.1

There’s nothing wrong with change. Classic management texts insist change is necessary for business survival and exhort executives to abandon their organizational isolationism — and their naive belief in environmental stability and homogenous, conflict-free workplaces.2 In dynamic internal and external business environments, leaders must be able to interpret cues and make decisions.3 But decision making is increasingly complex and success uncertain. Smart choices are often incompatible with existing knowledge and past experience, so managers may feel they are traveling without guideposts.4

Decision making is an art and a science, with no simple rules. To determine whether a manager can handle an expatriate assignment, for example, a decision maker might need to use intuitive assessment in addition to analytic tools and research. Not surprisingly, increasing numbers of companies invest in programs to help managers improve intuitive judgment.

Although intuitive judgment has benefits, mounting evidence suggests that it often runs contrary to rational thinking, with managers’ confidence in their judgments and predictions far exceeding objective accuracy rates.5 Also, objectively irrelevant factors may influence choices. For example, some research shows that policy decisions based on numbers of jobs saved are often different from decisions based on numbers of jobs lost. Other research demonstrates that members of negotiating teams believe they have more-powerful bargaining positions than do solo counterparts, even when the only difference is the number of negotiators at the bargaining table.6 Most important, people who are unaware of the problems with intuitive judgment fail to compensate for it in their decision making.7

A key culprit in undermining intuitive judgment is social projection, also known as the false-consensus effect. Projection is the misperception of the commonness of one’s own beliefs, values, abilities and behaviors — usually in the direction of overestimating how common they are. Our analysis and field research have highlighted how projection has affected industries as diverse as consumer products, petroleum, manufacturing and professional services.

Tripping on the Path to Globalization

The Fortune 100 manufacturing company that blundered in a promising Asian market illustrates projection’s insidious effects. The project champion, a promising executive with significant domestic experience, debated the opportunity with staff, external consultants and investment bankers. Despite detailed analysis, the “go/no-go” decision was not clear-cut. Attractive financing was available; however, market data looked only marginally favorable, and political and cultural factors were unknowns. If anything, the bulk of the evidence suggested aborting the initiative. Experts questioned the size of the investment, the aggressiveness of the assumptions and the lack of familiarity with consumers and the labor market in Asia.

However, the project champion paid cursory attention to expert views not consistent with his thinking. He was sure that others really shared his beliefs and were merely being cautious. Ultimately, he advanced the project for approval because he believed “in his heart” that, with their good domestic record, the product and the management approach would work in Asia. Others’ data-based objections faded; a consensus emerged that the project champion would be successful because he had been before. He got the go-ahead.

Asia proved different from the United States, however. Political and labor problems doubled the time required for construction of the manufacturing facilities. Inadequate infrastructure and external parties’ unwillingness to fulfill agreements thwarted distribution. Facilities finally were constructed, but productivity was below expectation. North American control systems created friction with employees, and Asian consumers did not take to the product as readily as anticipated.

In subsequent candid discussions, the project champion revealed that he knew the business case had been weak. But he argued, “We had a great product and a proven track record domestically. It seemed to me that all we needed to do was apply a little elbow grease.” He had made an intuitive judgment and assumed that his own beliefs, values and experiences were shared by others — projection at work.

He was not the only one in error. The members of the planning and investment committees also projected their values and experiences onto the Asian business environment. They selected the project champion to manage the venture because of his U.S. track record — even though they sensed that different managerial skills and values might be necessary. Unfortunately, problem-solving skills and individualistic values that worked in the United States proved an inadequate basis for selecting a manager who could succeed in Asia.

Understanding projection and its impact on decision making can not only explain inconsistencies and irrational choices but also prevent such problems from occurring. When a psychological process operates automatically and outside managers’ awareness, a key step in combating its negative consequences is to develop insight.

When Assumptions Are Dangerous

Projection is a normal tendency. As human beings, we often misperceive the commonness of our beliefs, values and behavior. Most often, we overestimate the proportion of others who share our beliefs. So we come to see our own values and choices as relatively typical and appropriate — and we view alternative responses as unusual and even deviant.8 Because of our bias, we assume that the beliefs and behavior of others will be like ours. In organizations, such assumptions create barriers to successful globalization and change. (See “The Problems Projection Creates.”)

In one of the earliest demonstrations of projection, students’ estimates of the prevalence of cheating on exams differed significantly depending on whether or not they had first admitted to cheating. Both those who cheated and those who didn’t overestimated the commonness of their particular tendency.9 The same phenomenon occurs in managerial contexts. In a study of mock employment interviews, one of the authors found that recruiters’ beliefs about a candidate’s willingness to answer illegal interview questions correlated strongly with the recruiters’ own willingness to do so.10 Recruiters who would say no to requests for polygraph tests or to marital-status disclosures (which employers are proscribed from demanding) expected higher rates of refusal from others. More important, they made potentially inappropriate inferences about candidates who had an opposing view. A recruiter who would agree to take a polygraph test assumed that most candidates would also — and that a noncompliant candidate had something to hide. Recruiters who reported they would be willing to disclose their family status considered uncooperative a female candidate who refused to do the same.

What psychological processes underlie the perception that one’s own beliefs are common to most people? Researchers have identified five possibilities: surrounding oneself with similar others (selective exposure and cognitive availability); attending to one’s own views the most (salience of point of view); believing that one’s own behavior is based on the situation and that others’ behavior is based on their nature (causal attributions); filling in the gaps in ambiguous situations (situational construal); and needing to validate one’s own beliefs (motivation). (See “Why Projection Occurs.”)

Surrounding Oneself With Similar Others

The earliest explanation of social projection combined what is known as “selective exposure” and “cognitive availability.”11 Selective exposure refers to people’s tendency to select friends, acquaintances, colleagues and advisers who share their backgrounds, interests, values and outlooks. It is not surprising that when consulting similar others, people hear their own views and conclude that almost everyone shares those views. Selective exposure creates a bias in their understanding of the variety of views others hold.12 Cognitive availability refers to the mental process by which people bring certain information to mind. Rather than conduct an exhaustive search, they search their memories for readily available information, which includes their own views and any data that support those views. Together, selective exposure and cognitive availability lead people, quite unknowingly, to underestimate the diversity of views and to overestimate the commonness of their own views.13

Consider this real-life scenario. A global consulting organization biannually convened a group of senior executives and valued managers for two days of operational meetings. Because individuals were nominated by their regional offices, it was understood that they represented the opinions of a local peer group. They advised the executives on how company policies affected individuals at the senior-consultant and manager levels. As the organization’s employee base diversified throughout the 1990s, a broader array of interests was discussed, including day-care facilities, nondiscriminatory work practices, elder care and general quality-of-life issues. Moreover, the previously all-white, male group began to include women and members of racial minorities.

The first time an African-American participated in the group was at a planning session one of the authors facilitated. To start, each of the managers presented issues of concern. When it was her turn, the new member said, somewhat nervously, that advancement within the firm was particularly difficult for members of minority groups. When she concluded speaking, everyone in the room looked expectantly at the senior executives. In a pleasant manner, the most senior executive said, “Based on my discussions and concern for this specific topic, I am pretty sure that it is not a problem in our firm.” The new manager seemed crestfallen, and the atmosphere in the room was uncomfortable for a moment, but soon the meeting proceeded as usual.

In private conversation with the author, the senior executive explained why he had handled the diversity issue as he did. He had thought deeply about the issue, trying to remember examples of problems in the company. None came to mind. He had consulted with others, and they had confirmed what he believed. He perceived that the topic made the managers in the room uncomfortable. He further explained that he had talked about the issue with his partners (white and male), with whom he felt “comfortable discussing [such] a sensitive topic” and that previous conversations with the managers in the room had given him the impression that they, like himself, were unwilling to discuss it. Because he relied on his memory for evidence and did not test his impressions with people different from himself (such as the managers in the room), he confidently maintained the misperception. In reality, many of the managers in the room did want to discuss the issue and were disappointed by the executive’s response.

In projecting his own beliefs onto others, the senior executive overlooked a potentially important issue, marginalized a promising manager and perhaps decreased the likelihood of hearing from her again. He also inadvertently signaled to the other managers that executives were not sincere in seeking input on certain issues.

The executive responded that way because of his selective exposure to partners who reinforced his viewpoint and led him to conclude that his beliefs were shared by an even broader population (the managers in the room and beyond). Selective exposure made him deaf to contrary voices.14

Attending to One’s Own Views the Most

Paying attention to beliefs that are highly salient — one’s own beliefs, for example — also contributes to projection. Unlike secondhand information, our own knowledge and experiences are rich and vivid. That vividness captures our attention and biases both the information we gather from others and our judgments. Alternative views and options recede. Research shows that when people consider a particular course of action, their perception that others would agree with their conclusion increases, and their perception of the commonness of differing beliefs diminishes.15 As a result, they are likely to disregard less vivid data that is relevant but contrary.16 Other research shows that the more that people think about their beliefs and position, the more pronounced the bias becomes.17

Consider a regional bank we’ll call Stalwart Savings and Loan. Stalwart’s seven senior executives, having completed their annual off-site strategic-planning cycle with one of the authors, were preparing for a board meeting. To the author, the planning meetings had seemed to run smoothly. The principal concern had been a possible acquisition. The CEO had favored it and frequently observed that Stalwart’s chairman expected to fuel growth through acquisition. Some of the executives were initially skeptical, but gradually they were persuaded. The planning process appeared to have united the group.

The CEO was elated that few dissenting voices had been raised. He indicated that in the past each individual had negotiated merely for the good of her or his respective department, and he was glad he didn’t have to meet individually with members to secure support this time.

However, other points of view surfaced. Many senior executives expressed frustration that the CEO had brushed aside any objections to the acquisition. In fact, the chief financial officer was shocked, pointing to a thick binder of strategic and financial analyses indicating the acquisition was not the right way to proceed.

Subsequent meetings with the CEO revealed that he had been privately weighing the pros and cons of the acquisition for several weeks before concluding that it was right. Further, in a discussion prior to the planning session he had discovered that the chairman of the board was an advocate. So the CEO entered the planning process with a clear preference for an acquisition — and an image of how the successful strategy would unfold. He held steadfastly to that vivid personal vision despite others’ concerns. Moreover, he was convinced that his entire executive team supported the acquisition. He had no idea he was relying on his own beliefs, was ignoring contrary ideas and was making a sizable commitment based on an inaccurate perception of his team’s support.

Believing One’s Own Behavior Is Based on the Situation and Others’ Is Not

The process of understanding and attributing causes of behavior is another driver of projection. Specifically, projection is greater when people consider that their own beliefs and behaviors arise from the situation (situational factors) rather than from their disposition (personal factors).18 Personal factors are stable characteristics of an individual, whereas situational factors emphasize the external circumstances in which the individual operates. Attributing their decisions to the realities of a situation, executives assume that most others would respond similarly.

Although individuals maintain that their own behavior is determined by the situation, they usually believe that the behavior of others is based on personality or disposition (actor-observer effect). Thus when two managers disagree, each believes the other’s dogged persistence is attributable to stubbornness but that his or her own persistence is a sensible response that other reasonable managers would likely support.

That thought process is particularly prevalent when the same information leads to different recommendations; it frequently emerges in new cross-functional teams focused on product development or organizational redesign. Such temporary teams generally bring together people from diverse functions who are preoccupied with the effect a decision could have on their department.

When one person disagrees with a recommended course of action because of its perceived effect on his or her department, others often fail to consider that person’s situation, assuming instead that the dissension springs from a contrary disposition. Believing their view is shared by other team members, they become increasingly certain that if the dissenter were rational and working in the organization’s best interests, he or she would not disagree. Causal attribution keeps teams from being effective — a dangerous scenario for critical strategic initiatives with tight deadlines.

Filling in the Gaps in Ambiguous Situations

Some researchers offer what is called the “situational construal” explanation for projection. They point out that most social situations are ambiguous and lead people to fill in the gaps from whatever personal knowledge they have. Such a cognitive process is responsible for multiple interpretations of the same event. People are unaware that they are subject to the process — or that divergent and valid interpretations may be possible.19

In a 1990 report in the Journal of Personality and Social Psychology, Thomas Gilovich showed the powerful effect of such construal on perceptions of consensus. In one study, he asked college students about their preference for 1960s or 1980s music and told them to estimate the percentage of their peers who would make each choice. Students were then asked what specific music they had in mind. Students who expressed a preference for 1960s music provided more appealing examples of 1960s musical groups; those who preferred 1980s music listed more appealing 1980s musical groups. Their estimates about consensus among their peers mirrored their own preferences. Next, their specific interpretations were presented to a second set of students, whose preferences and consensus estimates closely corresponded to those of the original group. In other words, once the situation had been interpreted for the second group, the students’ responses were predictable.

One of the authors observed a supplier-relationship renegotiation that illustrates the point. The Brussels-based European office of a U.S. multinational corporation and its French supplier spent several days discussing costs, product packaging, quality standards, distribution processes and related issues. Throughout the negotiation, members of the Brussels team consistently referred to “our special relationship” when discussing possible pricing arrangements and future research-and-development efforts. The author decided to interview each side separately to understand the precise meaning of “special relationship.”

When asked to define the term, the Brussels-based team leader smiled brightly. He explained that he was using the phrase to introduce the proposition that the companies create a single-supplier relationship. He detailed the presumed benefits to both companies, noting that the French company ultimately would be required to open its books. He also expressed pleasure that the French team seemed willing to entertain such a relationship. “Whereas I expected initial hesitation,” the team leader stated, “no opposition appeared to exist.”

The French had a different understanding of “special relationship.” At first perplexed, they finally decided it meant that the Brussels team valued the long-term relationship and the quality of the French product. The French team expected to continue with its favorable arrangement — one that enabled it to pursue other partnerships.

Each group relied on its own construal of the vague phrase “special relationship” and projected its own understanding onto the other party. Not surprisingly, a misunderstanding ensued. When the Brussels team got specific about being exclusive and opening the books, it saw the French respond coolly and realized something was awry. Each party viewed the other’s reactions as surprising and inappropriate; negotiations foundered and mutual trust suffered.

Needing to Validate One’s Own Beliefs

So far the drivers of projection have all been cognitive. They underscore what people pay attention to and with whom they consult. However, motivational factors, such as the desire for social acceptance or the need to maintain self-esteem, also play a role in projection and may work independently of the cognitive drivers.20 Motivation concerns people’s needs, and many people have an unconscious need to believe that their views are common, acceptable or “normal.” By assuming that their views are more common than they really are, people validate their own beliefs.21

Research shows that overestimating consensus for their positions helps individuals with minority views bolster their self-esteem.22 Those whose identity is threatened by receiving negative feedback comprise another group that overestimates others’ support.23 According to the motivational explanation for projection, newly arrived executives who are unsure of their standing — and executives whose intuitive ability, not their quantitative ability, has led to their rise through the ranks — will overestimate consensus.

Note that in discussing projection, we are careful to avoid saying two things. We are not saying that projection always leads to disaster. In fact, some studies show that when objective information is lacking, people who project their own preferences onto others may be more accurate than those who do not.24 In one study, respondents making predictions about people who were highly similar to themselves used their own preferences as proxies and were more accurate than were those who did not rely on their personal preferences.25 Note also that when people have no belief about how others might respond, they are unlikely to engage in projection.26 In such cases, people understand the inappropriateness of using their own responses as proxies for someone else’s.

So people do not always indulge in projection. Sometimes the opposite occurs, with people perceiving “false uniqueness.” They underestimate the commonness of their own responses. Cases of perceived uniqueness are seen in people who are estimating abilities, rather than estimating beliefs or behaviors. For example, the same people who show a uniqueness bias regarding their abilities show a consensus bias regarding their beliefs.27 Not only do people underestimate the commonness of their abilities, but their bias about the uniqueness of a particular attribute increases as the attribute’s importance to them increases.

Overcoming the Negative Effects of Projection

Education alone cannot protect executives from the vagaries of projection. Counteracting the bias requires concerted individual and group efforts. Training that explains the nature of the bias, even when combined with decision-making projects immediately followed by feedback, is not enough.28 Training may reduce projection’s effect when a series of repetitive decisions occur in a narrow context, but training has not proved highly effective for decisions that are not routine — precisely the type that executives face. Counteracting the tendency requires developing an awareness of when the bias is likely to occur and the skills to combat it. (See “Solutions to the Problems of Projection.”)

Create and Maintain Individual Awareness of Projection

Executives can benefit from becoming aware of their susceptibility to cognitive bias. Inferential errors are a natural part of being human; it is unwise to assume the susceptibility applies only to others. To create awareness, managers must first accept the fact that most people — themselves included — overestimate the commonness of their beliefs. That realization may help managers be more attentive to the diversity of opinion in their companies. Next, they must accept the fact that projection is difficult to overcome because it is the result of tacit mental processes. Attenuating biased decision processes takes practice.

At company planning sessions, the authors have created awareness of decision biases using relatively simple activities — for example, empirically demonstrating overconfidence and incorrect assumptions about others’ views. An experiential learning activity, such as having people champion an opposing viewpoint, has helped individuals realize they are not immune to projection. Combining lectures with experiential demonstration helps create an environment in which executives hold themselves individually and collectively accountable for maintaining an awareness of how the projection of their own beliefs onto others might affect the decisions they make.

With training, executives can learn to make statements such as “This is what I think is going on” rather than “We all agree that this is what we have to do.” Getting people to reduce reliance on their own judgments and seek out contrary evidence can improve decision making greatly. That principle applies to individuals’ decisions as well as groups’, but given the deep-seated nature of projection and the speed with which it occurs, it is helpful to have group members hold one another accountable.

Adopt a Different Perspective and Accept That It Is Valid

Adopting another perspective is one of the most effective ways of counteracting projection.29 It attenuates the natural inclination to underestimate the role of situational factors in others’ behavior. People often feel compelled to rebut challenges to their position rather than adopt an alternative perspective and assess it on its merits. Engaging issues receptively from a different perspective often brings to the surface relevant concerns that individuals consumed by their point of view cannot see.

A useful exercise involves adopting an adversary’s position and assuming it is legitimate. Generating a list of supportive arguments for the adversary’s view can help people recognize the bias in their assumptions and the merits of other positions. Once a manager begins to understand the situational arguments in support of another’s viewpoint, it is easier to reach consensus on a course of action.

Encourage Conflict and Disagreement

Interpersonal conflict often creates discomfort for people; however, it can hold the key to overcoming the negative consequences of projection. Executives need to challenge their assumptions and recommendations — and the assumptions and recommendations of others. One easy way is to enhance the diversity of perspectives represented in the company’s inner circle and require that members share their perspectives openly and constructively challenge one another’s views.

That takes considerable social skill in light of globalization and cultural diversity. The manner in which conflict is expressed differs greatly among different groups. In the United States, the process often involves challenging the status quo and envisioning competing choices. In other cultures, critical examination may take on more indirect forms.30 However it is done, creating the conditions for respectful disagreement and negotiation is crucial.

The authors frequently see a destructive dynamic in new-product-development teams or other temporary cross-functional teams. Antagonistic discussions erupt among members of different functional units, with people clinging to their own perspectives and attacking others’. In such cases, it is important to avoid attributing others’ behavior to personality problems and instead to focus on the information the others are using. Team members must take time to analyze their own information as well. In facilitated sessions, the authors encourage people to make explicit the knowledge they rely on when drawing conclusions about others. People need to ask themselves, What behavior have I really observed? Might I be mistakenly attributing intent because the other person has a different view?

Audit the Decision Process and Curb Collective Projection

Auditing key decisions and the process by which they unfold can help attenuate projection. Decisions in organizations are often made in stages, with ambiguous data quickly becoming unquestioned facts.31 To help ensure that key assumptions are grounded in data, decision makers should review them carefully.

The most effective audit is a facilitated group session that leverages each person’s perspective. Although it would be inefficient to analyze every decision, major decisions about employment and investments may merit additional investigation. Stepping back and checking the facts — and the process by which the facts were generated — can uncover projection and improve the quality of decision making. In general, groups are more likely to uncover important information if they discuss facts rather than opinions.

When collective review is not a viable option, individuals should reflect on decision processes by themselves. Decision makers can turn the spotlight on themselves and question the legitimacy of their recollections. Memory is largely a reconstructive process, which, despite common perception, bears little resemblance to a videotaped recording of one’s experiences. We all selectively encode and access information in memory and can reconstruct and confidently hold memories of events that never occurred.32

Memory is often reliable for recounting generalities and themes, but it is less adept at recounting specifics. More important, the process of retrieving specifics from memory is highly susceptible to influence. Managers should be aware of memory’s fallibility and malleability whenever they reconstruct people’s behavior from memory and use it to attribute intentions to others. Focused questions can help: Am I relying too heavily on data culled from my own memory and experience to the point of blinding myself to other data? Are there other possible interpretations of the situation? Did I entertain multiple outcomes or courses of action? Reflective questions linked to the different theoretical explanations for projection can raise self-awareness and minimize unanticipated outcomes.

The Way You Did It Last Time May Not Be the Best Way

The alternative most salient to people is usually the one they used most recently. When making decisions, executives often ask, What did we do last time? The salience of traditional or standard solutions may stifle consideration of alternatives. More important, tradition may be an especially faulty guide in the context of globalization and change. To decide what role it should play, a manager should ask, How well did tradition serve us in past decisions? Do the same tenets apply to the current scenario?

Also, managers should not be seduced by the salience of their own actions to the point that they overlook knowledge from others. Asking questions such as What did competitors do and how did their choices fare? helps ensure that decisions are grounded firmly in information relevant to the current scenario.

Disentangle Self-Worth and Consensus

According to the motivational explanation of projection, the reason people want to believe that others share their views is that they crave social acceptance. They also may convince themselves that they possess unique levels of certain abilities — say, knowing the best way to manage a software-development project or how to understand other people.

Managers should not abandon intuition; they merely should hold it up for questioning. Decision makers who realize that confidence in their skill is not widely shared may experience a threat to their self-worth. Nevertheless, they need to accept the possibility that their views may have less support than they thought. At the same time, they should not fall prey to insecurity or accusations of disloyalty against those with differing views. Disentangling self-worth and estimates of consensus will lead to greater realism in decision making.

Intuitive Judgment: Pro and Con

Increasingly, success rests on anticipating market trends and responding quickly.33 Executives must assimilate information and make decisions in chaotic environments — often without the benefit of sufficient practice, planning or research. Past experience may be unrelated to future actions, but managers have little time to conduct large-scale studies. Well-honed intuitive judgment is an executive necessity.

However, intuitive judgment is susceptible to biases such as projection. Understanding projection can help people see why they often perceive others’ decisions and actions as irrational or misguided. And it can help explain how a smart, well-meaning executive could decide to expand into Asia — confident of success — while perpetuating North American perspectives, policies and practices. Knowledge about projection also can reveal the causes of failed interpersonal relationships by helping managers develop insight into the workings of their own minds as they try to make sense of others’ actions.

References

1. G. Ledford, S. Mohrman, A. Mohrman and E. Lawler, “Large Scale Organizational Change” (San Francisco: Jossey-Bass, 1989); and D. Nadler, M. Gerstein, R. Shaw, et al., “Organizational Architecture: Designs for Changing Organizations” (San Francisco: Jossey-Bass, 1992).

2. R. Nolan and D. Croson, “Creative Destruction: A Six-Stage Process for Transforming the Organization” (Boston: Harvard Business School Press, 1995); R. D’Aveni, “Hypercompetition” (Toronto: Free Press, 1994); R. Moran and J. Riesenberger, “The Global Challenge” (London: McGraw-Hill, 1994); and C. Prahalad and Y. Doz, “The Multinational Mission” (New York: Free Press, 1987).

3. R. Daft and K. Weick, “Toward a Model of Organizations as Interpretation Systems,” Academy of Management Review 9, no. 2 (1984): 284–295; and K. Weick, “Transforming Management Education for the 21st Century” (paper presented at the Annual Meeting of the Academy of Management, Chicago, Aug. 22, 1999).

4. P. Vaill, “Managing as a Performing Art: New Ideas for a World of Chaotic Change” (New York: Jossey-Bass, 1989); P. Senge, “The Fifth Discipline” (New York: Doubleday Currency, 1990); and C. Handy, “The Age of Paradox” (Boston: Harvard Business School Press, 1994).

5. D. Messick and M. Bazerman, “Ethical Leadership and the Psychology of Decision Making,” Sloan Management Review 37 (winter 1996): 9–22; J.E. Russo and P. Schoemaker, “Managing Overconfidence,” Sloan Management Review 33 (winter 1992): 7–17; M. Bazerman and H. Farber, “Analyzing the Decision-Making Processes of Third Parties,” Sloan Management Review 26 (fall 1985): 39–48; M. Bazerman, K. Morgan and G. Loewenstein, “The Impossibility of Auditor Independence,” Sloan Management Review 38 (summer 1997): 89–94; and A. Tversky and D. Kahneman, “Availability: A Heuristic for Judging Frequency and Probability,” in “Judgment Under Uncertainty: Heuristics and Biases,” eds. D. Kahneman, P. Slovic and A. Tversky (Cambridge: Cambridge University Press, 1982), 163–178.

6. L. Thompson, E. Peterson and S. Brodt, “Team Negotiation: An Examination of Integrative and Distributive Bargaining,” Journal of Personality and Social Psychology 70 (1996): 66–78.

7. S. Brodt and L. Ross, “The Role of Stereotyping in Overconfident Social Predictions,” Social Cognition 16 (1998): 225–252; and L. Ross and R. Nisbett, “The Person and the Situation: Perspectives of Social Psychology” (New York: McGraw-Hill, 1991).

8. L. Ross, D. Greene and P. House, “The False Consensus Effect: An Egocentric Bias in Social Perception and Attribution Processes,” Journal of Experimental Social Psychology 13 (1977): 279–301.

9. B. Katz and F. Allport, “Students’ Attitudes” (Syracuse, New York: Craftsman Press, 1931).

10. S. Brodt, “A Truly False Consensus Effect: Examining the Heuristic Value of Self-Knowledge in Employment-Interview Judgments,” working paper, Duke University, Fuqua School of Business, Durham, North Carolina, 1999.

11. Ross, “The False Consensus Effect,” 279–301; S. Sherman, C. Presson, L. Chassin, E. Corty and R. Olshavsky, “The False Consensus Effect in Estimates of Smoking Prevalence: Underlying Mechanisms,” Personality and Social Psychology Bulletin 9 (1983): 197–207; and G. Marks and N. Miller, “The Effect of Certainty on Consensus Judgments,” Personality and Social Psychology Bulletin 2 (1985): 165–177.

12. Sherman, “Smoking Prevalence,” 197–207.

13. Tversky, “Availability,” 163–178; and J. Krueger, “On the Perception of Social Consensus,” Advances in Experimental Social Psychology 30 (1998): 163–240.

14. R. Nisbett and L. Ross, “Human Inference: Strategies and Shortcomings of Social Judgment” (Englewood Cliffs, New Jersey: Prentice-Hall, 1980).

15. M. Kernis, “Need for Uniqueness, Self-Schemas and Thought as Moderators of the False-Consensus Effect,” Journal of Experimental Social Psychology 20 (1984): 350–362; and Marks, “The Effect of Certainty,” 165–177.

16. Krueger, “Social Consensus,” 163–240.

17. W. Crano, “Assumed Consensus of Attitudes: The Effect of Vested Interest,” Personality and Social Psychology Bulletin 9 (1983): 597–608; and W. Wagner and H. Gerard, “Similarity of Comparison Group: Opinions About Facts and Values and Projection,” Archives of Psychology 135 (1983): 313–324.

18. M. Zuckerman and R. Mann, “The Other Way Around: Effects of Causal Attribution on Estimates of Consensus, Distinctiveness and Consistency,” Journal of Experimental Social Psychology 15 (1979): 582–597; T. Gilovich, D. Jennings and S. Jennings, “Causal Focus in Estimates of Consensus: An Examination of the False Consensus Effect,” Journal of Personality and Social Psychology 45 (1983): 550–559; and Marks, “The Effect of Certainty,” 165–177.

19. T. Gilovich, “Differential Construal and the False Consensus Effect,” Journal of Personality and Social Psychology 59 (1990): 623–634; and Ross, “The Person and the Situation.”

20. J. Krueger and R. Clement, “A Truly False Consensus Effect: An Ineradicable and Egocentric Bias in Social Perception,” Journal of Personality and Social Psychology 67 (1994): 596–610.

21. L. Festinger, “Informal Social Communication,” Psychological Review 57 (1950): 271–282; and L. Festinger, “A Theory of Social Comparison Processes,” Human Relations 7 (1954): 117–140.

22. C. Wetzel and M. Walton, “Developing Biased Social Judgments: The False-Consensus Effect,” Journal of Personality and Social Psychology 49 (1985): 1352–1359.

23. S. Sherman, C. Presson and L. Chassin, “Mechanisms Underlying the False Consensus Effect: The Special Role of Threats to the Self,” Personality and Social Psychology Bulletin 10 (1984): 127–138.

24. R. Dawes, “Statistical Criteria for the Truly False Consensus Effect,” Journal of Experimental Social Psychology 25 (1989): 1–17; and Brodt, “The Role of Stereotyping,” 225–252.

25. Ibid.

26. D. Hilton, R. Smith and M. Alicke, “Knowledge-Based Information Acquisition: Norms and the Functions of Consensus Information,” Journal of Personality and Social Psychology 55 (1988): 530–540.

27. M. Alicke, “Global Self-Evaluation as Determined by the Desirability and Controllability of Trait Adjectives,” Journal of Personality and Social Psychology 49 (1985): 1621–1630; N. Tabachnik, J. Crocker and L. Alloy, “Depression, Social Comparison and the False-Consensus Effect,” Journal of Personality and Social Psychology 45 (1983): 688–699; and Kernis, “Need for Uniqueness,” 350–362.

28. Krueger, “Truly False Consensus,” 596–610.

29. Krueger, “Perception of Social Consensus,” 163–240.

30. S. Brodt and C. Tinsley, “The Role of Frames, Schemas and Scripts in Understanding Conflict Resolution Across Cultures,” under review at an academic journal.

31. R. Cross and A. Yan, “Planned and Emergent Structure: Process and Outcome of a Successful Reengineering Effort,” 1999, under review at an academic journal.

32. D. Schacter, “Searching for Memory: The Brain, the Mind and the Past” (New York: Basic, 1996); F. Bartlett, “Remembering: A Study in Experimental and Social Psychology” (London: Cambridge University Press, 1932); and E. Loftus, D. Miller and H. Burns, “Semantic Integration of Verbal Information into Visual Memory,” Journal of Experimental Psychology: Human Learning and Memory 4 (1978): 19–31.

33. G. Stalk and T. Hout, “Competing Against Time: How Time-Based Competition Is Reshaping Global Markets” (New York: Free Press, 1990); and M. Treacy and F. Wiersema, “The Discipline of Market Leaders: Choose Your Customers, Narrow Your Focus, Dominate Your Market” (New York: Addison Wesley, 1995).

ADDITIONAL RESOURCES

Related books for the reader interested in pursuing the topic of projection include James G. March’s 1994 “A Primer on Decision Making: How Decisions Happen” and Thomas Gilovich’s 1993 “How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life” — both Free Press publications. Also recommended is “Decision Traps: Ten Barriers to Brilliant Decision-Making and How to Overcome Them,” by J. Edward Russo and Paul J.H. Schoemaker, a 1990 Fireside book. Two McGraw-Hill books are particularly useful: “The Psychology of Judgment and Decision Making,” by Scott Plous, published in 1993, and Lee Ross and Richard Nisbett’s 1991 “The Person and the Situation: Perspectives of Social Psychology.” Also, the Society for Judgment and Decision Making has a Web site at www.sjdm.org.

Acknowledgments

The authors are grateful to Joachim Krueger, Ed Freeman and an anonymous reviewer for their thoughtful suggestions.
