This is a word copy of an article that was published in Etc: A Review of General Semantics 73, October 2016, pp. 314-320. (The journal is a bit behind and this was actually published October 2018.) The purpose of the article was not to propose new theories but just to put some of these cognitive biases together and apply them to choice-making. And, as usual, the insights of General Semantics make a lot of confusing things a lot clearer.
Making choices is not an easy task. It regularly creates stress and regret. Everyone wants to make the right decisions, or at least what they imagine the right decisions to be.
The process is complicated and made less effective than it might be by a variety of cognitive biases that impair logical thinking and analysis and lead to errors of judgment, misevaluations, and bad choices. The trick is to identify these biases and to confront them with more logical, more mindful analysis.
Here we single out just five of the many biases (the ambiguity bias, the bandwagon bias, the anchoring bias, the confirmation bias, and the status quo bias), explaining how they operate, offering some examples, and proposing countermeasures you can take to help reduce the influence of these biases.
The Ambiguity Bias
All choices involve some degree of ambiguity—there is always some unknown factor that can alter the effectiveness of choice. The ambiguity bias leads you to select the option that is least ambiguous, the decision that is most certain (Ellsberg, 1961, 2001; Chew, Ebstein, & Zhong, 2012).
For example, assume you’re buying a used car, and you’re trying to decide between two very similar cars—one has 70,000 miles, and the other has a broken mileage counter and so may have fewer or more miles than 70,000. All other things being equal, you’d be more likely to choose the one with known mileage even if it’s high.
Consider a student selecting a course taught by two professors; one has average ratings on Ratemyprofessor.com, and the other is unrated. It’s likely that the student would choose the professor who has a score, even though it’s just average.
Or consider selecting a restaurant from the reviews on Yelp. Of the nearby choices, one has a 3-star rating, and the other is unrated. More than likely you'd choose the 3-star restaurant.
In all of these cases, you figure that the known average (or even below average) is better than the unknown, which could be a lot worse. Of course, if the professor's rating is horrible and the restaurant is given no stars, you'd likely go with the unknown. But, when things are at an acceptable level of tolerance, you'd probably choose the known.
It’s interesting to note in this connection that cultures differ widely in their tolerance of ambiguity (Hofstede, Hofstede, & Minkov, 2010). Persons from Singapore, Jamaica, Denmark, Sweden, Hong Kong, and other high ambiguity tolerant cultures are more tolerant of uncertainty and ambiguity, while persons from Greece, Portugal, Guatemala, Uruguay, Belgium, and other low ambiguity tolerant cultures are less so and experience greater discomfort when situations are ambiguous. Students from low ambiguity tolerant cultures, for example, will want assignments that are clear and specific and will feel uncomfortable with ambiguous assignments; students from high ambiguity tolerant cultures will respond in opposite ways.
The antidote to the ambiguity bias is—when possible—to reduce the ambiguity so that all choices can be examined more objectively. When that’s not possible—as in the case of the broken mileage counter—you’d need to make inferences based on other factors—the wear of the tires, the condition of the seats, the dings, and so on—that might help you better estimate mileage. And the student might reduce uncertainty by sitting in on a class or two with each instructor before making a decision.
The Bandwagon Bias
The term bandwagon seems to have been used, initially, for the wagon that carried the band in the circus. Later, politicians would use a similar wagon to promote their candidate and urge voters to jump on the bandwagon in support.
As a cognitive bias, the bandwagon bias refers to the widespread tendency to go with the crowd, to believe what others believe and to do what others do (Nelson, 2016). Going along with the group reduces the stress of choice making. Even if you make the wrong choice, you have lots of company, and that’s comforting. As business leader Lei Jun put it: “Things get much easier if one jumps on the bandwagon of existing trends.” But that’s obviously not always the most effective choice.
And it’s not just the majority that influences us; it’s also the attractiveness of the other people. This attractiveness bias leads you to follow and to be affected by those you consider attractive. That is, you’re more likely to believe and act in the way beautiful people do than in the way unattractive people do. It’s one of the reasons that clothing models are all above average in attractiveness. It’s also one of the reasons that political polls are so powerful and so influential; you want to vote for and support a winner, not a loser, and so polls may influence voter preferences more than they report them. In groups, this tendency is referred to as groupthink, the bias to agree with the majority and not voice opposition (Janis, 1983; Richmond, McCroskey, & McCroskey, 2005).
The antidote here is to recognize that the majority—however attractive—can be wrong. “Even when the experts all agree,” noted Bertrand Russell, “they may well be mistaken.” An excellent example of this is seen in the bystander bias, where a group of people, even while witnessing a crime, may do nothing (Darley & Latané, 1968), a concept articulated after it was observed that a crowd of people did nothing while a woman was being murdered. More important, perhaps, is to recognize that other people are not you. They have different needs, wants, motivations, objectives, talents, and so on. What is right for them, however many or however attractive they are, is not necessarily right for you.
The Anchoring Bias
The anchoring bias, as you can guess from its name, leads you to focus on just one bit of information about the choices and to ignore or give less importance to other factors. It anchors your thinking to one aspect; usually, it’s the first impression (McElhany, 2016; Dean, 2013).
Often it takes the form of anchoring decisions to what you’re used to or what you expect based on past experiences. For example, let’s say you’re a college student and have worked at a variety of fast food restaurants for $13 an hour. Now, you’re offered a job at $14 an hour, and so you take it. You take it because you’ve anchored your pay at $13, and the increase of a dollar is seen as a reason to take the new job.
The anchoring bias leads you to evaluate your choices according to some baseline. For example, in buying a house, the anchoring figure is the asking price, and it is around this asking price that negotiations will take place. The same is true for a car; the sticker price is the baseline; it’s the anchor.
The antidote here is to examine the negatives of your focus and the positives of the other aspects of the choice. Recognizing that the anchor is, in fact, an anchor and is getting in the way of your impartial examination of other options is likely to help reduce the effect of this bias. And, of course, you can always try to reset the anchor: I’m looking for a BMW 5 for X-amount; what can you do for me?
The Confirmation Bias
Umberto Eco once wrote that “followers of the occult believe in only what they already know, and in those things that confirm what they have already learned.” In reality, most people, though not occult followers, seek confirmation for their beliefs; they operate with the confirmation bias, a bias that leads you (sometimes consciously and sometimes unconsciously) to find evidence for your selected choice (Cherry, 2017). Whatever decision you want to be right will lead you to seek reasons why that is the right choice and why the other possible options are not as good. It leads you, in fact, to seek out confirming evidence and to ignore evidence that is disconfirming.
And so, if you decide you want to marry Chris, you’ll look for reasons why Chris would be an appropriate life partner. If your preconception is that German cars are the best, you’ll focus on evidence that supports this preconception—advertisements for Audi and Mercedes or talks with those who are pleased with their German cars. At the same time, you’d avoid ads for Lexus and Hyundai and dismiss or minimize any positive reactions from those who like cars that are not German.
The problem with the confirmation bias is that it leads you to limit your information search for confirming evidence and to avoid evidence that would disconfirm your preconceived choice. And, of course, it is precisely this disconfirming evidence that can lead to a more careful, thorough, and unbiased analysis.
The antidote here is to analyze your choice by actively seeking disconfirming evidence. If you decide to buy a BMW, look for evidence against this choice—you likely already have evidence as to why you should buy it; you need to balance that evidence with other evidence, especially contradictory evidence. A straightforward way to do this is to read positive reviews of different choices and negative reviews of your selected option. Listing—literally making a list—the negative aspects of your choice and the positive aspects of other options can help balance the evaluation. Taken too far, however, this strategy will keep you from ever making a decision at all.
The Status Quo Bias
The status quo bias leads you to make decisions that essentially retain what you already have (Samuelson & Zeckhauser, 1988; Henderson, 2016). The status quo bias leads you to choose the familiar over the unfamiliar. The well-known adage—Better the devil you know than the devil you don’t know—captures the status quo bias and also supports the ambiguity bias, where you prefer the known to the unknown.
In many cases, you make a decision to not make a decision or not to change and just stick with what you have. The emotional advantage here is that not making a choice doesn’t seem like making a choice, and so it enables you (1) to avoid the stress of making a decision and (2) to prevent the regret that follows many choices.
Not surprisingly, you’re more likely to want to remain with the status quo when there is an overabundance of choices. When there are too many possible choices, the entire process can appear too confusing and too complicated. And here the status quo feels a lot more comfortable than going through the process of examining all these potential choices and, perhaps, making a mistake in the process.
Examples of the status quo bias are all around us—retaining your current insurance without looking for less expensive policies or retaining your cell phone service or simply renewing without examining alternative plans. The status quo bias also seems a likely reason why so many unhappy couples stay together. It’s easier to remain with the status quo and endure the unhappiness.
The status quo bias is closely related to another natural tendency, the risk avoidance bias: you want to avoid risk. Because you’re probably like most people and risk averse, staying with the status quo enables you to avoid losing something—a partner, money, a job, for example. Even though a change may well bring additional benefits, it may also lead you to incur a loss, and losing (say, money) is more distressing and unpleasant than gaining money is enjoyable (Ellsberg, 1961, 2001). The potential benefits of a decision to change are seen as less consequential than the potential downsides of a change that may turn out to be a poor one.
Another closely related bias is the omission bias. The negative impact of a wrong choice is felt to be less if it was a choice to do nothing and more if it was a choice to do something, that is, to change the status quo (Schwartz, 2016). So, for example, the fear of a child becoming ill leads many parents not to have their children vaccinated against any of a variety of viruses. If the child does get sick, the negative impact would feel more significant if some action was taken than if no action was taken. One of the problems with this way of thinking is that the weight of the evidence is clearly on the side of doing something (that is, getting vaccinated) and against doing nothing (that is, not getting vaccinated). Yet, the omission bias persists (Ritov & Baron, 1990).
Recognizing that you may be influenced by this bias—bringing it to a mindful state—will help reduce its effects. Perhaps it will also help to recall Ronald Reagan’s observation that “status quo, you know, is Latin for ‘the mess we’re in.’” Or, equally appropriate, is leadership theorist Warren Bennis’ claim that “the manager accepts the status quo, the leader challenges it.” Visualizing what things would be like if changes were made may also prove of value.
