Making Winning Decisions: Why It's Hard to Be Rational
You’ll never have all the information you need to make a decision. If you did, it would be a foregone conclusion, not a decision.
—Anonymous
Did you cancel any air-travel plans following the September 11, 2001, terrorist attacks? Did you avoid going into skyscrapers or attending events where masses of people might be? If you did, you were probably afraid of becoming a terror victim. But was fear of terrorism immediately following 9/11 a rational response? The answer is probably no,[1] but, as you’ll see in this chapter, it’s hard to be rational.
A rational person would be far more fearful of dying in his or her car than of being killed in a terrorist attack on an airplane. In fact, it's been calculated that terrorists would have to hijack 50 planes a year and kill everyone aboard before flying would be more dangerous than driving an equal distance. In spite of such statistics, people respond irrationally to terrorism. Why? First, there's substantial evidence that human beings are bad at assessing small risks of large catastrophes. Second, the actual risk of being a terror victim is not merely small. It's unknowable. This makes any guess potentially valid and plays into the hands of media people and politicians who might benefit from panic. And third, it's hard to be rational about the irrational. How can you understand or predict the behavior of a suicide bomber who is convinced he'll become a hero in Heaven after killing you?
In spite of very good intentions, there are barriers that make it difficult for us to be rational.[2] Those barriers start with the unrealistic assumptions that underlie rationality:[3]
- A clear and unambiguous problem. Rationality assumes that the decision maker fully understands the problem. In reality, problems are typically complex, with considerable ambiguity as to what's cause and what's effect. The result is that we often end up focusing on the wrong problem, confusing the problem with its symptoms, or ignoring or denying that there even is a problem.
- All relevant criteria can be identified, as can all alternatives. In the real world, human beings are limited in their abilities to identify criteria and alternatives. We tend to focus on the visible and obvious. In addition, our biases and personal preferences tend to restrict a full and complete listing of options.
- Criteria and alternatives can be ranked and weighted to reflect their importance. Because problems are typically complex, it's often hard to rank and weight criteria and alternatives objectively.
- There are no constraints on getting full information. Rationality assumes that we can get all the information we need to make a thorough and thoughtful choice. In reality, time and cost constraints typically make it impossible to obtain full information.
- Decision makers can accurately assess each alternative. Rationality presumes that the decision maker has full information about each alternative and that he or she will rely only on the chosen criteria, and the weights given to them, in evaluating each alternative. In the real world, full information is never available. We also find it hard to limit analysis to only the criteria identified and to rate the criteria's importance in proportion to the weights given. We often let irrelevant criteria or emotions influence our judgment.
In addition to unrealistic assumptions, there are systematic biases and errors that creep into our decision processes and undermine rationality. They come out of our attempts to shortcut the decision process. To minimize effort and to avoid difficult trade-offs, we rely too heavily on experience, impulses, gut feelings, and convenient "rules of thumb." In many cases, these shortcuts are helpful. However, they can lead to severe distortions from rationality. I'll elaborate on many of these biases and errors later in the book, but here are some "coming attractions":
- We don't plan ahead. It's hard for most of us to think long-term. As a result, we tend to react to momentary impulses, act inconsistently in pursuit of our priorities, and deviate from the direct path to our goals.
- We're overconfident. Most of us are overconfident about our knowledge and abilities. This leads us to do too little analysis of our options and to be too optimistic about our ability to select the best choice.
- We rely too much on past experiences. Experience can teach us a lot, but it also tends to limit our thinking. It's especially limiting when we're faced with novel situations. Overreliance on experience tends to stifle the development of creative options.
- We're poor at learning from the past. Our memory is highly selective, and we're pretty good at reinterpreting past experiences in ways that maintain or enhance our self-esteem. Therefore, we often don't see problems when we should, and we're unrealistic in assessing past "successes" and "failures."
Rationality assumes that we can perfectly define a problem; identify all relevant criteria; accurately weigh all the criteria to reflect our goals, values, and interests; list all relevant alternatives; accurately evaluate and compare each of those alternatives; and select the best alternative. As noted before, the evidence confirms what you’ve always known—we’re not perfect. Between the naïve assumptions underlying rationality and the imperfections of the human mind, we all act irrationally at times.
Just because we all have difficulty being rational doesn't mean we're destined to constantly screw up the decisions we have to make. In spite of our limitations, many of us actually do a pretty good job of decision making. Part of the reason is that some people have learned the tricks to "managing irrationality." By that I mean they know their biases and have figured out how to minimize their impact. By the time you've finished reading this book, you should be a lot better at this task than you were before you began reading.
But even most people who know little about the psychology of decision making seem to get by on their limited knowledge. Why is that? First, the right or optimal choice is often obvious. Second, in many situations, a wide range of options will achieve an optimal, or nearly optimal, solution. And third, satisfactory solutions are often good enough.[4] In the real world, not every decision comes with dozens of choices, and when one does, most of the options are usually clearly inferior. If you're looking for a new television set, you're likely to go to a Best Buy or Wal-Mart and check out their selection. After you impose your criteria—screen size, features, cost, and so forth—a single choice will often surface. What if this process ends up creating three or four choices rather than one? In many cases, any one of those choices will prove to be just fine because the differences between them are negligible. Finally, there is substantial evidence to suggest that many decisions can be handled by merely finding a satisficing solution—that is, one that is satisfactory and sufficient—rather than an optimal one. We search until we find the first option that meets all our criteria, and then we select it. In contrast to Sean Norris' rational process for choosing a college, described at the opening of the last chapter, most of us satisficed: we identified the criteria we thought important in our college choice, developed a short list of possible options, and applied to the first school we found acceptable. For many of the decisions we face day to day, there is little to be gained by seeking the optimum choice. A satisfactory option proves good enough.
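To make the contrast concrete, here is a minimal sketch of satisficing versus optimizing applied to the television example. The product data, criteria, and thresholds are illustrative assumptions, not anything from the text.

```python
# Illustrative sketch: satisficing vs. optimizing over a hypothetical set of TVs.
# All products, prices, and thresholds below are made up for illustration.

tvs = [
    {"name": "TV A", "screen_inches": 42, "price": 650, "has_hdmi": True},
    {"name": "TV B", "screen_inches": 50, "price": 480, "has_hdmi": True},
    {"name": "TV C", "screen_inches": 55, "price": 900, "has_hdmi": True},
    {"name": "TV D", "screen_inches": 40, "price": 300, "has_hdmi": False},
]

def meets_criteria(tv):
    """True if a TV clears every minimum requirement (illustrative thresholds)."""
    return tv["screen_inches"] >= 42 and tv["price"] <= 700 and tv["has_hdmi"]

def satisfice(options):
    """Satisficing: take the FIRST option that meets all criteria, then stop searching."""
    for option in options:
        if meets_criteria(option):
            return option
    return None  # no acceptable option found

def optimize(options):
    """Optimizing: score EVERY acceptable option and pick the best (most screen per dollar)."""
    acceptable = [o for o in options if meets_criteria(o)]
    return max(acceptable, key=lambda o: o["screen_inches"] / o["price"], default=None)

print("Satisficing choice:", satisfice(tvs)["name"])  # TV A -- first acceptable option found
print("Optimizing choice: ", optimize(tvs)["name"])   # TV B -- best value among all acceptable options
```

The only point of the sketch is that satisficing stops at the first acceptable option, while optimizing insists on comparing them all; for everyday purchases the extra comparison often buys very little.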
Unfortunately, "good enough" is often not good enough. When you think back on decisions that didn't work out the way you had hoped, you'll often find they were the result of inappropriate shortcuts and of settling for a less-than-optimal choice. The remainder of this book will help you understand both your unique preferences and biases and those we all share, and what you can do to improve your decision "batting average."
- You can reduce many of the biases and errors that undermine rationality.
- Optimal choices are often obvious.
- In many situations, a wide range of options will achieve a near-optimal solution.
- Satisfactory solutions are often good enough.