A Second Way of Reaching a Decision
Try to solve the following simple math problem: A baseball bat and ball cost $1.10 in total. The baseball bat costs $1.00 more than the ball. How much does the ball cost?
The problem is not really a math problem; it’s a psychological test, because the answer, and our method of arriving at it, illustrate the limits of our intuition. If you are like most people, your immediate answer is that the ball costs 10 cents. The majority of undergraduates at Princeton who were asked the question gave that answer. But that, of course, is the wrong answer: if the ball cost $0.10, the bat would cost $1.10, and the total would be $1.20. The correct answer is that the ball costs five cents. But to arrive at the correct answer, you probably had to pause for a moment, for at least a beat, to think consciously rather than relying on your immediate intuition.
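One way to see why a nickel is right is to write out the algebra the slow system is doing. If $b$ is the price of the ball in dollars, the two conditions of the problem give:

```latex
\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{aligned}
```

The ball costs $0.05 and the bat costs $1.05, which is indeed exactly one dollar more.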
The fast and the slow ways of answering this simple math problem—each of which yields a different answer—illustrate that humans have at their disposal a second method of thinking, a type of information-processing architecture other than intuition. Call it analytic intelligence or conscious decision making. It handles rule-based, nonsocial decisions that require abstract thought. This was the system you probably had to rely upon to answer the math question. Such thinking is time consuming rather than immediate, and it involves conscious reasoning. This analytical decision system is probably a more recent evolutionary adaptation than intuition. Interpreting statistical information or probabilities, understanding legal arguments, and calculating interest rates all call upon this more abstract type of reasoning.
Keith Stanovich is a psychologist at the University of Toronto who studies human development and reasoning. He is a leading expert on the differences between the two types of decision making—intuitive and analytical. His experiments look at the way children reach decisions in situations requiring the use of probabilities and logic. In one experiment, children are given the task of trying to draw a white marble from containers that hold blue and white marbles in varying amounts. The containers vary in size, and the biggest container has the smallest proportion of white marbles. Children relying only on intuition, rather than probabilistic reasoning, tend to choose the biggest container (which holds more “winning” white marbles but a smaller proportion of winners) as the likeliest place to find a white marble—an incorrect answer. The general conclusion of his experiments is that cognitive ability is strongly associated with being able to override intuition and instead use the harder-to-access analytic system.
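The logic of the marble task can be sketched in a few lines of code. The marble counts below are invented for illustration; the text does not give the exact numbers Stanovich used.

```python
# Hypothetical containers as (white, blue) marble counts.
# The big container holds more white marbles in absolute terms,
# but a smaller *proportion* of them -- the trap in the experiment.
containers = {
    "big":   (8, 92),   # 8 white out of 100
    "small": (2, 8),    # 2 white out of 10
}

for name, (white, blue) in containers.items():
    p_white = white / (white + blue)
    print(f"{name}: {white} white of {white + blue} -> P(white) = {p_white:.2f}")

# Intuition favors the big container (more winners to see),
# but the small container is the better bet: 0.20 versus 0.08.
```

Choosing by count is the System 1 answer; choosing by proportion is the System 2 answer.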
There are no widely agreed upon names for these two systems. Our intuitive, instinctive, automatic, experiential, heuristic, emotional, visceral, snap judgment–oriented, appetite-driven, or hot system is called by Stanovich “System 1.” The analytical, rational, reflective, deliberative, central-processing, or abstract system (all common words to describe it) is termed “System 2” by Stanovich. System 2 involves long drawn-out cognition. In System 1, answers are arrived at in milliseconds. Think of the distinction as answers that come from your gut (System 1) versus your mind (System 2).
Stanovich summarizes the difference between the two systems, and why this difference is important: “System 1 gives ballpark answers. But modern society requires precision beyond ballpark answers.”
When there is no evolutionary precedent for a problem, intuition isn’t going to cut it. Contemporary life is filled with situations and problems that must be dealt with both precisely and abstractly. The ability to decontextualize and think abstractly is more important than relying on social cues. Sometimes there are no social cues. Try arguing with your mutual fund after you’ve watched your 401(k) disappear in the stock market decline. Or try appealing to the common sense of your credit card company. In these circumstances, Stanovich points out, “We invariably find out that our personal experience, our emotional responses, our stimulus-triggered intuitions about social justice—all are worthless.”
Our intuitive system has not evolved for these abstract problems, which is why we have so much trouble selecting a 401(k) investment, knowing when to sell a stock, choosing the best health insurance plan, compounding interest rates, assessing the risks of complex mortgage-backed “structured products,” or dealing with probabilities and statistics in general. It’s why we believed real estate could only go up—or, after the crash, only go down. It’s why investment bank CEOs, based on their decade-long success, began to think they could do no wrong. It’s why we hire the wrong employee who seemed so charming at the job interview. It’s why we underestimated the risks of credit markets and didn’t see the growing possibility of a systemic meltdown.
The real problem comes when we rely exclusively on our intuitive system to guide us in what are in fact abstract situations. Predictable biases arise. The answers aren’t even in the ballpark. Take, for example, the “gambler’s fallacy.” This is the belief that because a coin has come up heads many times in a row, it is more likely to come up tails on the next flip. But the coin has no memory. The outcome remains random, regardless of what happened in the past. We, however, do remember: we imbue random outcomes with meaning, even with a sense of fairness, and mistakenly predict “tails.” (For the surprising situations where coin flips may not be random, and in fact are subtly biased, see Chapter 21, “The Truth About Coin Tosses: They Aren’t Fair.”)
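A short simulation, with illustrative parameters of my own choosing, shows the memorylessness the gambler’s fallacy denies: even right after a run of heads, the chance of tails stays at about one half.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Flip a fair coin many times.
flips = [random.choice("HT") for _ in range(200_000)]

streak = 3  # look only at flips that follow three heads in a row
after_streak = [
    flips[i + streak]
    for i in range(len(flips) - streak)
    if flips[i:i + streak] == ["H"] * streak
]

p_tails = after_streak.count("T") / len(after_streak)
print(f"P(tails after {streak} heads) = {p_tails:.3f}")
# Stays close to 0.5 -- the streak gives tails no extra claim on the next flip.
```

Raising `streak` shrinks the sample but leaves the conditional frequency hovering around one half, which is exactly what “the coin has no memory” means.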
And this sort of fallacy also shapes the way we view the stock market. We see patterns, or think narratives are at work, where none exist. We in effect socialize stocks, treating inanimate objects as if they had human characteristics. If we paid a certain amount for a stock, we think it’s only fair that we get at least that much back when we sell it—which might feel true but has no bearing on the future direction of the price.
The dangerous intersection of our intuition and financial decision making has been studied in great detail by cognitive psychologists. Though not everyone is on board with Stanovich’s evolutionary framework just yet, the field agrees that there are systematic biases in our intuitive thinking, which I describe in detail later in the book. In general, these mental rules of thumb, known as heuristics, simplify decision making. Though useful when we have to think quickly, they can lead to predictable errors when more abstract analysis is called for.
You can also make the opposite mistake: using analytical intelligence to solve what are essentially instinctual problems. In one famous jam experiment (there are actually several famous jam experiments in decision psychology), a group of participants was asked to rate jams along abstract dimensions: color, consistency, mouth feel. Another group was simply asked which jam they liked best. The group trying to use complex cogitation was thrown off; they could not reach a good operational judgment of what made for a good jam, unlike those who just chose the jam they liked. The conclusion: you shouldn’t overthink what you like in a jam.
Again, we do not need to override our intuitive system in every situation; in fact, we need to in very few. Our intuition and abstract thought are not necessarily in conflict and may in fact support each other. In post-caveman life, however, there are many situations and decisions for which our evolutionarily honed intuitive system is poorly adapted and does not serve us well. Stanovich’s distinction between the decision systems becomes crucially important in certain contexts, where it can be extremely dangerous and self-injuring to be guided only by intuition.
The good news is that you can teach yourself to check your intuition where necessary, to override your fundamental computational biases and use your analytical thought system instead. System 1 is our default, but through conscious effort we can resist the temptation to listen to our gut and instead make use of System 2. Airplane pilots do it all the time: they rely upon instruments to tell them whether their plane is level rather than trusting their inner ears. The amazing fact is that, through learning and experience, cold, formal, analytical rules become automatic—second nature, so to speak. If someone ever asks you again, “A baseball bat and ball cost $1.10 in total. The baseball bat costs $1.00 more than the ball. How much does the ball cost?,” the correct answer—a nickel—will now be intuitive.