The Secret to Better Decision Making
How many times have you made what seemed like a great decision only to later realize how wrong you were? Don’t lie. It’s happened to us all. It’s not that you’re stupid or a bad decision-maker. It’s that your brain takes shortcuts to simplify information processing. These shortcuts, known as cognitive biases, serve a purpose in our everyday lives. But they sometimes cause faults in our decision-making.
Mr. Wheeler’s Incredible Lemon Juice
“Clearly, the decision-making that we rely on in society is fallible. It's highly fallible, and we should know that.”
- Daniel Kahneman
In The Little Book of Stupidity, Sia Mohajer tells the story of McArthur Wheeler. In 1995, Mr. Wheeler decided to rob two banks. His fool-proof plan hinged on rubbing lemon juice on his face.
You may remember from elementary school that lemon juice can be used as invisible ink. So, Mr. Wheeler reasoned that rubbing lemon juice on his face would render it invisible. He even tested his theory by snapping a picture of his lemon-juice-covered face with a Polaroid camera. And his selfie seemed to prove the theory: when he looked at the picture, he couldn’t see his face.
Needless to say, Mr. Wheeler’s scheme failed.
By now, I imagine you’re shaking your head and thinking, “How could this guy be so stupid? How could he have taken a picture of his face and still thought he was invisible? There must be something wrong with this guy.”
Mr. Wheeler is an extreme example. But the truth is, we’re all prone to such errors. In this article, I’ll explain why. Next, I’ll go over three common cognitive biases. Finally, I’ll close with some tips to help you avoid Mr. Wheeler’s fate.
Let’s start with an explanation of cognitive biases.
Enter Cognitive Biases
“We have a very narrow view of what is going on.”
- Daniel Kahneman
Have you ever been in a discussion with someone and thought, “How can they not see that they’re wrong? How can they be so blind? So stupid?” It’s easy to see biases in others, but it’s much harder to see them in ourselves. As a result, we may overestimate our own intelligence. What’s worse, we may be resistant to changing our minds even when our bias becomes evident to us. The longer we have lived with a bias, the harder it is for us to move away from it.
Most of us think we're objective and logical when making judgments and decisions. If asked, we would say that we considered all the available, relevant information. And, as a result, came to the best possible conclusion. But the truth is, our brain, unbeknownst to us, filtered information. So, we weren't as objective as we thought.
We apply our own unique combination of knowledge and assumptions as we move through life. Every judgment and decision is subject to this unique personal construction of reality. And this personal reality is always changing based on our growing life experience.
Our brain can’t process the massive amount of information it takes in from our environment. So, there is a disconnect between reality and our perception of it. Because of this disconnect, our decisions and judgments are sometimes flawed. And the faster we make a decision, the larger the rift becomes and the greater the chance for error.
The History
In the early 1970s, the noted psychologist Daniel Kahneman, along with Amos Tversky and others, established the cognitive basis for errors that arise from biases and heuristics. They found that our brain takes mental shortcuts to estimate the probability of uncertain events. Our mind does so to simplify the task of processing information.
The problem is that sometimes, these shortcuts lead to errors.
What is a Cognitive Bias?
A cognitive bias is a systematic deviation from rational judgment. In each situation, we construct our own reality from selective input. So we don't see the whole picture, only limited portions of it. This limited personal reality, or "subjective social reality," is what dictates our behavior and can lead us to make illogical inferences.
In short, a cognitive bias is an error in our thinking. This error occurs because our brain simplifies its information processing.
Cognitive Bias vs. Logical Fallacy
A cognitive bias is different from a logical fallacy. Remember that a logical fallacy is an error in reasoning or a mistaken belief. But a cognitive bias is an error in judgment due to simplified processing or a shortcut.
Let’s say you test drive a new car. You like the car but decide to sleep on the decision. On the drive home, you notice that a lot of other people own the same car. So much so that it seems like it's the only make and model anyone drives. Since so many people own one, you reason, it must be a quality vehicle. That's cognitive bias at work (the frequency illusion, a product of selective attention and confirmation bias).
Same facts, but later at home, you see a commercial where your favorite athlete is advertising the car. You conclude that the vehicle is a quality car based on the athlete's endorsement. That's a logical fallacy (appeal to authority).
What is a Heuristic?
A heuristic is a method we use to come to a quick, but not always optimal, resolution. In other words, a heuristic is a rule of thumb, or a shortcut, for quick decision-making. We use heuristics when saving time is more important than making an optimal decision.
Imagine this. You're walking back to your car after dinner and drinks with some friends. It's late, and you're tired, so you decide to cut through a dark alley. You see a large man striding toward you, a hoodie pulled over his head, hiding his face.
You've heard several news stories in the days prior about a rash of muggings in another city. A sense of dread punches you in your gut, so you turn and get out of the alley. Taking a shortcut to your car is no longer appealing, so you stick to the well-lit streets for the rest of your walk.
This is an example of a heuristic we’re all familiar with: the availability heuristic. As the name implies, it operates on availability. When evaluating a situation, we rely on the examples that come to mind most readily. And the information we recall most easily tends to carry the most weight in our decision-making.
In the above example, you heard about muggings in another city. Your brain then used that recalled information to make a snap judgment that the man is a danger.
Meanwhile, you ignore other relevant information. You don't consider that the muggings happened in a city hundreds, or thousands, of miles away. You don't consider that it's a cold evening, which could explain why the man has his hood up. And you don't notice that the man is on his phone, winded, telling someone he's running late, which explains his rapid pace.
Two Limited Resources: Time and Energy
You may be wondering: if cognitive biases cause us to make bad decisions, why do they exist? Is the Universe making us look like fools? The butts of some cosmic joke?
While I can’t speak for the Universe, I’m confident that it’s not out for a laugh at our expense.
So, why are cognitive biases a part of our daily lives? To answer, let's look at them from the aspect of two scarce resources: time and energy.
Time
How often do you hear the phrase, "Time is your most valuable resource"? Given the popularity of the hustle mentality and productivity hacks, the phrase seems apt. One of the reasons your brain takes the shortcuts we've been discussing is to save time.
Our brain will make a snap decision when saving time is more important than accuracy. One case is when we're in danger, such as the alley example above. Another is when the outcome is low-stakes and not worth spending extra time on.
Energy
The constant firing of neurons and our brain's housekeeping functions take a lot of energy. Our brain accounts for about 20% of our total energy usage; on a typical 2,000-calorie diet, that's roughly 400 calories a day.
Saving energy may not seem like a high priority in a place like the modern United States. Since calories are available in ample quantities, we don't usually run into a shortage. But calories were scarce at points in our evolution. Under those conditions, energy conservation took on more importance. By taking shortcuts, our brain could reduce its cognitive load and save calories.
Understand Our Biases to Improve Our Decision-Making
“If individuals are rational, there is no need to protect them against their own choices.”
- Daniel Kahneman
Incorporating an understanding of biases into our mental models improves our decision-making.
Remember that cognitive biases allow us to make quick decisions. Doing so enhances our decision-making efficiency and reduces our energy usage.
Imagine how much time and energy you could waste on a mundane task like picking out a pair of socks.
You could consider the weather forecast, your favorite color, and many other criteria. So many that a simple task like picking a pair of socks could take hours and use up a tremendous amount of energy. Or you could save time and energy by taking a shortcut. For example, if you’re like me, going with whatever random pair you pull out of the drawer.
But sometimes those shortcuts distort our reasoning and lead to bad decisions. These distortions are where we can improve.
So, by understanding cognitive biases, we can identify where we need to slow down and ask whether we've considered all the relevant information, and whether we need to seek out new information through observation or by asking questions.
Examples of Common Cognitive Biases
Researchers have identified many cognitive biases. Understanding them all is a daunting task. Identifying and correcting them in our day-to-day life is near impossible. Since biases improve our decision-making efficiency, we don't always want to eliminate them. Sometimes we need to decide and move on.
But some decisions need more scrutiny. So, it’s good practice to be able to identify common biases. The following are three of the most common.
Confirmation Bias
If you’ve ever heard of a cognitive bias, chances are it’s confirmation bias.
Confirmation bias is our tendency to pick out information that confirms our preconceptions. It can manifest in two ways. First, we can gather or remember information selectively. Second, we can interpret information in a biased way.
Here's an example. Researchers at Emory University conducted a study showing that Republicans were more likely to spot errors made by Democrats than by members of their own party, and vice versa.
Each group had preconceived notions about the other. These notions caused each group to pick out the inconsistencies of the other group. At the same time, each group paid less attention to the inconsistencies of their own group.
Fundamental Attribution Error
This is another common bias that causes us to make errors in judgment about others.
Fundamental attribution error is our tendency to judge ourselves and others differently. We judge others' behavior based on their disposition rather than their situation. When judging our own behavior, we do the opposite: we put the situation before disposition.
Here's an example. You're late for work because your power went out the night before, messing up your alarm clock. As you're rushing to work, you spill coffee in your lap and run a stop sign. Doing so almost causes an accident. In this situation, you don’t criticize yourself based on your disposition. You're not stupid or irresponsible. Your poor driving is instead a result of the situation: the power outage and your hectic morning.
A few blocks later, another driver runs a stop sign and almost collides with you. In this case, you judge the other driver based on disposition. You fume under your breath that such a moron shouldn't have a license. Meanwhile, you ignore the situation that the other driver found herself in. She could be late for work also, or be rushing to a sick loved one’s bedside.
The fundamental attribution error causes us to judge others in a negative light. At the same time, we give ourselves the benefit of the doubt.
In the above example, you attribute the other driver’s poor behavior to a character flaw. And you attribute your own poor behavior to the situation.
Hindsight Bias
It seems to be a universal law that everyone’s circle of friends has that one guy whose favorite phrase is “I knew it.” “I knew we should have stayed off the Interstate” or “I knew the Broncos would win.”
That's hindsight bias, or the "knew-it-all-along" phenomenon, in a nutshell. It's the tendency to believe, after an event has passed, that we knew the outcome all along. It stems from the unreliability of our recall: we can't remember how an uncertain situation appeared to us before it was resolved.
Hindsight bias can cause arguments. It can also cause us to second guess ourselves. Doing so hurts our confidence in our decision-making.
How Can We Improve Our Decision Making?
“The effort invested in 'getting it right' should be commensurate with the importance of the decision.”
- Daniel Kahneman
To begin, we must understand that we can’t eliminate biases. We make too many decisions during the day to stop and second-guess or analyze them all. As we discussed above, some decisions are too low-stakes to warrant the extra effort.
The key is to identify the decisions where it’s critical to get the outcome right. These are the situations where we should apply strategies to avoid biases.
For example, deciding what color socks to wear is low-stakes. The worst outcome is likely nothing more than some embarrassment. Deciding how to invest for retirement is high-stakes: it's a decision that impacts how you live in your later years.
There are several strategies we can use to overcome biases. Here are three that I find useful.
Understand the Situation
Remember that we can’t know a person until we walk a mile in their shoes. Putting ourselves in the other person's position can help us understand their thinking. It'll help us know why they did what they did. Doing so will help us avoid the fundamental attribution error.
Teach Yourself to be More Aware of Mistakes
Errors resulting from biases are all around us. From the barista to our boss to ourselves, everyone makes them. As I mentioned above, not all mistakes are significant enough to matter. But that doesn't mean you can't practice identifying biases.
Sometimes we need an optimal decision. In such cases, being able to identify your biases will help you reach it.
Use a Checklist
Our brain isn’t up to the task of processing all the information it takes in. This human inadequacy causes us to make errors.
In his book The Checklist Manifesto: How to Get Things Right, Atul Gawande points out how common avoidable failures are. As humans, the volume and complexity of what we know can overwhelm us, leading to preventable failures. So, what can we do to prevent them? The answer is the checklist.
A common cause for our mistakes is rushing through a task. A checklist can protect against missed steps. It can also ensure that we consider all the necessary information when coming to a decision.
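The checklist idea is easy to prototype. Here's a minimal sketch in Python; the questions are my own illustration of the biases covered in this article, not taken from Gawande's book:

```python
# A minimal decision checklist. The questions are illustrative,
# drawn from the biases discussed above, not from Gawande's book.
CHECKLIST = [
    "Is this a high-stakes decision worth slowing down for?",
    "Have I sought out information that could disprove my view?",
    "Am I judging the person, or the situation they're in?",
    "Would I reach the same conclusion without hindsight?",
]

def review(answers):
    """Return the unchecked items; an empty list means every step was considered."""
    return [question for question, done in zip(CHECKLIST, answers) if not done]

# Mark each question True once you've genuinely considered it.
remaining = review([True, True, False, True])
print(remaining)  # any questions still to consider before deciding
```

The point isn't the code itself but the discipline it encodes: for high-stakes decisions, walk the same short list of questions every time so no step gets skipped in the rush.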
Conclusion – How Can Everyone be so Stupid?!
Our brain can't process all the information that bombards it daily. This cognitive inadequacy leads to a disconnect between reality and our subjective reality. That disconnect increases the likelihood that we’ll make an error in judgment.
We're not stupid when we make such mistakes. Everyone is subject to biases. Even the smartest of us fall victim to them.
And, even though cognitive biases sometimes lead to errors, they aren't always bad. There may be situations that need quick thinking. To speed up processing, our brain takes a mental shortcut to make a judgment. Thus, saving us time.
These shortcuts also decrease our cognitive load, saving us energy.
There are several tactics we can use to ensure we make an optimal decision. Three of those are:
Understand the situation,
Teach yourself to be more aware of mistakes, and
Use a checklist.
We can’t avoid biases altogether. But understanding them will improve our decision-making when needed.
Bibliography and Additional Reading
Books
Mohajer, Sia. The Little Book of Stupidity: How We Lie to Ourselves and Don't Believe Others. Self-published, 2015.
Mauboussin, Michael J. Think Twice: Harnessing the Power of Counterintuition. Boston, MA: Harvard Business Review Press, 2013.
Bennett, Bo. Logically Fallacious: The Ultimate Collection of Over 300 Logical Fallacies. Sudbury, MA: Archieboy Holdings, LLC, 2017.
Online
For more information on Mr. Wheeler, see https://medium.com/@littlebrown/i-wore-the-juice-the-dunning-kruger-effect-f8ac3299eb1.
“Daniel Kahneman.” Wikipedia. Wikimedia Foundation, January 28, 2020. https://en.wikipedia.org/wiki/Daniel_Kahneman.
“Cognitive Bias.” Wikipedia. Wikimedia Foundation, February 6, 2020. https://en.wikipedia.org/wiki/Cognitive_bias.
“Availability Heuristic.” Wikipedia. Wikimedia Foundation, February 8, 2020. https://en.wikipedia.org/wiki/Availability_heuristic.
Cherry, Kendra. “How Cognitive Biases Influence How You Think and Act.” Verywell Mind. Verywell Mind, May 7, 2019. https://www.verywellmind.com/what-is-a-cognitive-bias-2794963.
Vinney, Cynthia. “How Cognitive Biases Increase Efficiency (And Lead to Errors).” ThoughtCo. ThoughtCo, October 31, 2018. https://www.thoughtco.com/cognitive-bias-definition-examples-4177684.
Swaminathan, Nikhil. “Why Does the Brain Need So Much Power?” Scientific American. Scientific American, April 29, 2008. https://www.scientificamerican.com/article/why-does-the-brain-need-s/.
“Confirmation Bias.” Wikipedia. Wikimedia Foundation, January 31, 2020. https://en.wikipedia.org/wiki/Confirmation_bias.
“Confirmation Bias And the Power of Disconfirming Evidence.” Farnam Street, August 2, 2019. https://fs.blog/2017/05/confirmation-bias/.
“Fundamental Attribution Error.” Wikipedia. Wikimedia Foundation, February 2, 2020. https://en.wikipedia.org/wiki/Fundamental_attribution_error.
“Hindsight Bias.” Wikipedia. Wikimedia Foundation, February 5, 2020. https://en.wikipedia.org/wiki/Hindsight_bias.