Thinking, Fast and Slow has become one of my go-to behavioural science must-reads. It has transformed how I think about how I think!
A note on this edition: in 2017, having read Kahneman’s seminal works (Prospect Theory: An Analysis of Decision under Risk; Judgment under Uncertainty: Heuristics and Biases; and Judgment, Decision, and Rationality), I went on to read Thinking, Fast and Slow by Daniel Kahneman and wrote the following summary.
I have updated this summary after revisiting Kahneman’s work following his death on March 27, 2024.
Please note the material added at the conclusion of this article: these are the April 2024 edits to my original 2019 reflections on the book.
Overview
Thinking, Fast and Slow by Daniel Kahneman: there is a compelling drama going on in our minds, a film-like plot between two main characters, with twists, dramas and tensions.
These two characters are the impulsive, automatic, intuitive System 1, and the thoughtful, deliberate, calculating System 2. As they play off against each other, their interactions determine how we think, make judgments and decisions, and act.
System 1
is the part of our brain that operates intuitively and rapidly, often without our conscious control. You can experience this system at work when you hear a very loud and unexpected sound. What do you do? You probably immediately and automatically shift your attention toward the sound. That’s System 1.
This system is a legacy of our evolutionary past: there are inherent survival advantages in being able to make such rapid actions and judgments.
System 2
is what we think of when we visualize the part of the brain responsible for our individual decision-making, reasoning and beliefs. It deals with conscious activities of the mind such as self-control, choices and more deliberate focus of attention.
For instance, imagine you’re looking for a woman in a crowd. Your mind deliberately focuses on the task: it recalls the characteristics of the person and anything that would help locate her. This focus helps eliminate potential distractions, and you barely notice other people in the crowd. If you maintain this focused attention, you might spot her within a matter of minutes, whereas if you’re distracted and lose focus, you’ll have trouble finding her.
As we’ll see in the following book summary, the relationship between these two systems determines how we behave.
Thinking, Fast and Slow Key Idea #1: The lazy mind!
How laziness can lead to errors and affect our intelligence.
To see how the two systems work, try solving this famous bat-and-ball problem:
A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?
The price that most likely came to your mind, $0.10, is a result of the intuitive and automatic System 1, and it’s wrong! Take a second and do the math now.
Do you see your mistake? The correct answer is $0.05.
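If you want to see the deliberate System 2 check written out, here is a minimal sketch in Python (my own illustration, not from the book):

```python
# Let the ball cost x. The bat costs x + 1.00, and together they cost 1.10:
#   x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05
ball = 0.05
bat = ball + 1.00
assert abs((ball + bat) - 1.10) < 1e-9   # the pair totals $1.10
assert abs((bat - ball) - 1.00) < 1e-9   # the bat costs $1 more

# The intuitive answer fails the check: a $0.10 ball implies a $1.10 bat,
# and the pair would total $1.20, not $1.10.
ball_wrong = 0.10
assert abs((ball_wrong + (ball_wrong + 1.00)) - 1.20) < 1e-9
```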
What happened was that your impulsive System 1 took control and automatically answered by relying on intuition. But it answered too fast.
Usually, when faced with a situation it can’t comprehend, System 1 calls on System 2 to work out the problem, but in the bat-and-ball problem, System 1 is tricked. It perceives the problem as simpler than it is, and incorrectly assumes it can handle it on its own.
The issue the bat-and-ball problem exposes is our innate mental laziness. When we use our brain, we tend to use the minimum amount of energy possible for each task. This is known as the law of least effort. Because checking the answer with System 2 would use more energy, our mind won’t do it when it thinks it can just get by with System 1.
This laziness is unfortunate, because using System 2 is an important aspect of our intelligence. Research shows that practicing System 2 tasks, like focus and self-control, leads to higher intelligence scores. The bat-and-ball problem illustrates this: our minds could have checked the answer using System 2 and thereby avoided this common error.
By being lazy and avoiding using System 2, our mind is limiting the strength of our intelligence.
Thinking, Fast and Slow Key Idea #2: Autopilot!
Why we are not always in conscious control of our thoughts and actions.
What do you think when you see the word fragment “SO_P”? Probably nothing. What if you first consider the word “EAT”? Now, when you look again at the word “SO_P,” you would probably complete it as “SOUP.” This process is known as priming.
We’re primed when exposure to a word, concept or event causes us to summon related words and concepts. If you had seen the word “SHOWER” instead of “EAT” above, you probably would’ve completed the letters as “SOAP.”
Such priming not only affects the way we think but also the way we act. Just as the mind is affected by hearing certain words and concepts, the body can be affected as well. A great example of this can be found in a study in which participants primed with words associated with being elderly, such as “Florida” and “wrinkle,” responded by walking at a slower pace than usual.
Incredibly, the priming of actions and thoughts is completely unconscious; we do it without realizing it.
What priming therefore shows is that despite what many argue, we are not always in conscious control of our actions, judgments and choices. We are instead being constantly primed by certain social and cultural conditions.
For example, research by Kathleen Vohs suggests that the concept of money primes individualistic actions. People primed with the idea of money – for example, through being exposed to images of money – act more independently and are less willing to be involved with, depend on or accept demands from others. One implication of Vohs’s research is that living in a society filled with triggers that prime money could nudge our behavior away from altruism.
Priming, just like other societal elements, can influence an individual’s thoughts and therefore choices, judgment and behavior – and these reflect back into the culture and heavily affect the kind of society we all live in.
Thinking, Fast and Slow Key Idea #3: Snap judgments
How the mind makes quick choices, even when it lacks enough information to make a rational decision.
Imagine you meet someone named Ben at a party, and you find him easy to talk to. Later, someone asks if you know anybody who might want to contribute to their charity. You think of Ben, even though the only thing you know about him is that he is easy to talk to.
In other words, you liked one aspect of Ben’s character, and so you assumed you would like everything else about him. We often approve or disapprove of a person even when we know little about them.
Our mind’s tendency to oversimplify things without sufficient information often leads to judgment errors. This is called exaggerated emotional coherence, also known as the halo effect: positive feelings about Ben’s approachability cause you to place a halo on Ben, even though you know very little about him.
But this is not the only way our minds take shortcuts when making judgments.
There is also confirmation bias, which is the tendency for people to agree with information that supports their previously held beliefs, as well as to accept whatever information is suggested to them.
This can be shown if we ask the question, “Is James friendly?” Studies have shown that, faced with this question but no other information, we’re very likely to consider James friendly – because the mind automatically confirms the suggested idea.
The halo effect and confirmation bias both occur because our minds are eager to make quick judgments. But this often leads to mistakes, because we don’t always have enough data to make an accurate call. Our minds rely on false suggestions and oversimplifications to fill in the gaps in the data, leading us to potentially wrong conclusions.
Like priming, these cognitive phenomena happen without our conscious awareness and affect our choices, judgments and actions.
Thinking, Fast and Slow Key Idea #4: Heuristics.
How the mind uses shortcuts to make quick decisions.
Often we find ourselves in situations where we need to make a quick judgment. To help us do this, our minds have developed little shortcuts to help us immediately understand our surroundings. These are called heuristics.
Most of the time, these processes are very helpful, but the trouble is that our minds tend to overuse them. Applying them in situations for which they aren’t suited can lead us to make mistakes. To get a better understanding of what heuristics are and what mistakes they can lead to, we can examine two of their many types: the substitution heuristic and the availability heuristic.
The substitution heuristic is where we answer an easier question than the one that was actually posed.
Take this question, for example: “That woman is a candidate for sheriff. How successful will she be in office?” We automatically substitute the question we’re supposed to answer with an easier one, like, “Does this woman look like someone who will make a good sheriff?”
This heuristic means that instead of researching the candidate’s background and policies, we merely ask ourselves the far easier question of whether this woman matches our mental image of a good sheriff. Unfortunately, if the woman does not fit our image of a sheriff, we could reject her – even if she has years of crime-fighting experience that make her the ideal candidate.
Next, there is the availability heuristic, which is where you overestimate the probability of something you hear often or find easy to remember.
For example, strokes cause many more deaths than accidents do, but one study found that 80 percent of respondents considered an accidental death a more likely fate. This is because we hear of accidental deaths more in the media, and because they make a stronger impression on us; we remember horrific accidental deaths more readily than deaths from strokes, and so we may react inappropriately to these dangers.
Thinking, Fast and Slow Key Idea #5: No head for numbers
Why we struggle to understand statistics and make avoidable mistakes because of it.
How can you predict whether certain things will happen?
One effective way is to keep the base rate in mind. The base rate is the underlying frequency of an event in the relevant population. For example, imagine a large taxi company whose fleet is 20 percent yellow cabs and 80 percent red cabs. The base rate for yellow cabs is therefore 20 percent, and the base rate for red cabs is 80 percent. If you order a cab and want to guess its color, remember the base rates and you will make a fairly accurate prediction.
We should therefore always remember the base rate when predicting an event, but unfortunately this doesn’t happen. In fact, base-rate neglect is extremely common.
One of the reasons we find ourselves ignoring the base rate is that we focus on what we expect rather than what is most likely. For example, imagine those cabs again: If you were to see five red cabs pass by, you’d probably start to feel it’s quite likely that the next one will be yellow for a change. But no matter how many cabs of either color go by, the probability that the next cab will be red will still be around 80 percent – and if we remember the base rate we should realize this. But instead we tend to focus on what we expect to see, a yellow cab, and so we will likely be wrong.
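A quick simulation makes the independence point concrete. This is my own sketch, assuming the 80/20 fleet from the example; nothing here comes from the book:

```python
import random

random.seed(42)

def next_cab():
    # Each cab is red with probability 0.80, independently of history.
    return "red" if random.random() < 0.80 else "yellow"

history = [next_cab() for _ in range(1_000_000)]

# How often is the cab after a streak of five reds also red?
streaks = reds_after = 0
for i in range(5, len(history)):
    if history[i - 5:i] == ["red"] * 5:
        streaks += 1
        reds_after += history[i] == "red"

print(reds_after / streaks)  # ~0.80: the streak changes nothing
```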
Base-rate neglect is a common mistake connected with the wider problem of working with statistics. We also struggle to remember that everything regresses to the mean: every process has an average level of performance, and unusually high or low results tend to be followed by results closer to that average.
For example, if a football striker who averages five goals per month scores ten goals in September, her coach will be ecstatic; but if she then goes on to score around five goals per month for the rest of the year, her coach will probably criticize her for not continuing her “hot streak.” The striker wouldn’t deserve this criticism, though, because she is only regressing to the mean!
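To see regression to the mean in action, here is a small sketch that models the striker’s monthly goals as draws from a Poisson distribution with a true average of five; the model and numbers are my own assumptions, purely for illustration:

```python
import math
import random

random.seed(7)

def goals_in_a_month(mean=5.0):
    # Knuth's simple Poisson sampler.
    threshold, k, p = math.exp(-mean), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

seasons = [[goals_in_a_month() for _ in range(12)] for _ in range(10_000)]

# Average goals in the month right after an exceptional (10+) month:
following = [season[i + 1]
             for season in seasons
             for i in range(11)
             if season[i] >= 10]
print(sum(following) / len(following))  # ~5: back to her true average
```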
Thinking, Fast and Slow Key Idea #6: Past Imperfect
Why we remember events from hindsight rather than from experience.
Our minds don’t remember experiences in a straightforward way. We have two different apparatuses, called memory selves, both of which remember situations differently.
First, there is the experiencing self, which records how we feel in the present moment. It asks the question: “How does it feel now?”
Then there is the remembering self, which records how the entire event unfolded after the fact. It asks, “How was it on the whole?”
The experiencing self gives a more accurate account of what occurred, because our feelings during an experience are always the most accurate. But the remembering self, which is less accurate because it registers memories after the situation is finished, dominates our memory.
There are two reasons why the remembering self dominates the experiencing self. The first of these is called duration neglect, where we ignore the total duration of the event in favor of a particular memory from it. Second is the peak-end rule, where we overemphasize what occurs at the end of an event.
For an example of this dominance of the remembering self, take an experiment that measured people’s memories of a painful colonoscopy. Before the colonoscopy, the patients were put into two groups: one group was given long, rather drawn-out colonoscopies, while the other was given much shorter procedures in which the level of pain increased towards the end.
You’d think the most unhappy patients would be those who endured the longer process, as their pain lasted longer. This was certainly what they felt at the time: during the procedure, when each patient was asked about the pain, their experiencing self gave an accurate answer, and those undergoing the longer procedures felt worse. However, after the experience, when the remembering self took over, those who went through the shorter process with the more painful ending felt the worst. This experiment offers a clear example of duration neglect, the peak-end rule and our faulty memories.
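One common way to formalize the remembering self is to score an episode by the average of its peak and its end, ignoring duration. The toy pain scores below are mine, not data from the study:

```python
def experienced_pain(pain_by_minute):
    # The experiencing self: total pain accumulated over time.
    return sum(pain_by_minute)

def remembered_pain(pain_by_minute):
    # Peak-end rule: average of the worst moment and the final moment;
    # duration is ignored entirely (duration neglect).
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

long_procedure = [6, 7, 8, 7, 6, 5, 4, 3, 2, 1]   # long, tapers off gently
short_procedure = [6, 7, 8]                        # short, ends at its worst

print(experienced_pain(long_procedure), experienced_pain(short_procedure))
# 49 vs 21: the long procedure involves far more total pain...
print(remembered_pain(long_procedure), remembered_pain(short_procedure))
# 4.5 vs 8.0: ...yet the short one is remembered as worse.
```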
Thinking, Fast and Slow Key Idea #7: Mind over matter
How adjusting the focus of our minds can dramatically affect our thoughts and behaviors.
Our minds use different amounts of energy depending on the task. When there’s no need to mobilize attention and little energy is needed, we are in a state of cognitive ease. Yet, when our minds must mobilize attention, they use more energy and enter a state of cognitive strain.
These changes in the brain’s energy levels have dramatic effects on how we behave.
In a state of cognitive ease, the intuitive System 1 is in charge of our minds, and the logical and more energy-demanding System 2 is weakened. This means we are more intuitive, creative and happier, yet we’re also more likely to make mistakes.
In a state of cognitive strain, our awareness is more heightened, and so System 2 is put in charge. System 2 is more ready to double-check our judgments than System 1, so although we are far less creative, we will make fewer mistakes.
You can consciously influence the amount of energy the mind uses to get in the right frame of mind for certain tasks. If you want a message to be persuasive, for example, try promoting cognitive ease.
One way to do this is to expose ourselves to repetitive information. If information is repeated to us, or made more memorable, it becomes more persuasive. This is because our minds have evolved to react positively when repeatedly exposed to the same clear messages. When we see something familiar, we enter a state of cognitive ease.
Cognitive strain, on the other hand, helps us succeed at things like statistical problems.
We can get into this state by exposing ourselves to information that is presented to us in a confusing way, for example, via hard-to-read type. Our minds perk up and increase their energy levels in an effort to comprehend the problem, and therefore we are less likely to simply give up.
Thinking, Fast and Slow Key Idea #8: Taking chances
The way probabilities are presented to us affects our judgment of risk.
The way we judge ideas and approach problems is heavily determined by the way they are expressed to us. Slight changes to the details or focus of a statement or question can dramatically alter the way we address it.
A great example of this can be found in how we assess risk.
You may think that once we can determine the probability of a risk occurring, everyone will approach it in the same way. Yet, this isn’t the case. Even for carefully calculated probabilities, just changing the way the figure is expressed can change how we approach it.
For example, people will consider a rare event as more likely to occur if it’s expressed in terms of relative frequency rather than as a statistical probability.
In what’s known as the Mr. Jones experiment, two groups of psychiatric professionals were asked whether it was safe to discharge Mr. Jones from the psychiatric hospital. The first group was told that patients like Mr. Jones had a “10 percent probability of committing an act of violence,” and the second group was told that “of every 100 patients similar to Mr. Jones, 10 are estimated to commit an act of violence.” Almost twice as many respondents in the second group denied his discharge.
Another way our attention is distracted from what is statistically relevant is called denominator neglect. This occurs when we ignore plain statistics in favor of vivid mental images that influence our decisions.
Take these two statements: “This drug protects children from disease X but has a 0.001 percent chance of permanent disfigurement” versus “One of 100,000 children who take this drug will be permanently disfigured.” The two statements are statistically equivalent, but the second summons a vivid image of a disfigured child and is far more influential, which is why it would make us less likely to administer the drug.
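It’s worth verifying that the two framings really are the same number. A two-line check (my own, using exact fractions to avoid floating-point noise):

```python
from fractions import Fraction

# "a 0.001 percent chance" and "1 of 100,000" are the same risk:
assert Fraction("0.001") / 100 == Fraction(1, 100_000)
```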
Thinking, Fast and Slow Key Idea #9: Not robots
Why we don’t make choices based purely on rational thinking.
How do we as individuals make choices?
For a long time, a powerful and influential group of economists argued that we make decisions based purely on rational argument. They claimed that we all make choices according to utility theory, which states that when individuals make decisions, they look only at the rational facts and choose the option with the best overall outcome for them, meaning the most utility.
For example, utility theory would posit this kind of statement: if you like oranges more than you like kiwis, then you’re also going to take a 10 percent chance of winning an orange over a 10 percent chance of winning a kiwi.
Seems obvious, right?
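The claim is easy to express formally: scaling two utilities by the same probability never changes their ranking. A minimal sketch, with placeholder utility numbers of my own choosing:

```python
def expected_utility(probability, utility):
    return probability * utility

u_orange, u_kiwi = 8.0, 3.0   # arbitrary: you like oranges more than kiwis
assert u_orange > u_kiwi

# A 10% chance at an orange beats a 10% chance at a kiwi, because
# multiplying both sides by 0.10 preserves the inequality.
assert expected_utility(0.10, u_orange) > expected_utility(0.10, u_kiwi)
```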
The most influential group of economists in this field centered on the Chicago School of Economics and their most famous scholar Milton Friedman. Using utility theory, the Chicago School argued that individuals in the marketplace are ultra-rational decision-makers, whom economist Richard Thaler and lawyer Cass Sunstein later named Econs. As Econs, each individual acts in the same way, valuing goods and services based on their rational needs. What’s more, Econs also value their wealth rationally, weighing only how much utility it provides them.
So imagine two people, John and Jenny, who both have fortunes of $5 million. According to utility theory, they have the same wealth, meaning they should both be equally happy with their finances.
But what if we complicate things a little? Let’s say that their $5 million fortunes are the end result of a day at the casino, and the two had vastly different starting points: John walked in with a mere $1 million and quintupled his money, whereas Jenny came in with $9 million that dwindled down to $5 million. Do you still think John and Jenny are equally happy with their $5 million?
Unlikely. Clearly then, there is something more to the way we value things than pure utility.
As we’ll see in the next book summary, since we don’t all see utility as rationally as utility theory thinks, we can make strange and seemingly irrational decisions.
Thinking, Fast and Slow Key Idea #10: Gut feeling
Why we are often swayed by emotional factors rather than making decisions based solely on rational considerations.
If utility theory doesn’t work, then what does?
One alternative is prospect theory, which Kahneman developed with Amos Tversky.
Kahneman’s prospect theory challenges utility theory by showing that when we make choices, we don’t always act in the most rational way.
Imagine these two scenarios for example: In the first scenario, you’re given $1,000 and then must choose between receiving a definite $500 or taking a 50 percent chance to win another $1,000. In the second scenario, you’re given $2,000 and must then choose between a sure loss of $500 or taking a 50 percent chance on losing $1,000.
If we made purely rational choices, then we would make the same choice in both cases. But this isn’t the case. In the first instance, most people choose to take the sure bet, while in the second case, most people take a gamble.
Prospect theory helps to explain why this is the case. It highlights at least two reasons why we don’t always act rationally. Both of them feature our loss aversion — the fact that we fear losses more than we value gains.
The first reason is that we value things based on reference points. Starting with $1,000 or $2,000 in the two scenarios changes whether we’re willing to gamble, because the starting point affects how we value our position. The reference point in the first scenario is $1,000 and $2,000 in the second, which means ending up at $1,500 feels like a win in the first, but a distasteful loss in the second. Even though our reasoning here is clearly irrational, we understand value as much by our starting point as by the actual objective value at the time.
Second, we’re influenced by the diminishing sensitivity principle: the value we perceive may be different from its actual worth. For instance, going from $1,000 to $900 doesn’t feel as bad as going from $200 to $100, despite the monetary value of both losses being equal. Similarly in our example, the perceived value lost when going from $1,500 to $1,000 is greater than when going from $2,000 to $1,500.
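Both effects can be captured in a simple value function. The sketch below uses the functional form and parameter estimates (alpha = beta = 0.88, lambda = 2.25) from Tversky and Kahneman’s 1992 paper on cumulative prospect theory; they are illustrative here, not figures from this book:

```python
def value(outcome, reference=0.0, alpha=0.88, beta=0.88, lam=2.25):
    # Outcomes are coded as gains or losses relative to a reference point.
    g = outcome - reference
    if g >= 0:
        return g ** alpha            # concave for gains: diminishing sensitivity
    return -lam * (-g) ** beta       # steeper for losses: loss aversion

# The same $1,500 final position feels very different in the two scenarios:
print(value(1_500, reference=1_000))   # ~ +237: a $500 gain
print(value(1_500, reference=2_000))   # ~ -534: a $500 loss, felt over twice as hard
```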
Thinking, Fast and Slow Key Idea #11: False images
How the mind builds complete pictures to explain the world, and why they lead to overconfidence and mistakes.
In order to understand situations, our minds naturally use cognitive coherence; we construct complete mental pictures to explain ideas and concepts. For example, we have many images in our brain for the weather. We have an image for, say, summer weather, which might be a picture of a bright, hot sun bathing us in heat.
As well as helping us to understand things, we also rely on these images when making a decision.
When we make decisions, we refer to these pictures and build our assumptions and conclusions based on them. For example, if we want to know what clothes to wear in summer, we base our decisions on our image of that season’s weather.
The problem is that we place too much confidence in these images. Even when available statistics and data disagree with our mental pictures, we still let the images guide us. In summer, the weather forecaster might predict relatively cool weather, yet you might still go out in shorts and a T-shirt, as that’s what your mental image of summer tells you to wear. You may then end up shivering outside!
We are, in short, massively overconfident of our often faulty mental images. But there are ways to overcome this overconfidence and start making better predictions.
One way to avoid mistakes is to utilize reference class forecasting. Instead of making judgments based on your rather general mental images, use specific historical examples to make a more accurate forecast. For example, think of the previous occasion you went out when it was a cold summer day. What did you wear then?
In addition, you can devise a long-term risk policy that plans specific measures in the case of both success and failure in forecasting. Through preparation and protection, you can rely on evidence instead of general mental pictures and make more accurate forecasts. In the case of our weather example, this could mean bringing along a sweater just to be safe.
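Here is a minimal sketch of reference class forecasting, with made-up historical temperatures standing in for your past “cool summer day” experiences:

```python
import statistics

# Observed highs (in °C) on previous cool summer days:
past_cool_summer_days = [15, 17, 14, 18, 16, 13, 17]

baseline = statistics.median(past_cool_summer_days)
low, high = min(past_cool_summer_days), max(past_cool_summer_days)

print(f"Plan for about {baseline}°C (historically {low}-{high}°C)")
# At 13-18°C, shorts and a T-shirt are a bad bet: pack the sweater.
```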
Takeaways
Thinking, Fast and Slow shows us that our minds contain two systems. The first acts instinctively and requires little effort; the second is more deliberate and requires much more of our attention. Our thoughts and actions vary depending on which of the two systems is in control of our brain at the time.
Actionable advice
Repeat the message!
Messages are more persuasive when we’re repeatedly exposed to them. This is probably because we evolved to treat repeated exposure without bad consequences as a signal that something is safe, and therefore good.
Don’t be influenced by rare statistical events that are over-reported in newspapers.
Disasters and other events are an important part of our history, but we often overestimate their statistical probability because of the vivid images we associate with them from the media.
You’re more creative and intuitive when you’re in a better mood.
When you’re in a better mood, the part of the mind that is alert and analytical tends to relax. That cedes control of your mind to the more intuitive, quicker-thinking system, which also makes you more creative.
April 2024: Reflecting on Kahneman’s Legacy and Final Contributions
As we revisit the profound insights of Daniel Kahneman’s “Thinking, Fast and Slow,” it is with a sense of deep respect and a touch of melancholy that we acknowledge the passing of a titan in the field of psychology and economics. Since my initial review in 2019, the relevance and impact of Kahneman’s work have only grown, underscoring the enduring value of his contributions to our understanding of human cognition, decision-making, and the interplay between the intuitive and the analytical.
In the twilight of his career, Kahneman continued to push the boundaries of knowledge, turning his attention to the systemic noise that affects decision-making processes in organizations and the potential of algorithmic interventions to improve judgment. His later work, particularly in collaboration with Cass Sunstein and Olivier Sibony on “Noise: A Flaw in Human Judgment,” offers crucial insights into the variability of human judgment and the ways in which it can lead to errors in critical decisions across various domains.
Kahneman’s dedication to exploring the nuances of how we think, decide, and act has left an indelible mark on the fields of economics, psychology, and beyond. His ability to translate complex psychological concepts into accessible and engaging narratives has not only enlightened scholars and practitioners but has also provided valuable lessons for the general public. His work continues to inspire a deeper, more nuanced understanding of the human mind and its biases, guiding us towards better decision-making both at an individual and organizational level.
As we reflect on Kahneman’s legacy, it’s clear that his contributions extend far beyond his seminal works. His intellectual curiosity, rigorous scientific approach, and commitment to improving human understanding have paved the way for future generations of researchers and thinkers. Kahneman’s passing is a significant loss to the academic community and to all who found in his writings a source of insight and inspiration. Yet, his legacy endures, fostering a continued exploration of the mysteries of the human mind.
In updating my review of “Thinking, Fast and Slow,” I am reminded of Kahneman’s unparalleled ability to challenge and expand our perspectives on rationality, intuition, and the complexity of human nature. As we mourn his loss, we also celebrate his life and contributions, which will continue to influence and inform our understanding of ourselves and the world around us for years to come.
Further reflections
I recently wrote a series of training materials and courseware based on the following articles:
In closing, let us carry forward the spirit of inquiry and critical thinking that Daniel Kahneman exemplified throughout his life. His work serves as a reminder of the power of interdisciplinary research, the importance of questioning conventional wisdom, and the potential for scientific inquiry to enhance human welfare. Kahneman’s journey may have ended, but his ideas, like the legacy of any great thinker, will continue to provoke thought, inspire change, and shape the future of how we understand the architecture of the mind and the intricacies of decision-making.
Keep Reading
I like Daniel Kahneman’s work, and I also really enjoyed David Epstein’s Range. Check out my review of Range: Why Generalists Triumph in a Specialized World here!