Thinking Fast and Slow | 11 Key learnings from this book


Thinking Fast and Slow offers a description of the two main modes our brains use. Much like a computer, the brain comprises systems. System 1 is quick, intuitive, and emotional; System 2 is slower, more deliberate, and more logical. Daniel Kahneman encourages us to loosen our dependence on System 1, which is the primary source of snap judgments and errors, and to tap into the more thoughtful, rational System 2 more often. Alongside this suggestion, Kahneman explains why and how we make the choices we do.

1. Systems 1 Is Innate

Our thinking relies on two distinct systems. For each, Kahneman describes the fundamental tasks and the decision-making methods that go with it.

System 1 encompasses the abilities that are innate and common to other creatures in the animal kingdom. For instance, each of us can detect objects, orient our attention toward important stimuli, and fear things that could cause death or illness. System 1 also handles mental activities that have become nearly automatic through practice; most of these activities are transferred to System 1 through long-term repetition, so certain pieces of knowledge become automatic for you.

For instance, you do not have to think about what the capital of England is. Over time, you have developed an automatic association with the question "What is the capital of England?", and the answer comes to mind intuitively. System 1 also handles learned skills such as reading a book or riding a bicycle, and it governs how you behave in everyday social situations.

Some actions that normally fall under System 1 can move into System 2 when you actively engage with them. For instance, chewing typically falls under System 1. But suppose you notice that you ought to chew your food more thoroughly than you have been; in that scenario, chewing shifts to the more effortful System 2.

Attention is frequently linked to both systems, and they operate in concert. For instance, System 1 controls your immediate, involuntary reaction to an obnoxious sound, while System 2 then takes over, directing your attention to the sound voluntarily and supplying logical explanations for its cause.

System 1 is the filter through which you evaluate your experiences and the system you employ to make intuitive decisions. It is the brain's oldest mechanism: evolutionarily primitive, automatic, and driven by impulse. Even though you may think it has little impact on your daily life, this system influences the majority of your judgments and decisions.

2. System 2 can Manage Parts of System 1

System 2 handles a wide variety of tasks. All of them require attention and are disrupted when your attention is diverted; without focus, performance on these tasks deteriorates. In addition, System 2 can modify the way System 1 operates.

For instance, detection is usually an operation that System 1 performs, but you can program yourself, by means of System 2, to look for a particular individual in a crowd. This System 2 priming lets your System 1 perform better, making it more likely that you will locate the person you are looking for. It is the same procedure we employ whenever we conduct a deliberate search.

Since System 2 activities require concentration, they are typically more demanding than System 1 activities, and it is difficult to perform two System 2 tasks at once. The few tasks that can be done simultaneously fall at the undemanding end of the spectrum, like holding a conversation while driving on an empty road; it is not advisable to keep talking while trying to pass a truck on a narrow road. The more attention an activity requires, the harder it is to complete another System 2 task at the same time.

System 2 is younger; it developed over the last few millennia and has become more and more crucial with modernization and changing priorities. Most System 2 activities require deliberate attention, for example, giving someone your phone number. The operations of System 2 are associated with the experience of choice, agency, and concentration. When we think of ourselves, we identify with System 2: the conscious, reasoning self that holds beliefs, makes choices, and decides what to think about and what to do.

3. The Two Systems Work Together

From the descriptions of the two systems, it is easy to imagine that they operate independently of each other. Kahneman says the two systems are actually interconnected and mutually supporting: the majority of tasks draw on both, and the systems complement each other. For instance, emotional reactions (System 1) are essential to applying the logic of reasoning (System 2), making our decision-making more effective and meaningful.

Another instance of the two systems working together is playing a sport. Certain aspects of a sport involve automatic actions. Think about a tennis match: it requires running, a natural human skill controlled by System 1. Hitting the ball can also become a System 1 task through practice. But certain strokes or tactical choices will require System 2. Thus the two systems complement one another when you play a sport like tennis.

Problems arise when people rely too heavily on System 1 because it takes less effort. Problems also arise with actions that are not part of your normal routine; this is when Systems 1 and 2 come into conflict.

4. Heuristics As Mental Shortcuts

The second portion of the book introduces readers to the concept of heuristics. Heuristics are the mental shortcuts we develop when we make choices. We constantly seek to solve problems as efficiently as possible, so heuristics can be extremely effective at conserving energy in our daily lives. For instance, heuristics allow us to automatically apply previous knowledge to new situations.

While heuristics are often beneficial, it is important to recognize that they can also underlie discrimination. For instance, you might have a negative encounter with someone belonging to a particular ethnic group; if you then rely solely on your heuristics, you could be biased against other members of the same ethnicity. Heuristics can also trigger cognitive biases that lead to systematic errors in reasoning, bad decisions, and misinterpretations.

5. The Biases we create in our own minds

Kahneman discusses seven common biases and heuristics that can cause poor decision-making:

  1. Law of Small Numbers: This law exposes our mistaken belief that small samples closely resemble the population from which they are drawn. Many people underestimate the degree of variability in small sample sizes; in other words, they put too much faith in what a small study can establish. Suppose a drug works for 80% of patients. How many patients will benefit if only five are treated? In reality, there is only about a 41% chance that exactly four of the five will respond.
  2. Anchoring: When making decisions, people rely too heavily on prior information, or on the first piece of information they encounter. This is called anchoring bias. If you first look at an item priced at $1,200 and then see another priced at $100, you are likely to judge the second one cheap. If you had seen only the $100 item, you would not necessarily consider it cheap. The anchor, the first price you saw, had an unintentional influence on your decision.
  3. Priming: Our brains work through associations between words and objects, so we are prone to being primed: an event triggers an association that guides us in a specific direction when we make choices. Kahneman notes that priming is the basis for nudges and for advertisements that use positive images. For instance, Nike primes for feelings of achievement and exercise. 
    1. When beginning a new sport or trying to keep fit, customers tend to think of Nike products. Nike positions itself as a professional athlete's brand and uses the slogan "Just Do It" to evoke the endurance and success of athletes. Another example: a restaurant owner with excess Italian wine on hand could nudge customers toward those bottles by playing Italian music in the background.
  4. Cognitive ease: Anything that is easier to process is more likely to be accepted as true. Ease comes from repeated exposure to an idea, clear presentation, a primed association, and even one's mood. As a result, constant repetition of a falsehood can lead people to believe it, even when they know it is untrue, because the notion becomes familiar and easy to process. 
    1. A good example is a person surrounded by a group that believes and frequently repeats false information. Even when evidence shows the notion is untrue, the ease of processing the familiar belief makes believing it simpler.
  5. Making assumptions without thinking: Kahneman describes our System 1 as a machine that operates by jumping to conclusions. These conclusions rest on the principle "What you see is all there is": System 1 draws conclusions from easily accessible, and sometimes inaccurate, data, and once those conclusions are drawn, we tend to trust them completely. The halo effect, confirmation bias, framing effects, and base-rate neglect are all real-world manifestations of this tendency.
    • The halo effect is when you assign extra positive attributes to a person or thing based on a single positive impression. For example, believing that a person is smarter than they really are because they are good-looking.
    • Confirmation bias is when you hold certain beliefs and search for information that confirms them, while avoiding information that contradicts them. For instance, a detective might identify a suspect early in a case but then seek only confirming, rather than disconfirming, evidence. Filter bubbles, or "algorithmic editing," can amplify confirmation bias on social media: the algorithms present users only with posts and information they are likely to agree with, instead of exposing them to different viewpoints.
    • Framing effects concern how the presentation of a problem affects people's behavior. Individuals tend to be risk-averse when a choice is framed positively and risk-seeking when the same choice is framed negatively. In one study, most Ph.D. students registered early when a penalty for late registration was imposed; the number dropped to 67% when the same fee difference was framed as a discount for early registration.
    • Finally, base-rate neglect (or the base-rate fallacy) is our tendency to focus on individuating information rather than base-rate information. Individuating information is specific to a particular person or event; base-rate information is objective statistical data. We tend to give more weight to the specific data and, in many cases, ignore the base rate entirely.
      • This means we tend to base our judgments on individual traits instead of on the statistics of the general population. Consider an illustration of the base-rate fallacy in which false positives outnumber true positives. Suppose 100 out of 1,000 people test positive for an infection, but only 20 actually have the disease. The result is that 80 tests are false positives. The probability of a positive result depends on several factors, including the accuracy of the test and the population being tested. When the prevalence, the percentage of the population that has the condition, is lower than the test's false-positive rate, even a test with a low chance of producing a false positive in any single case will generate more false positives than true positives overall.
      • Another example: even if a student in your Chemistry elective looks and behaves like a stereotypical doctor, the likelihood that they are studying medicine is still slim, because medical programs typically admit only about 100 students, compared with thousands of students in faculties such as Business and Engineering. 
      • Although it is possible to make quick judgments about individuals from limited information, we should not let those judgments override baseline statistical data.
  6. Availability: This bias happens when we rely on a significant, recent, or especially memorable event to reach our conclusions. People operating under System 1 are more susceptible to the availability bias than others. One illustration of this bias is watching the news and hearing that there has been a massive plane crash in another country. 
    1. If you had a flight booked for the next week, you might fall prey to the false belief that your flight is also likely to go down.
  7. The sunk-cost fallacy: This fallacy occurs when people keep pouring resources into a losing account despite having better options available. For instance, when investors let the price at which they bought a stock dictate when they sell it, they fall victim to the sunk-cost fallacy. 
    1. The tendency of investors to sell winning stocks too early and hold losing stocks for too long has been studied extensively. Another instance is staying in a relationship long after it has become emotionally destructive. 
    2. People are afraid to start over because it implies that everything they did previously was worthless, yet this anxiety is often more damaging than letting go. It is also why people become dependent on gambling. To overcome this fallacy, avoid escalating your commitment to something that is likely to fail.
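The 41% figure in the law-of-small-numbers item above can be checked with a simple binomial calculation. A minimal sketch in Python, assuming the drug works for 80% of patients (the response rate consistent with that figure):

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials,
    each succeeding with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Chance that exactly 4 of 5 treated patients respond,
# given an assumed 80% response rate:
p_four_of_five = binom_pmf(4, 5, 0.8)
print(round(p_four_of_five, 4))  # 0.4096, i.e. about 41%
```

Even with a true response rate of 80%, only about 41% of five-patient trials show exactly four responders: small samples are far noisier than intuition suggests.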
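The screening arithmetic behind base-rate neglect is equally quick to verify. A sketch using the hypothetical numbers from the testing example (1,000 people tested, 100 positive results, 20 truly ill):

```python
# Hypothetical figures from the base-rate example in the text.
tested = 1000
positives = 100        # people who test positive
true_positives = 20    # people who test positive AND have the disease

false_positives = positives - true_positives           # 80
p_disease_given_positive = true_positives / positives  # 0.2

print(false_positives, p_disease_given_positive)  # 80 0.2
```

Even after a positive result, the chance of actually having the disease is only 20%, because the base rate of the disease is low relative to the test's false-positive rate.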

6. A Regression Back to the Mean

Regression toward the mean refers to the statistical fact that extreme results tend to be followed by more average ones: over a long series of trials, outcomes converge toward the average. Yet humans tend to read lucky and unlucky streaks as indicators of what is to come, e.g., "I've lost five pulls on a slot machine in a row, so I'm due for a win." This belief is linked to several mental flaws that Kahneman examines as follows:

  • The illusion of understanding: We create narratives to help us make sense of the world, searching for causality where there is none.
  • The illusion of validity: Pundits, stock pickers, and other experts vastly overestimate the reliability of their own expertise.
  • Expert intuition: Algorithms applied with discipline usually outperform experts and their intuition.
  • Planning fallacy: People underestimate the costs and time of a project, and overestimate its benefits, simply because they have a plan.
  • Optimism and the entrepreneurial delusion: Most people are overconfident, overlook their competitors, and believe they are better than average.
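The slot-machine belief above ("I'm due for a win") can be tested with a small simulation. A sketch assuming independent spins with an illustrative 10% win probability:

```python
import random

random.seed(42)

def p_win_after_losses(p_win=0.1, streak=5, trials=100_000):
    """Estimate the win probability on a spin that immediately follows
    `streak` consecutive losses. With independent spins it should stay
    near p_win, no matter how long the losing streak."""
    wins = qualifying = 0
    losses_in_a_row = 0
    for _ in range(trials):
        spin_wins = random.random() < p_win
        if losses_in_a_row >= streak:   # this spin follows a losing streak
            qualifying += 1
            wins += spin_wins
        losses_in_a_row = 0 if spin_wins else losses_in_a_row + 1
    return wins / qualifying

print(round(p_win_after_losses(), 3))  # stays close to 0.1
```

The win rate right after a five-loss streak stays near the baseline 10%: past losses carry no information about the next spin, which is exactly why "being due" is a fallacy.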

7. Hindsight Importantly influences Decision Making

Through a series of examples, Daniel Kahneman shows that the past is not as well understood as we believe. He focuses on hindsight, one of the biases with the most damaging effects on decision-making. In particular, hindsight shifts the yardstick used to judge decisions away from the soundness of the decision process and toward the character of the final outcome. Kahneman notes that actions that seemed prudent in foresight can look recklessly negligent in hindsight.

A common human limitation is our inability to accurately reconstruct past states of knowledge, or beliefs that have since changed. Hindsight bias has a substantial influence on the judgments of decision-makers: it leads observers to assess the quality of a decision not by whether the decision process was sound, but by whether its outcome turned out to be good or bad.

Hindsight is particularly unkind to decision-makers who act as agents for others: physicians, financial advisors, third-base coaches, CEOs, social workers, diplomats, and politicians. We tend to blame decision-makers for good decisions that worked out badly and give them too little credit for successful moves that appear obvious only after the fact. In short, humans exhibit a clear outcome bias.

Although hindsight and outcome bias generally breed risk aversion, they can also hand undeserved rewards to reckless gamblers. One example is entrepreneurs who make risky bets and simply happen to get lucky. Lucky leaders are rarely penalized for having taken too much risk.

8. Risk Aversion

Kahneman observes that humans are risk-averse, which is to say that we prefer to steer clear of risk whenever we can. People generally avoid risk because of the possibility of ending up with the least favorable outcome. Presented with a choice between a gamble and a sure amount equal to the gamble's expected value, they will pick the guaranteed amount.

The expected value is determined by multiplying each possible outcome by the probability that it will occur and then adding up those numbers. A risk-averse decision-maker will choose the certain option even when it is worth less than the expected value of the gamble. In essence, they pay a price to stay clear of risk.
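The expected-value computation described above can be sketched in a few lines of Python, using an illustrative gamble (a 50% chance of $1,000, otherwise nothing) and a hypothetical sure amount of $450 that a risk-averse person might accept instead:

```python
# An illustrative gamble: 50% chance of $1000, 50% chance of $0.
outcomes = [(1000, 0.5), (0, 0.5)]

# Expected value: each payoff weighted by its probability, summed.
expected_value = sum(payoff * prob for payoff, prob in outcomes)
print(expected_value)  # 500.0

# A risk-averse person might accept a sure $450 instead of the gamble;
# the $50 gap is the premium they pay to avoid risk.
certainty_equivalent = 450
risk_premium = expected_value - certainty_equivalent
print(risk_premium)  # 50.0
```

Taking the sure $450 over a gamble worth $500 on average is the signature of risk aversion: the difference is the price paid for certainty.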

9. Loss Aversion

Kahneman then introduces the idea of loss aversion. Many of the choices we face in life mix risk and reward: there is a risk of losing and a chance of gaining, and we have to decide whether to take the gamble.

Loss aversion describes the relative power of two motives: we are more strongly driven to avoid losses than to achieve gains. The reference point may be the status quo, but it can also be a goal in the near future; in that case, falling short of the goal is a loss, while exceeding it is a gain.

The two motivations are not equally strong: aversion to falling short of a goal is much stronger than the desire to exceed it. Thus, people typically set modest goals that they aim to meet but not necessarily surpass, and they may reduce their efforts once an immediate goal is reached. The result can run contrary to economic logic.

Kahneman further explains that people evaluate losses and gains rather than total wealth, and the weights they place on outcomes differ from the actual probabilities. When all options are bad, people become risk-seeking, accepting a high probability of making things worse in exchange for a small chance of avoiding a major loss.

This kind of risk-taking frequently turns manageable failures into catastrophes. Because defeat is so difficult to accept, the losing side in a war typically keeps fighting long past the point at which the other side's victory is certain.
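Loss aversion is often modeled with the prospect-theory value function. A sketch using Tversky and Kahneman's published 1992 parameter estimates (loss-aversion coefficient λ ≈ 2.25, curvature α ≈ 0.88); these numbers come from their research paper rather than this summary, so treat them as an illustrative assumption:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of a gain or loss x relative to the
    reference point. Parameters are Tversky & Kahneman's 1992
    median estimates (assumed here for illustration)."""
    if x >= 0:
        return x ** alpha          # gains: concave, diminishing
    return -lam * (-x) ** alpha    # losses: steeper by factor lam

gain = value(100)    # subjective value of gaining $100
loss = value(-100)   # subjective value of losing $100
print(round(-loss / gain, 2))  # 2.25: the loss looms larger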

10. Don’t Believe that your preferences reflect Your Personal Interests

On the subject of decisions, Daniel Kahneman notes that we all assume our choices are made in our best interests. That is not always the case. Our memories, which are neither always accurate nor always accurately interpreted, significantly affect our decisions.

Choices that do not produce the most satisfying experience are bad news for believers in the wisdom of human choice. We cannot trust our decisions to reflect our true interests, and this holds even when those decisions are founded on personal and recent experience.

11. Our Memories Influence Our Choices

Memories shape our decisions, and, worryingly, they can be wrong. This unreliability is built into the nature of our brains. Our minds have strong preferences about the duration of pleasure and pain: we want pain to be brief and satisfaction to endure. Memory, a function of System 1, has evolved to capture the most intense moments of an episode of pleasure or pain rather than its duration. A memory that ignores duration will not serve our preference for long-lasting pleasure and short pain.

A single happiness score cannot accurately represent the emotions of a moment or an episode. Although positive and negative emotions can be present at the same time, most moments of our lives can be classified as predominantly positive or negative. A person's mood at any given moment depends mostly on the situation at hand, while their overall happiness fluctuates from day to day.

In a Nutshell

Thinking Fast and Slow outlines the way humans function. We run on two different systems that support each other in synergy. Problems arise when we depend too much on our fast, impulsive System 1; this dependence produces a variety of biases that adversely affect decision-making. The key is to recognize the sources of these biases and then engage the analytical System 2 accordingly.

If you found my post helpful, then do share it with your friends and colleagues. If you have any feedback/questions, you may leave a comment below.

