
Introduction:
Imagine arriving in a new city and encountering a heavy crowd during your commute. Now, imagine watching a crime series set in that city. In both scenarios, we jump to conclusions about the city without deeply analyzing other factors. We do so using mental shortcuts known as heuristics.
To navigate our daily lives efficiently, we rely on heuristics to solve complex problems with less time and effort (the Thinking Fast part). But these heuristics can lead to systematic errors, known as biases, which is why we need their deliberate counterpart (the Thinking Slow part). Life, in essence, is an interplay between these two cognitive processes.
Two Selves:
Our brain operates with two distinct systems, almost like two characters or selves. System 1 represents the intuitive, fast-thinking part that requires minimal effort, while System 2 is the deliberate, slow-thinking part demanding more effort. For example, solving 2 + 2 falls under System 1, while 17 × 24 falls under System 2.
As per the Law of Least Effort, our brain tends to opt for the path of least resistance, with System 2 taking a backseat and only intervening when System 1 fails. However, there are situations where System 1 erroneously believes it has the answer, leading us to make errors in our judgment.
While the book covers tons of these biases and fallacies, I am going to highlight only a few that I noted on my phone. (No wonder I was swayed by the Law of Least Effort here!)
Biases and Fallacies:
Law of Small Numbers: Small samples produce more extreme statistics, whether positive or negative. For instance, a village with a population of just 1,000 might have the highest (or lowest) per capita income in a country. (See the quick simulation after this list.)
Anchoring effect: Our judgment is pulled toward a number we encountered first. For instance, if we are first asked whether a person died before or after the age of 80, and then asked to guess the exact age, most of our guesses land high, dragged toward the anchor of 80.
Availability bias: Judging by how easily instances come to mind, like overestimating the probability of plane crashes because of their extensive media coverage.
Conjunction fallacy: Judging by the average rather than the sum, as when people value an expensive gift less once a cheap souvenir is bundled with it.
Planning fallacy: Defaulting to the best-case scenario, like assuming it will take 3 months to build a product, 1 month to take it to market, and 2 weeks for people to go crazy over it.
Sunk-cost fallacy: “I know the product is going to suck, but since I have already invested so much time in it, I will invest even more.”
Hindsight bias: Confident claims like “I knew it all along,” made only after the outcome is known.
The illusion of validity: Why experts keep believing they are right, even when their long-term predictions are barely better than chance.
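To make the Law of Small Numbers concrete, here is a minimal Python sketch (my own illustration, not from the book). It assumes everyone’s income is drawn from the same distribution, so no village is “really” richer than any other; the extremes in the averages come purely from sample size.

```python
import random

random.seed(42)  # reproducible illustration

def average_income(population: int) -> float:
    # Every resident's income is drawn from the same distribution
    # (mean 50,000, standard deviation 20,000), so no place is
    # genuinely richer than another.
    incomes = (random.gauss(50_000, 20_000) for _ in range(population))
    return sum(incomes) / population

# 100 small villages (1,000 people each) vs. 100 big cities (100,000 each).
villages = [average_income(1_000) for _ in range(100)]
cities = [average_income(100_000) for _ in range(100)]

print(f"villages: min={min(villages):,.0f}  max={max(villages):,.0f}")
print(f"cities:   min={min(cities):,.0f}  max={max(cities):,.0f}")
```

On a typical run, village averages swing by a few thousand around 50,000 while city averages barely move, so the “richest” and “poorest” places in this toy country are always small villages.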
Okay, enough about biases and fallacies. Before concluding, let me give a quick tour of our actions and memories, the focal point of the book’s final part. Similar to the two selves discussed earlier, we have two selves when it comes to experiencing and remembering our actions.
Experiencing and Remembering Selves:
Our experiencing self lives in the moment, while our remembering self dominates our evaluation of events (that is, our memory). For instance, when reflecting on college, our remembering self fixates on peak moments and, above all, on how things ended. (Which explains why breakups and divorces feel so painful: a bad ending colors the whole memory.) You might ask how this relates to our topic of judgment and error. Well, more often than not, our final judgment is driven by our memory of an experience, not the experience itself. (Maybe we shouldn’t let our remembering self ruin Game of Thrones over one horrible final season.)
Conclusion:
Our System 1, which often makes our lives easier, also makes us prone to biases and fallacies. This doesn’t align with the assumption of rational behavior long championed by economists in their theories. (Daniel Kahneman’s work in this area played a pivotal role in creating a new field of study, behavioral economics. Google it for more.)
My takeaway:
While it’s difficult to detect errors driven by System 1, we can learn to identify the potential minefields and take extra caution (i.e., think slow) whenever we encounter them. That, to me, is the goal of reading this wonderful and insightful book.






