A Thought After Reading Daniel Kahneman’s “Thinking, Fast and Slow”


I can tell that Daniel Kahneman’s book, Thinking, Fast and Slow, is one of those books that I will find myself coming back to over and over again. Why? On the one hand, it provides evidence from experimental psychology for phenomena that I was already inclined to believe; that is, it appeals to my own confirmation bias. At the same time, it offers a number of insights and explanations that are new to me and eye-opening. Kahneman, a psychologist, won a Nobel Prize in Economics for his work on decision-making under uncertainty. His book reflects a lifetime of learning about human judgement and is an absolute delight to read.

Its central framework revolves around the idea that our minds address problems in two distinct ways, which Kahneman describes using the metaphors of System 1 and System 2. The table below summarizes some of the characteristics of each system.

System 1                                                                 | System 2
Fast                                                                     | Slow
Intuitive                                                                | Deliberate
Automatic, cannot be turned off at will                                  | Effortful, and therefore lazy
Impulsive                                                                | Requires self-control
Associative and causal, but lacks the capacity for statistical thinking  | Requires training for statistical thinking
Prone to bias and to believing                                           | In charge of doubting and unbelieving

Much of Kahneman’s book is then focused on System 1 and its characteristics and biases. We learn of its need for coherence and the central role of cognitive ease in driving belief. System 2 is much less explored in the book, and I was left with the desire to learn more about what can be done to strengthen our use of it.

Why would we want to strengthen our reliance on System 2? If we are willing to rely on expertise, it is not obvious that doing so would necessarily lead to better social outcomes. System 1’s intuitive thought draws both on heuristics (shortcuts that replace a complex problem with a simpler one) and on expertise (recognition of information previously acquired). Becoming an expert means you can draw on your acquired information to reach better conclusions with less effort. In other words, the more we rely on experts, the more likely we are to avoid incorrect conclusions in a world governed by System 1 thinkers.

However, we are unlikely (and probably unable) to seek expertise in the myriad problems and decisions we encounter daily. And as Kahneman points out in a reference to the work of another psychologist, Daniel Gilbert, System 1 is prone to believe (Kahneman, 2011, Chapter 7); “unbelieving” is the effortful task of System 2. If we want to form opinions and draw conclusions about much of the information we encounter on any given day, we will need to spend energy, potentially a lot of it: we will need to be willing to lead an effortful, even if potentially fulfilling, life. Those less susceptible to System 1’s biases and more prone to calling on System 2 are the people Kahneman calls “engaged.”

Should we strive to be more “engaged”? If so, how?

References

Kahneman, Daniel, 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.


Repetition and Belief


“You can fool all the people some of the time and some of the people all the time, but you cannot fool all the people all the time.” Respond quickly: who is the author of this statement? The most common attribution is Abraham Lincoln, although, as often happens with quotes, there is little evidence to support it. There seems to be some evidence that the statement was actually made by the seventeenth-century French Protestant Jacques Abbadie (Quote Investigator, 2013; Parker, 2016). However, we have heard the attribution to Abraham Lincoln so often that we assume it to be true.

Over the end-of-year holidays I read (most of) Daniel Kahneman’s book, Thinking, Fast and Slow. Kahneman, a psychologist, won the Nobel Prize in Economics in 2002 (shared with Vernon Smith) for his research on human judgement and decision-making under uncertainty. One aspect of human cognition he describes in the book is that we are more likely to believe what we find easy to process. Various factors can contribute to our “cognitive ease,” including a clear display of the information we are exposed to, having been “primed” by association with a prior piece of information, being in a good mood, and repeated exposure to the information, whether that information is true or not (Kahneman, 2011, Chapter 5).

Repetition is also often discussed as an important component in learning (see, for example, the literature on spaced repetition), change management, and other aspects of our daily life that depend on our understanding and perception of reality. I once read (and never forgot) a passage in Machiavelli’s “The Prince” where he states that injuries should be inflicted all at once, while benefits should be provided piecemeal, over time, if a ruler is to ensure permanence in power. I always understood this as reflecting how repetition affects perception (Machiavelli, 1998, Chapter VIII).

Recent political discussions in the U.S. have invoked the idea of the “Big Lie”: that implausibly large lies are often easier to sell to the public than small ones… if sufficiently repeated (RationalWiki contributors, 2021). A paper by Fazio et al. (2015) argues, based on a pair of experiments, that repetition of false information affects belief even when those exposed know better, an effect they call “knowledge neglect,” which reflects a primacy of processing fluency (cognitive ease) over retrieval of knowledge under certain conditions, repetition among them.

What do I take from the above? The more confident I am in newly acquired knowledge, the more I will repeat it and remind myself of it, to better internalize it and harness the power of repetition. The less confident I am about new knowledge, the more suspicious I will be when I see it repeated. I guess that is a New Year’s resolution and, yes, those work: I have heard it many times.

References

Fazio, L. K., N. M. Brashier, B. K. Payne, and E. J. Marsh, 2015. “Knowledge Does Not Protect Against Illusory Truth.” Journal of Experimental Psychology: General 144(5): 993–1002. Available: https://www.apa.org/pubs/journals/features/xge-0000098.pdf. Accessed: January 18, 2021.

Kahneman, Daniel, 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.

Machiavelli, Niccolò, 1998. The Prince. Translated by W. K. Marriott. The Project Gutenberg eBook. Available: https://www.gutenberg.org/files/1232/1232-h/1232-h.htm#chap08. Accessed: January 09, 2021.

Parker, David B., 2016. “‘You Can Fool All the People’: Did Lincoln Say It?” History News Network. Available: https://historynewsnetwork.org/article/161924. Accessed: January 09, 2021.

Quote Investigator, 2013. You cannot fool all the people all the time. Available: https://quoteinvestigator.com/2013/12/11/cannot-fool/#return-note-7793-2. Accessed: January 09, 2021.

RationalWiki contributors, 2021. “Big Lie,” In RationalWiki. Available: https://rationalwiki.org/wiki/Big_lie. Accessed: January 09, 2021.

