Our Brains and Decision Making

I recently read David Eagleman’s book “The Brain: The Story of You,” and I also watched the associated PBS documentary. The two follow each other very closely, and the documentary lets you see some of the people and experiments referred to in the book. A couple of points made by Eagleman made me think again about the common use of the terms “data-driven decision making” and “data-informed decision making,” the extent to which our decisions are based on the evidence presented to us, and what evidence, exactly, we base them on.

I will leave a more detailed review of the terms “data-driven,” “data-informed,” and “evidence-based” for another post. But these terms are typically used without recognition of how much our prior beliefs and assumptions shape our analysis of data. From the moment we ask ourselves a question, we are choosing what interests us. When we decide what data to look at, we are making assumptions about what data matter, based on our experience, reasoning, and assumed knowledge of the world. When we actually obtain data, our analysis is constrained by data availability and by what the data represent: how they were defined and collected. Recognizing this dependency on prior beliefs and “learned” experiences should make the term “data-driven” highly problematic. But it should also make us question what exactly we mean by “data-informed” or “evidence-based” decision making. What evidence are we talking about, and how exactly are we using it?

With this in mind, I found a couple of points made by Eagleman to be illuminating.

One of the points is that decisions often (perhaps most often) require connecting the analytical parts and networks of the brain to the emotional ones. Without these connections to our emotions, we are often unable to make decisions. The book provides a couple of examples, such as a woman who, after a motorcycle accident weakened these brain connections, found herself unable to make daily decisions, such as what to wear, what to eat, or what to do during the day. Another example was an experiment in which decisions were reversed when emotional factors were brought into play, even though the choices were, analytically speaking, unaltered. The insight is that choices often involve many factors offering trade-offs, and that our logical brain often cannot assign values to those trade-offs in order to reach a decision. The values are assigned based on bodily and emotional signatures built from past experience; without them, decisions are often not possible. These signals are often embodied in the release or suppression of chemical messengers, such as dopamine or oxytocin, that affect the transmission of stimuli between neurons. As we acquire new experiences, the responses these messengers produce in our brains are adjusted based on the confirmation or frustration of past expectations (the differences between expectations and reality). That is how we learn.
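The “learning from the gap between expectation and reality” described above resembles a simple prediction-error update, a standard way of modeling dopamine-driven learning. The sketch below is my own illustration, not from Eagleman’s book; the function name, variable names, and learning rate are assumptions made for the example:

```python
# Illustrative sketch (not from the book): expectations are nudged toward
# reality by a fraction of the "surprise" (prediction error).

def update_expectation(expected, actual, learning_rate=0.1):
    """Move the expectation toward reality by a fraction of the error."""
    prediction_error = actual - expected   # surprise: reality minus expectation
    return expected + learning_rate * prediction_error

# A reward that keeps exceeding expectations pulls the expectation upward:
expectation = 0.0
for _ in range(50):
    expectation = update_expectation(expectation, actual=1.0)
# after 50 repetitions, expectation has converged close to 1.0
```

Repeated confirmations shrink the error and the expectation stabilizes; a frustrated expectation produces a negative error and pulls it back down, which loosely mirrors the adjustment process described above.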

Another point made in Eagleman’s book is that our brains are primed for social interaction. We are wired to see social intention even where it does not exist. This is exemplified by an experiment in which a short film of geometrical objects moving around a screen induces subjects to interpret the movements as telling a story, with the objects moving intentionally as if they were humans or animals. Further, in the same way that we tend to humanize objects, we also sometimes dehumanize other people, presumably when seeing them as humans creates a burden we consider too heavy to bear (experiments show, for example, that this often happens when we are faced with the homeless).

The first point means that we typically will not make decisions based solely on the data or evidence presented to us. No matter how much we may want to make data- and evidence-based decisions, when weighing options we will likely bring to bear, consciously or unconsciously, our lifetime of experiences, transmitted to our brain through chemical stimuli.

The second point means that, in interpreting events, occurrences, and phenomena of all kinds, from social phenomena to purely physical ones, we tend to attribute intentionality to them: we tend to attribute human characteristics to phenomena that may not have them, or that may not be reducible to them. Eagleman sees in this tendency evidence of the importance of human interactions for our brains and for who we are. But it can also be seen as a potential factor in our tendency to see organizations, firms, and governments as behaving like individual decision-making units rather than as collections of people. Perhaps this attribution of human intentionality to things other than a person could help explain conspiracy theories, in which large networks are assumed to work in unison toward a common goal; and perhaps it could help explain situations where we see cause and effect where there is none, simply because we attributed agency to entities that do not have it.

Both points made by Eagleman, the role of emotions in decision-making and our tendency to attribute human intentionality to entities other than a person, should make us question the extent to which our minds are pre-conditioned to make decisions largely based on factors beyond the data and evidence put in front of us on any given decision-making occasion. They should make us think of ways to build into decision-making processes awareness of how our brain works and the possible implications for the decisions we end up making.

References

Eagleman, David. 2017. The Brain: The Story of You. Vintage Books.

A Thought After Reading Daniel Kahneman’s “Thinking, Fast and Slow”

I can tell that Daniel Kahneman’s book, Thinking, Fast and Slow, is one of those books that I will find myself coming back to over and over again. Why? On the one hand, it provides evidence from experimental psychology for phenomena that I was already inclined to believe; that is, it appeals to my own confirmation bias. At the same time, it offers a number of insights and explanations that are new to me and eye-opening. Kahneman, a psychologist, won a Nobel Prize in Economics for his work on decision-making under uncertainty. His book reflects a lifetime of learning about human judgement and is an absolute delight to read.

Its central framework revolves around the idea that our minds address problems in two distinct ways, which he describes using the metaphors of System 1 and System 2. The table below summarizes some of the characteristics of each system.

System 1                                                                | System 2
------------------------------------------------------------------------|---------------------------------------------
Fast                                                                    | Slow
Intuitive                                                               | Deliberate
Automatic; cannot be turned off at will                                 | Effortful and, therefore, lazy
Impulsive                                                               | Requires self-control
Associative and causal, but lacks the capacity for statistical thinking | Requires training for statistical thinking
Prone to bias and to believing                                          | In charge of doubting and unbelieving

Much of Kahneman’s book is then focused on System 1 and its characteristics and biases. We learn of its need for coherence and the central role of cognitive ease in driving belief. System 2 is much less explored in the book and I was left with the desire to learn more about what can be done to strengthen our use of System 2.

Why would we want to strengthen our reliance on System 2? It is not obvious that doing so would necessarily lead to better social outcomes if we are willing to rely on expertise. System 1’s intuitive thought draws both on heuristics (shortcuts that replace a more complex problem with a simpler one) and on expertise (recognition of information previously acquired). Becoming an expert means you can draw on your acquired information to reach better conclusions with less effort. In other words, the more we rely on experts, the more likely we are to avoid incorrect conclusions in a world governed by System 1 thinkers.

However, we are unlikely (and it is probably impossible) to seek expertise in the myriad problems and decisions we encounter on a daily basis. And as Kahneman points out in a reference to the work of another psychologist, Daniel Gilbert, System 1 is prone to believing (Kahneman, 2011, Chapter 7); “unbelieving” is the effortful task of System 2. If we want to form opinions and draw conclusions about much of the information we encounter on any given day, we will need to spend energy, potentially a lot of it: we will need to be willing to lead an effortful, even if potentially fulfilling, life. Those less susceptible to System 1 biases and more prone to calling on System 2 Kahneman calls “engaged.”

Should we strive to be more “engaged?” If so, how?

References

Kahneman, Daniel, 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.

Repetition and Belief

“You can fool all the people some of the time and some of the people all the time, but you cannot fool all the people all the time.” Respond quickly: who is the author of this statement? The most common attribution is to Abraham Lincoln although, as often happens with quotes, there is little evidence to support it. There is some evidence that the statement was actually made by the seventeenth-century French Protestant Jacques Abbadie (Quote Investigator, 2013; Parker, 2016). However, we have heard the attribution to Abraham Lincoln so often that we assume it to be true.

Over the end-of-year holidays I read (most of) Daniel Kahneman’s book, “Thinking, Fast and Slow.” Kahneman, a psychologist, won the Nobel Prize in Economics in 2002 (shared with Vernon Smith) for the contributions to economics of his research on human judgement and decision-making under uncertainty. One aspect of human cognition he describes in the book is that we are more likely to believe what we find easy to process. Various factors can contribute to this “cognitive ease,” including a clear display of the information we are exposed to, having been “primed” by association with a prior piece of information, being in a good mood, and repeated exposure to the information, whether that information is true or not (Kahneman, 2011, Chapter 5).

Repetition is also often discussed as an important component of learning (see, for example, the literature on spaced repetition), change management, and other aspects of our daily life that depend on our understanding and perception of reality. I once read (and never forgot) a passage in Machiavelli’s “The Prince” where he states that injuries should be inflicted all at once, while benefits should be provided piecemeal, over time, if a ruler is to remain in power. I always understood this as reflecting how repetition affects perception (Machiavelli, 1998, Chapter VIII).
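The spaced-repetition idea mentioned above can be made concrete with a toy scheduler: each successful recall pushes the next review further out, so repetitions are concentrated just before material would otherwise fade. The sketch below is my own simplification for illustration, not the SM-2 or any published algorithm; the function name and growth factor are assumptions:

```python
# Illustrative sketch (my own simplification): review intervals grow
# geometrically after each successful recall and reset after a lapse.

def next_interval(previous_interval_days, recalled, growth=2.5):
    """Grow the review interval after a successful recall; reset after a lapse."""
    if not recalled:
        return 1                      # forgotten: start over tomorrow
    return round(previous_interval_days * growth)

interval = 1
schedule = []
for _ in range(5):                    # five successful reviews in a row
    interval = next_interval(interval, recalled=True)
    schedule.append(interval)
# the gaps between reviews grow roughly geometrically: [2, 5, 12, 30, 75]
```

The point of the growing gaps is the one made in the text: a handful of well-timed repetitions is enough to keep information feeling familiar, and that familiarity is exactly what cognitive ease trades on.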

Recent political discussions in the U.S. have referred to the idea of the “Big Lie”: the idea that less plausible lies are often easier to sell to the public than small ones… if sufficiently repeated (RationalWiki contributors, 2021). A paper by Fazio et al. (2015) argues, based on a couple of experiments, that repetition of false information affects belief even when those exposed know better, an effect they call “knowledge neglect,” which reflects a primacy of processing fluency (cognitive ease) over retrieval of knowledge under certain conditions, including repetition.

What do I take from the above? The more confident I am in newly acquired knowledge, the more I will repeat it and remind myself of it, to better internalize it and harness the power of repetition. The less confident I am about new knowledge, the more suspicious I will be when I see it repeated. I guess that is a New Year’s resolution and, yes, those work: I have heard it many times.

References

Fazio, L. K., N. M. Brashier, B. K. Payne, and E. J. Marsh. 2015. “Knowledge Does Not Protect Against Illusory Truth.” Journal of Experimental Psychology: General 144(5): 993-1002. Available: https://www.apa.org/pubs/journals/features/xge-0000098.pdf. Accessed: January 18, 2021.

Kahneman, Daniel, 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.

Machiavelli, Niccolò, 1998. The Prince. The Project Gutenberg eBook. Translated by W. K. Marriott. Available: https://www.gutenberg.org/files/1232/1232-h/1232-h.htm#chap08. Accessed: January 09, 2021.

Parker, David B., 2016. “You Can Fool All the People”: Did Lincoln Say It? History News Network. Available: https://historynewsnetwork.org/article/161924. Accessed: January 09, 2021.

Quote Investigator, 2013. You cannot fool all the people all the time. Available: https://quoteinvestigator.com/2013/12/11/cannot-fool/#return-note-7793-2. Accessed: January 09, 2021.

RationalWiki contributors, 2021. “Big Lie,” In RationalWiki. Available: https://rationalwiki.org/wiki/Big_lie. Accessed: January 09, 2021.
