Bias: A Tale of Two Systems. Thinking Fast and Slow

Shortly before the pandemic hit, an organization asked me to conduct bias training for their staff. I suspect they had something else in mind when they asked, but I ended up putting together a training that touched on several different types of bias that affect our everyday thinking and lives. Over a series of posts, I am going to cover some common cognitive biases from that training.

I used the framework of thinking fast and slow to help illustrate biases we all fall victim to each and every day. 

The idea that our brains have a two-system way of thinking was popularized in Thinking, Fast and Slow, Daniel Kahneman's book on behavioral psychology and decision-making. The book contains some profoundly important concepts about how people make decisions. Conceptualizing our brains as using two different systems of thinking can help us understand why we humans sometimes make errors in judgment, and how to look for signs that we may be about to make such an error.

Kahneman describes System 1 and System 2 thinking in his book. System 1 is the intuitive, “gut reaction” way of thinking and making decisions. It forms “first impressions” and is often the reason we jump to conclusions. This is the Thinking Fast system.

System 2 is the analytical, “critical thinking” way of making decisions.  System 2 does reflection, problem-solving, and analysis.  This is the Thinking Slow system.

Do you think of yourself as more of a System 1 or System 2 thinker?  In which system do you think you spend more of your time?  Do you tend to be more deliberate, careful, and slow (System 2) in your thinking or are you a quick, intuitive, gut feeling kind of thinker (System 1)?

Most people, when asked this question, identify with System 2 thinking. We consider ourselves rational, analytical human beings, so we assume we spend most of our time engaged in System 2 thinking. In fact, according to Kahneman, we spend almost all of our daily lives engaged in System 1 (Thinking Fast). Only when we encounter something unexpected, or when we make a conscious effort, do we engage System 2 (Thinking Slow).

Both systems of thinking have value, but they are designed for different purposes. To survive physically or psychologically, we sometimes need to react automatically to a speeding taxi as we step off the curb, or to the subtle facial cues of an angry boss. That automatic mode of thinking, not under voluntary control, contrasts with the need to slow down and deliberately fiddle with pencil and paper when working through an algebra problem. These two systems that the brain uses to process information are both important, but they can sometimes be hijacked.

According to Kahneman, Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically, and System 2 is normally in a comfortable low-effort mode in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine, usually.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment.  System 2 is mobilized when a question arises for which System 1 does not offer an answer.  System 2 is activated when an event is detected that violates the model of the world that System 1 maintains.  So, System 1 is continuously creating impressions, intuitions, and judgments based on everything we are sensing.  In most cases, we just go with the impression or intuition that System 1 generates.  System 2 only gets involved when we encounter something unexpected that System 1 can’t automatically process.

Sometimes the interplay between these two systems can introduce opportunities for error. System 1 thinking seeks a coherent story above all else, and it often leads us to jump to conclusions. While System 1 is generally very accurate, there are situations where it is prone to systematic errors and biases. System 1 sometimes answers an easier question than the one it was asked, and it has little knowledge of logic and statistics.

System 1 can be a source of errors and bias simply because of how it operates. One of the biggest problems with System 1 is that it seeks to quickly create a coherent, plausible story, an explanation for what is happening, by relying on associations and memories, pattern-matching, and assumptions. And System 1 will default to that plausible, convenient story even if the story is based on incorrect information.

Kahneman said this about the error-proneness of System 1 thinking: “The measure of success for System 1 is the coherence of the story it manages to create.  The amount and quality of the data on which the story is based is largely irrelevant.  When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.”

Kahneman calls this tendency WYSIATI: What You See Is All There Is. WYSIATI causes us to “focus on existing evidence and ignore absent evidence.” As a result, System 1 often quickly creates a coherent and believable story based on limited evidence. These impressions and intuitions can then be endorsed by System 2 and harden into deep-rooted values and beliefs. WYSIATI can cause System 1 to “infer and invent causes and intentions,” whether or not those causes or intentions are real.

System 1 is highly adept at one form of thinking: it automatically and effortlessly identifies causal connections between events, sometimes even when the connection is spurious. This is why people jump to conclusions, assume bad intentions, give in to prejudices or biases, and buy into conspiracy theories. They focus on the limited evidence available and do not consider absent evidence. They invent a coherent story, causal relationships, or underlying intentions. Their System 1 quickly forms a judgment or impression, which in turn gets endorsed by System 2.

As a result of WYSIATI and System 1 thinking, people may make wrong judgments and decisions due to cognitive errors. If we had to think through every possible scenario for every decision, we probably wouldn’t get much done in a day. To make decisions quickly and economically, our brains rely on a number of cognitive shortcuts known as heuristics. These mental rules of thumb allow us to make judgments quickly and often quite accurately, but they can also lead to fuzzy thinking and poor decisions.

There are several potential errors in judgment that people make when they over-rely on System 1 thinking. Perhaps one of my favorites (and one you see quite commonly) is the spurious or illusory correlation. If you have ever taken a statistics course, you may remember the phrase “correlation is not causation.”

An illusory or spurious correlation is the phenomenon of perceiving a relationship between variables (typically people, events, or behaviors) even when no such relationship exists.  A false association may be formed because rare or novel occurrences are more salient and therefore tend to capture one’s attention.  This phenomenon is one way stereotypes form and endure.

There are some classic examples of this WYSIATI thinking on a site called Spurious Correlations (https://www.tylervigen.com/spurious-correlations). For example, I bet you didn’t know the following:

* As the per capita consumption of margarine in Maine increases, the divorce rate in Maine also increases.

* As the rate of marriage increases in Kentucky, the number of people who drown from falling out of fishing boats in Kentucky also increases. (Insert your own joke here.)

* The number of films Nicolas Cage appears in during a year tracks the number of airport security screeners hired in North Dakota. The more films Nicolas Cage appears in, the more airport screeners there are in North Dakota.

These are silly examples that we would hope nobody actually believes, but they illustrate the point (and potential danger) of these built-in biases in our thinking systems. The Spurious Correlations site even has an AI-generated research paper on the margarine and divorce rate correlation. You can view it here: https://www.tylervigen.com/spurious/research-papers/5920_spreading-love-and-margarine-an-examination-of-the-butter-splitter-correlation-in-maine.pdf
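To make the point concrete, here is a minimal Python sketch (my own illustration, not from Kahneman or the Spurious Correlations site) of how two completely independent series can produce a near-perfect correlation just because both happen to trend over time. The variable names and numbers are made up to echo the margarine-and-divorce example.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
years = np.arange(2000, 2010)

# Two independent, made-up series: each trends downward over the
# decade with its own random noise; neither causes the other.
margarine = 8.0 - 0.4 * (years - 2000) + rng.normal(0, 0.2, years.size)
divorces = 5.0 - 0.2 * (years - 2000) + rng.normal(0, 0.1, years.size)

# The raw Pearson correlation comes out very high, because the two
# series share a time trend, not because one drives the other.
r = np.corrcoef(margarine, divorces)[0, 1]
print(f"raw correlation: {r:.2f}")

# Correlating the year-over-year changes strips out the shared trend;
# what remains is just noise, so the value is typically near zero.
r_diff = np.corrcoef(np.diff(margarine), np.diff(divorces))[0, 1]
print(f"correlation of year-over-year changes: {r_diff:.2f}")
```

The first number comes out close to 1 even though the two series were generated independently; a shared trend is all it takes. Checking whether the year-over-year changes still move together is one simple way to tell a real relationship from two trends that merely happen to coincide.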

I will touch more on Illusory Correlations and how to safeguard yourself from them (as well as other thinking biases) in future posts. 
