What is Hindsight Bias: A Comprehensive Analysis

What is Hindsight Bias?

Hindsight bias is a psychological phenomenon that occurs when people believe they knew the outcome of an event before it happened. It is often referred to as the “knew-it-all-along” phenomenon, and is a common cognitive bias that affects our perception of past events. The occurrence of hindsight bias can impact various aspects of our lives, including decision-making, and cause us to overestimate our own predictive abilities.

In psychology, hindsight bias is considered a type of confirmation bias, which influences our memory of past events and leads to distorted thinking. It is crucial to understand the concept of hindsight bias to recognize its effects on our judgments and to improve our decision-making processes.

Key Takeaways

  • Hindsight bias refers to the tendency to believe that one could have predicted an event’s outcome after it has occurred
  • It is a common cognitive bias in psychology and can have significant implications on decision-making processes
  • To counteract hindsight bias, it is essential to acknowledge its presence and strive for more objective evaluations of past events

Understanding Hindsight Bias

Hindsight bias is a cognitive phenomenon that causes people to believe they could have accurately predicted an event’s outcome after it has already occurred. This type of bias falls under the category of cognitive biases, which are systematic errors in thinking and perception that affect decision-making.

The hindsight bias can make events seem more predictable than they truly were, leading to overconfidence in one’s ability to foresee outcomes. As a result, individuals may assume they knew the outcome of an event before it actually happened. This phenomenon is also known as the “knew it all along” effect.

An important aspect of hindsight bias is its impact on how people interpret and learn from past events. When an outcome appears predictable in hindsight, individuals may underestimate the complexity of the situation or overlook critical information that was not apparent at the time. This can lead to overgeneralization of lessons learned and a false sense of confidence in future predictions.

Several factors can influence the magnitude of hindsight bias. One such factor is the availability of information – when people have more access to information about an event after it has occurred, hindsight bias is more likely to occur. Another contributing factor is the desire for consistency, as individuals may find it uncomfortable to acknowledge their initial predictions were incorrect.

To mitigate hindsight bias, it is essential to recognize its presence and actively consider alternative explanations and scenarios. Documenting predictions and their justifications before an event can provide a more accurate basis for evaluation when reviewing the outcome. Furthermore, considering multiple perspectives and seeking feedback from others can help counteract biases and improve decision-making.

Hindsight bias is an important concept to understand, as it can affect how individuals evaluate their own and others’ decision-making abilities. Recognizing this cognitive bias and taking steps to reduce its impact can lead to more accurate understanding of events and better-informed decisions in the future.

Hindsight Bias Examples

Hindsight bias is a cognitive bias that leads people to believe, after an event has happened, that they knew or predicted the outcome ahead of time. This can happen in various aspects of life, such as sports, politics, and decision-making.

For instance, in the world of sports, a common example is when a fan claims, after a game, that they knew their favorite team would lose the match even though they did not feel that way before the game. This type of thinking can skew their analysis of the game and their evaluation of the team’s future performance.

Another area where hindsight bias appears is in politics and elections. After an election result is declared, voters may claim they knew a particular candidate would win or lose based on their perception of the political climate before the election. This can lead to a false sense of understanding about the factors that actually influenced the election outcome.

In investment and finance, hindsight bias can lead individuals to overestimate their ability to predict market trends and make successful investment decisions. After a stock does well, an investor might believe they knew it would perform strongly, while they may not have had such confidence at the time of investment. This feeling of knowing the outcome ahead of time can cause overconfidence in their decision-making process, potentially leading to poor future investment choices.

Within the context of accidents and tragedies, hindsight bias can also contribute to assigning blame inappropriately. For example, after a car accident, people may claim that they predicted the crash was inevitable due to the driving conditions or the behavior of the driver. However, in reality, they may not have perceived any risk before the accident occurred.

In summary, hindsight bias can manifest in a variety of situations, often prompting individuals to assume they had prior knowledge or insight into an event’s outcome. This cognitive distortion can impact decision-making, assessments, and judgments in different areas of life, altering people’s perceptions of reality and influencing their future decisions.

The Psychology behind Hindsight Bias

The Confirmation Bias Connection

Hindsight bias is a psychological phenomenon that is closely related to confirmation bias. Confirmation bias is the tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs. In the context of hindsight bias, this means that people often remember their predictions about an event’s outcome as being more accurate than they actually were. This occurs because individuals tend to selectively recall information that supports their beliefs and dismiss information that contradicts them.

Overconfidence and Decision-Making

A major consequence of hindsight bias is overconfidence in one’s ability to predict future events and make accurate decisions. This overconfidence can lead to poor decision-making, as people may underestimate the risks associated with certain choices, assuming that they have a better understanding of the situation than they actually do. Because people overestimate how well they foresaw past outcomes, they may also overestimate their ability to predict future outcomes and manage complex situations.

Memory Distortion and Perception

Hindsight bias also involves memory distortion, which occurs when people unconsciously alter their memories of past events to fit their current knowledge and beliefs. This is a natural coping mechanism that helps individuals make sense of their experiences and maintain a consistent self-concept. However, this distortion can lead to biased perceptions of reality and ultimately impact the way people understand past events and make decisions.

In summary, hindsight bias is a psychological phenomenon that can influence beliefs, decision-making, and perceptions of past events. It stems from confirmation bias and is closely tied to overconfidence, memory distortion, and perception. Understanding this bias is essential for recognizing its impact on thought processes and navigating complex situations effectively.

Hindsight Bias across Different Ages

Hindsight bias is the tendency to perceive past events as more predictable than they actually were, leading people to believe that their judgment is better than it is. This bias can result in unnecessary risks or overly harsh judgments of others. It is colloquially known as the “I knew it all along” phenomenon and is a type of confirmation bias.

Research has shown that hindsight bias can be present in individuals of various ages. A study involving participants aged 3 to 95 years old found that hindsight bias can be observed across the entire age range. However, certain factors, such as inhibitory control, may contribute to age-related differences in hindsight bias.

For example, a life-span study focusing on cognitive processes underlying hindsight bias identified that age-related differences may be based on how well individuals can inhibit their hindsight bias. The study, which involved 9-year-olds, 12-year-olds, young adults, and older adults, found that inhibitory control plays a role in the ability to counteract hindsight bias.

In a college setting, hindsight bias can have implications on students’ decision-making and learning processes. College students, who are often in their late teens or early twenties, might be particularly vulnerable to hindsight bias due to their developing inhibition skills. This can lead to overconfidence in their judgments and a decreased ability to learn from their mistakes.

In conclusion, hindsight bias can be present across various age groups, from young children to older adults. Factors such as inhibitory control can help explain the differences in hindsight bias across different ages. College students, in particular, should be aware of the potential pitfalls of hindsight bias in their decision-making and learning processes.

Hindsight Bias in Electoral Procedures

Hindsight bias is a cognitive phenomenon in which people overestimate their ability to have predicted an outcome once they know it. In the context of electoral procedures, this bias can affect our perceptions of election results, political decisions, and even the appointment of Supreme Court nominees.

During election periods, voters often express strong opinions about who will win a race, such as the U.S. Senate elections. When the results are announced, some individuals may claim that they “knew it all along,” even though, in reality, the outcome was not as predictable as they thought. This occurrence is an example of hindsight bias playing a role in electoral procedures.

Moreover, hindsight bias can also influence our evaluations of political decisions. For instance, if a policy or decision were to have negative consequences, people might be quick to criticize those involved and insist they should have seen it coming. However, this appraisal may be a result of hindsight bias and not a fair assessment of the actual predictability of the event.

Finally, the appointment of Supreme Court nominees may also be subject to the effects of hindsight bias. When a nominee’s decisions and opinions are scrutinized after their confirmation, individuals may claim that they anticipated certain rulings or legal positions. But, again, hindsight bias may be affecting their perception of the actual predictability of such outcomes.

In summary, hindsight bias can influence our understanding and evaluation of various aspects of electoral procedures, including election results, political decisions, and Supreme Court appointments. Being aware of this bias can help in approaching these matters with a more balanced perspective and avoiding the pitfalls of overconfidence in hindsight.

Financial Decision Making and Hindsight Bias

Hindsight bias is a type of cognitive bias that causes people to convince themselves that a past event was predictable or inevitable after it has already occurred. In the context of financial decision making, hindsight bias can lead individuals to overestimate their ability to predict investment outcomes and manage risks effectively.

When investing, there is often uncertainty and numerous factors that can influence the success or failure of a financial decision. Hindsight bias can cause investors to believe that they should have foreseen certain market movements or risks, leading them to feel overconfident in their abilities and potentially making riskier investments based on this false perception of their predictive skills.

One effective way to manage hindsight bias in financial decision making is to maintain a decision journal, which serves as a tool for documenting the decision-making process during investment decisions. By recording investment rationales, expected outcomes, and risks, a decision journal can provide important context and perspective for future evaluations of investment decisions.

Utilizing a decision journal can also help investors reflect on their decision-making processes and identify areas for improvement. Regularly reviewing the journal can lead to more accurate assessments of one’s skills, helping to foster a healthy balance of confidence and humility in financial decision making.
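
As a rough illustration, a decision journal can be as simple as a structured record. The sketch below is a hypothetical Python structure (the field names are illustrative, not a prescribed format); the key idea is that the rationale and expected outcome are committed to writing before the result is known, so later reviews compare against the record rather than against memory.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionEntry:
    """One investment decision, recorded before the outcome is known."""
    decided_on: date
    decision: str                      # what was done
    rationale: str                     # why it seemed right at the time
    expected_outcome: str              # the prediction being committed to
    known_risks: list[str] = field(default_factory=list)
    actual_outcome: str | None = None  # filled in later, at review time

journal: list[DecisionEntry] = [DecisionEntry(
    decided_on=date(2024, 3, 1),
    decision="Bought an index fund position",
    rationale="Long-term diversification; no view on short-term moves",
    expected_outcome="Roughly market-average returns over five-plus years",
    known_risks=["broad market downturn", "inflation above expectations"],
)]

# At review time, compare the recorded prediction with what happened,
# not with what memory now claims was predicted.
journal[0].actual_outcome = "Position down 4% after six months"
for entry in journal:
    print(entry.expected_outcome, "->", entry.actual_outcome)
```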

In summary, hindsight bias can distort the perceptions of investors when evaluating their financial decisions, causing them to overestimate their predictive abilities and potentially take on greater risks. To overcome this cognitive bias, using techniques such as maintaining a decision journal and engaging in regular self-reflection can be valuable tools to foster more accurate assessments and promote informed financial decision making.

The Availability Heuristic and Representativeness Heuristic

Heuristics are mental shortcuts that help individuals make quick and efficient decisions. Two common types of heuristics are the availability heuristic and the representativeness heuristic. Both play a role in shaping an individual’s perception of the likelihood of events, even though they operate differently.

The availability heuristic influences decision-making by allowing individuals to estimate probability based on how easily examples or instances come to mind. In other words, the more readily an event can be recalled, the more likely it seems to occur. However, this heuristic can lead to biases as memorable or recent events might not be representative of the overall situation.

For example, after hearing about a series of plane crashes on the news, one might overestimate the likelihood of being involved in a plane crash. This overestimation occurs because the vivid and recent memories of plane crashes are more readily available, ignoring the fact that statistically, air travel is safer than other modes of transportation.

On the other hand, the representativeness heuristic involves making decisions based on how similar an event or object is to an existing mental prototype. People tend to believe that events that closely resemble their mental prototype have a higher probability of occurring, which can sometimes lead to biases and incorrect judgments.

An example of the representativeness heuristic in action would be assuming that a person who is quiet and introverted is more likely to be a librarian than a salesperson. This assumption is based on the stereotype that librarians are typically introverted, while salespeople are usually more outgoing. In reality, quiet and introverted individuals can be found in both professions, and making a decision based on this stereotype could lead to inaccurate predictions.
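
The librarian example can be made concrete with a quick application of Bayes’ rule. The numbers below are purely illustrative assumptions, not survey data; the point they demonstrate is that base rates can outweigh the stereotype: even if introversion is far more common among librarians, the much larger number of salespeople can make “salesperson” the better guess for any given introvert.

```python
# All figures are made-up, illustrative assumptions.
n_librarians = 300_000                 # assumed number of librarians
n_salespeople = 10_000_000             # assumed number of salespeople
p_introvert_given_librarian = 0.70     # assumed
p_introvert_given_salesperson = 0.20   # assumed

introverted_librarians = n_librarians * p_introvert_given_librarian      # 210,000
introverted_salespeople = n_salespeople * p_introvert_given_salesperson  # 2,000,000

# P(librarian | introvert), restricted to these two professions
p_librarian = introverted_librarians / (introverted_librarians + introverted_salespeople)
print(f"P(librarian | introvert) = {p_librarian:.1%}")  # about 9.5%
```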

In conclusion, both the availability heuristic and the representativeness heuristic are important factors in shaping individuals’ decisions and perceptions of probability. Becoming aware of these cognitive shortcuts and their potential biases can help people make more informed decisions and avoid misjudgments.

Studies on Hindsight Bias

Various studies and experiments have been conducted to understand the effects and implications of hindsight bias in different scenarios. One of the most well-known studies is the Fischhoff experiment, where participants were asked to predict outcomes of events, and later, after knowing the actual outcomes, were asked to recall their earlier predictions. The results demonstrated that people tend to overestimate their ability to predict outcomes, indicating the presence of hindsight bias.

In the field of psychology, hindsight bias has been researched to study its effects on decision-making and behavior. Researchers have found that hindsight bias can lead to overconfidence in one’s ability to predict and an underestimation of the unpredictable nature of events. A study in SAGE Journals by Neal J. Roese and Kathleen D. Vohs sheds light on the cognitive mechanisms underlying hindsight bias and its consequences in various domains, such as medicine, law, and sports.

Moreover, a recent study published in Current Psychology explored the role of perspective-taking in hindsight bias, focusing on how perspective-taking changes the bias when the target of prediction is oneself or a peer. The researchers found that perspective-taking reduced hindsight bias to a certain extent, highlighting the importance of considering others’ perspectives in mitigating it.

Additionally, hindsight bias is also seen to impact financial decision-making, where investors may believe they could have predicted the market trends retrospectively. Such biases affect their ability to make accurate financial decisions, as they develop a false sense of confidence in their predictive capabilities.

These studies collectively offer valuable insight into how hindsight bias can alter our perception of past events and potentially influence our future decision-making. By understanding this cognitive bias, researchers and practitioners can develop strategies to minimize its effects and improve decision-making processes.

Counteracting Hindsight Bias

The Role of Metacognitive Awareness

One way to counteract hindsight bias is by enhancing metacognitive awareness. Metacognition refers to a person’s ability to reflect on their own thought processes, evaluating the accuracy of their judgments and the reliability of information sources. People who are more metacognitively aware are less likely to assume that they “knew it all along” when presented with new information. To improve metacognitive awareness, individuals can practice:

  • Actively questioning their beliefs and predictions
  • Seeking diverse perspectives and opinions
  • Engaging in critical thinking and self-reflection

Adaptive Learning and Uncertainty

Embracing adaptive learning and uncertainty can be another effective strategy for counteracting hindsight bias. Adaptive learning involves updating one’s beliefs and knowledge based on new information and experiences. When individuals are open to learning from their changing environments and accept that future events are often uncertain, they are less likely to retrospectively overestimate their ability to predict outcomes. To cultivate adaptive learning:

  • Recognize that new information can alter the reliability of past judgments
  • Accept that uncertainty is an inherent aspect of many situations
  • Emphasize continuous learning and growth over maintaining the illusion of control

Use of Anchoring Technique

An additional method for reducing hindsight bias is the anchoring technique. Anchoring involves setting a reference point or starting point for an estimate or prediction, then adjusting that initial value based on subsequent information. By establishing an anchor, individuals can more easily track how their opinions or predictions change over time, providing a concrete basis for comparison. To effectively implement the anchoring technique (a brief sketch follows the list below):

  • Establish a clear starting point for estimations or predictions
  • Document any changes or adjustments made in response to new information
  • Regularly review the anchor to maintain a more objective perspective on the development of judgments
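
As a rough sketch, such an anchor log could look like the following; the structure and field names are hypothetical illustrations, not a prescribed tool.

```python
from datetime import date

# Each revision records the new estimate and the information that prompted it,
# so the path from the anchor to the final judgment stays visible.
anchor_log = {
    "question": "Projected Q4 revenue ($M)",
    "anchor": {"date": date(2024, 1, 15), "estimate": 12.0,
               "basis": "last year's Q4 plus expected growth"},
    "revisions": [],
}

def revise(log, when, new_estimate, reason):
    log["revisions"].append(
        {"date": when, "estimate": new_estimate, "reason": reason})

revise(anchor_log, date(2024, 4, 2), 11.2, "Q1 came in below plan")
revise(anchor_log, date(2024, 7, 9), 11.8, "new contract signed in June")

# Review: how far, and why, did the judgment move from its anchor?
start = anchor_log["anchor"]["estimate"]
final = anchor_log["revisions"][-1]["estimate"]
print(f"anchor {start} -> final {final} ({final - start:+.1f})")
```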

In conclusion, hindsight bias can affect the way people perceive events and make decisions. By developing metacognitive awareness, embracing adaptive learning, and using anchoring techniques, individuals can work towards counteracting this cognitive bias and improving their decision-making processes.

Hindsight Bias and Emotional Consequences

Guilt and PTSD after Trauma

Hindsight bias can have significant emotional consequences for individuals who have experienced traumatic events, such as natural disasters, accidents, or sexual assault. This cognitive distortion may lead survivors to believe that they could and should have predicted or prevented the event from occurring. As a result, individuals may experience feelings of guilt and self-blame, which can contribute to the development of post-traumatic stress disorder (PTSD).

For example, some survivors may focus on specific decisions they made before the traumatic event, such as their choice of location or behavior, and perceive them as foolish in hindsight. This can lead to rumination and negative thought patterns as they blame themselves for something they could not have truly predicted. It is essential for trauma survivors to recognize hindsight bias as a common cognitive distortion and practice compassion towards themselves, letting go of self-blame and embracing self-care.

Depression and Regret in Relationships

In the context of personal relationships, hindsight bias can also play a detrimental role by intensifying feelings of regret and sadness. When individuals analyze past disputes, breakups, or failed relationships, hindsight bias may lead them to view certain decisions or actions as obvious mistakes, even if they were not predictable or preventable at the time.

This distortion can create a sense of helplessness and despair, potentially contributing to the onset or worsening of depression. Moreover, hindsight bias can impede personal growth and acceptance, as individuals may become fixated on correcting perceived past mistakes instead of focusing on learning from their experiences and moving forward.

To mitigate the emotional consequences of hindsight bias in relationships, cultivating self-awareness and self-compassion is crucial. Acknowledging that past events were not as predictable as they may appear in retrospect can help individuals forgive themselves and others, fostering healthier and more resilient emotional responses to life’s challenges.

Hindsight Bias in Behavioral Economics

Hindsight bias, often referred to as the “knew-it-all-along” effect, is a cognitive bias that causes individuals to see past events as more predictable than they were at that time. This bias can influence decision-making processes, leading to overconfidence in one’s ability to predict outcomes.

In the context of behavioral economics, hindsight bias has significant implications for understanding risk-taking and decision-making. When individuals perceive past events as foreseeable, they may also feel more confident about predicting the outcomes of future events. This increased confidence can lead individuals to make riskier decisions because they incorrectly believe they have a better understanding of the potential consequences.

Moreover, hindsight bias affects how people evaluate the decision-making abilities of others. For example, if someone makes a bad investment, others might judge them as incompetent or foolish because they believe the poor outcome was foreseeable. In reality, predicting the stock market or other complex systems is challenging and often impossible. Overconfidence resulting from hindsight bias can lead investors to make errors in their judgment and assessments of perceived risks.

Due to the pervasiveness of hindsight bias, it is essential to recognize and mitigate its effects. In behavioral economics, this bias may contribute to irrational behaviors or suboptimal decisions. Identifying this cognitive error and implementing strategies such as considering alternative outcomes and challenging one’s assumptions can lead to more accurate assessments and better decision-making.

In summary, hindsight bias is a common cognitive phenomenon that significantly impacts decision-making processes in behavioral economics. By recognizing this bias and employing strategies to counteract its effects, individuals can make better-informed decisions and mitigate the impact of risky behaviors.

Hindsight Bias in Legal Outcomes

Hindsight bias, also known as the “I knew it all along” phenomenon, is a cognitive bias that causes people to overestimate their ability to have foreseen an event’s outcome after learning about it. This bias can significantly impact legal outcomes, particularly in cases involving victims, foreseeability, malpractice lawsuits, and accidents.

In legal proceedings, the hindsight bias may affect the way jurors and judges perceive the actions of those involved in an accident or a malpractice lawsuit. For example, when a victim is injured in an accident, hindsight bias might lead the jury to believe that the responsible party should have been able to foresee and prevent the accident. This can lead to unfair judgments and disproportionate penalties for the accused.

Similarly, in malpractice lawsuits, hindsight bias can have a significant influence on the evaluation of the defendant’s actions. For instance, a physician might have made a decision based on the information available at the time, but when presented with the negative outcome in court, jurors or judges may believe that the medical professional should have known better. This may result in undue blame and liability on the professional, despite them acting in good faith and with reasonable care.

To mitigate the impact of hindsight bias in legal contexts, attorneys and expert witnesses should try to provide objective evidence and reconstruct the original circumstances surrounding the event. By exposing jurors and judges to the conditions and limitations present at the time, they may perceive the actions of the parties involved more accurately, leading to fairer outcomes.

Additionally, the legal system must be aware of the potential pitfalls of hindsight bias when assessing foreseeability in questions of negligence. Just because an outcome seems obvious in hindsight does not mean it was foreseeable in the actual context, and this should be carefully considered by all parties involved in the legal process.

In conclusion, hindsight bias has the potential to significantly affect legal outcomes, leading to potentially unjust consequences for victims, defendants, and professionals involved in accidents and malpractice lawsuits. By being aware of and addressing this cognitive bias, the legal system can strive towards more equitable outcomes for all parties involved.

Cultural and Societal Implications of Hindsight Bias

Hindsight bias has significant cultural and societal implications, which can manifest in various domains, such as judging the inevitability of specific events, influencing the outcomes in movies, affecting criminal investigations, and altering our intuition about potential outcomes.

One of the primary implications of hindsight bias is the sense of inevitability, where people often perceive past events as more predictable than they were. This phenomenon can lead to an oversimplification of complex situations and negatively impact learning and decision-making, hindering our ability to effectively analyze similar situations in the future.

In the world of movies and entertainment, hindsight bias can influence how creators construct narratives and how audiences interpret them. For example, when crafting a suspense thriller, screenwriters and directors might create implausible scenarios, knowing that hindsight bias will cause viewers to find them more believable after the fact. This can create a fulfilling cinematic experience where the audience feels like they should have seen the twist or reveal coming but didn’t.

Hindsight bias can also affect criminal investigations and legal proceedings. Investigators may fall prey to the bias, assuming that because a crime seems predictable in hindsight, the perpetrator’s guilt is more apparent than it may be. This can result in wrongful convictions or oversights in investigations. Juries can be susceptible to hindsight bias too, leading them to believe that the defendant should have foreseen the consequences of their actions when, in reality, things were less clear at the time.

When examining potential outcomes, hindsight bias can lead individuals to overestimate their own intuition and predictive abilities. This tendency can be misleading as people might fail to explore alternative outcomes or underestimate the complexity of future events. Moreover, it may prompt false confidence, which can hinder effective decision-making and lead to unfavorable outcomes.

In conclusion, hindsight bias has notable implications for various aspects of our lives, from cultural narratives to societal decision-making. By understanding and addressing this bias, individuals and society can develop more accurate perceptions and make better-informed choices.

Frequently Asked Questions

In which fields is hindsight bias most prevalent?

Hindsight bias is most prevalent in fields where decision-making and analysis of past events play critical roles. This can include areas such as psychology, finance, sports, politics, and medicine. People in these fields often look back on events and perceive them as more predictable than they actually were, leading to overconfidence and misguided decisions in the future.

How does hindsight bias affect decision-making?

Hindsight bias affects decision-making by causing individuals to overestimate their ability to predict outcomes and make accurate decisions. This overconfidence can lead to poor choices, as people may place too much trust in their judgments without considering alternate possibilities or seeking input from others. Additionally, hindsight bias can create an illusion of control over outcomes, obscuring the role of chance and other factors in the decision-making process.

What are some common examples of hindsight bias?

Common examples of hindsight bias include the belief that the outcome of an event, such as an election or a sports game, was predictable after it has happened, even though it was uncertain beforehand. Another example is when people look back on their own decisions, believing they knew the best course of action all along, despite having had limited information at the time.

How can hindsight bias impact financial decision-making?

In the world of finance, hindsight bias can lead to several problems, such as overconfidence in one’s ability to predict market trends or the success of specific investments. This can result in riskier financial decisions and a failure to consider alternative investment strategies. Additionally, hindsight bias can cause investors to misremember their past performance, believing they were more successful in predicting market changes than they actually were.

What techniques can be used to minimize hindsight bias?

Some techniques to minimize hindsight bias include acknowledging and documenting one’s initial predictions and reasons for making certain decisions, as well as seeking input from others to challenge one’s own viewpoints. Developing critical thinking skills and being open to alternative explanations can also help reduce the impact of hindsight bias. Continual self-reflection and questioning one’s own beliefs and assumptions can help identify and decrease this type of cognitive bias in decision-making.

How is hindsight bias studied in psychology?

In psychological research, hindsight bias is commonly studied through experimental methods in which participants are asked to make predictions or judgments about a situation before learning the outcome, and then are asked to recall or evaluate their initial predictions after being provided with the actual outcome. This allows researchers to assess how accurately participants remember their original predictions and identify any instances of hindsight bias. Various studies have also examined factors that can influence the degree of bias, such as the complexity of the situation, emotional involvement, and the perceived importance of the decision.

Cognitive Bias: How We Are Wired to Misjudge

Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?

Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?

Or have you ever found yourself only reading news stories that further support your opinion?

These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.

What is Cognitive Bias?

Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.

Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts), social pressures, and emotions.

Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural — they are a product of human nature — and they don’t simply exist in a vacuum or in our minds — they affect the way we make decisions and act.

In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).

Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).

However, these biases can often be dangerous when they take the form of conscious stereotyping.

On the other hand, unconscious bias, or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and behaviors resulting from them (Lang, 2019).

Cognitive bias is often a result of your brain’s attempt to simplify information processing — we receive roughly 11 million bits of information per second, yet we can only process about 40 bits of information per second (Orzan et al., 2012).

Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.

Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.

Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.

Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.

Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.

Confirmation Bias

Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.

Real-World Examples

Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.

This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.

Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.

Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with – further pushing us down these echo chambers of political polarization.

Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.

Experiments

Research on the confirmation bias dates back to 1960, when Peter Wason challenged participants to identify a rule applying to triples of numbers.

People were first told that the sequence 2, 4, 6 fit the rule; they then had to generate triples of their own and were told whether each fit the rule. The rule was simple: any ascending sequence.

Not only did participants have an unusually difficult time discovering this rule, often devising overly complicated hypotheses instead, but they also generated only triples that confirmed their preexisting hypothesis (Wason, 1960).
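
The structure of the task is easy to restate in code. Below is a minimal sketch of Wason’s setup with the hidden rule written out; the instructive point is that confirming triples can never expose an overly narrow hypothesis, while a single disconfirming test can.

```python
def fits_rule(a, b, c):
    """Wason's hidden rule: any (strictly) ascending sequence."""
    return a < b < c

# Confirming tests: all consistent with a narrower "add 2 each time" guess,
# so they can never reveal that the guess is too specific.
for triple in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    print(triple, fits_rule(*triple))    # all True

# Disconfirming tests are what actually expose the broader rule.
print((1, 2, 3), fits_rule(1, 2, 3))     # True  -> "add 2 each time" is wrong
print((6, 4, 2), fits_rule(6, 4, 2))     # False -> ascending order is what matters
```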

Explanations

But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).

This motivational explanation is often coupled with a more cognitive theory.

The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).

Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.

As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).

Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).

Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.

Hindsight Bias

Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.

Hindsight Bias Example

When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.

And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.

Although research studies have demonstrated that the hindsight bias isn’t necessarily mitigated by pure recognition of the bias (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you can’t predict the future and motivate yourself to consider alternate explanations.

It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.

Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth-Marom (1975) were the first to directly investigate the hindsight bias in an empirical setting.

The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.

After Nixon returned to the States, participants were asked to recall the likelihood they had initially assigned to each outcome.

Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.

That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.

Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.
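
In analysis terms, the bias shows up as a gap between the probabilities assigned by participants who were told an outcome was true and those assigned by participants given no outcome information. Here is a minimal sketch with made-up numbers (illustrative values, not Fischhoff’s data):

```python
from statistics import mean

# Probability each participant assigned to outcome A of the story.
# All values are made-up, illustrative data.
p_control = [0.25, 0.30, 0.20, 0.25, 0.30]    # given no outcome information
p_told_true = [0.45, 0.50, 0.40, 0.55, 0.45]  # told that outcome A occurred

hindsight_shift = mean(p_told_true) - mean(p_control)
print(f"hindsight shift: {hindsight_shift:+.2f}")  # positive => bias present
```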

But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olsen asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.

Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.

From the cognitive perspective, hindsight bias may result from distortions of memories of what we knew or believed to know before an event occurred (Inman, 2016).

It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.

Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).

When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.

Self-Serving Bias

Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.

You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.

The distinction is that the self-serving bias is concerned with valence. That is, how good or bad an event or situation is. And it is also only concerned with events for which you are the actor.

In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.

On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.

From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).

In the workplace, people attribute internal factors when they are hired for a job but external factors when they are fired (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, and successes, whether a persuasive presentation or a promotion, are awarded internal explanations (Walther & Bazarova, 2007).

Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem levels and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.

Overcoming this bias can be difficult because it is at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).

The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).

We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).

Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).

An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.

Anchoring Bias

Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on either pre-existing information or the first piece of information (the anchor) when making a decision.

For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you’re more likely to see the second shirt as cheap than you would if the first shirt you saw cost $120. Here, the price of the first shirt influences how you view the second.

Anchoring Bias Example

Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.

When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.

When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.

Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.

The key elements that demonstrate anchoring bias here are:

  • Sarah establishes an initial reference price based on the first listing she sees ($19k)
  • She uses that initial price as her comparison/anchor for evaluating subsequent prices
  • This biases her perception of the market value of the cars she looks at after the initial anchor is set
  • She makes a purchase decision aligned with her anchored expectations rather than a more objective market value

Multiple theories seek to explain the existence of this bias.

One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, and so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1992).

And when people experience a greater cognitive load (the amount of information the working memory can hold at any given time; for example, a difficult decision as opposed to an easy one), they are more susceptible to the effects of anchoring.

Another theory, selective accessibility, holds that although we assume the anchor is not a suitable answer, when we evaluate the second stimulus (the second shirt), we look for ways in which it is similar to or different from the anchor (such as the large gap in price), resulting in the anchoring effect (Mussweiler & Strack, 1999).

A final theory posits that providing an anchor changes someone’s attitudes to be more favorable to the anchor, which then biases future answers to have similar characteristics as the initial anchor.

Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegner et al., 2001).

The first study that brought this bias to light was during one of Tversky and Kahneman’s (1974) initial experiments. They asked participants to compute the product of numbers 1-8 in five seconds, either as 1x2x3… or 8x7x6…

Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.

They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).

This demonstrates how the initial few calculations influenced the participant’s final answer.
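
The arithmetic behind this experiment is easy to verify. The short sketch below confirms the true product and shows the partial products after the first few steps, which plausibly served as the participants’ anchors:

```python
import math

print(math.factorial(8))     # 40320: the true product of 1 x 2 x ... x 8

# Partial products after the first few multiplications, the likely anchors:
ascending = 1 * 2 * 3 * 4    # 24   -> low anchor; median estimate was 512
descending = 8 * 7 * 6 * 5   # 1680 -> high anchor; median estimate was 2,250
print(ascending, descending)
```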

Availability Bias

Availability bias (also commonly referred to as the availability heuristic) refers to the tendency to think that examples of things that readily come to mind are more common than what is actually the case.

In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.

But instead of being a memory fabrication, it is an overemphasis on a certain memory.

In the workplace, if someone is being considered for a promotion and their boss recalls one bad thing that happened years ago that left a lasting impression, that one event might have an outsized influence on the final decision.

Another common example is buying lottery tickets because the lifestyle and benefits of winning are more readily available in mind (and the potential emotions associated with winning or seeing other people win) than the complex probability calculation of actually winning the lottery (Cherry, 2019).

A final common example that is used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything that is sensationalized by the news, such as serial killers or plane crashes) might make you think that this incident is relatively common even though it is not at all.

Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).

As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.

And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.

Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.

Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.

This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants if more words begin with the letter K or if more words have K as their third letter.

Although many more words have K as their third letter, 70% of participants said that more words begin with K, because such words are not only easier to recall but also fit more closely with the way they see the world (retrieving a word by its first letter is far more natural than by its third).
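
This claim can be checked against any machine-readable word list. The sketch below assumes the standard Unix word list at /usr/share/dict/words (the path is an assumption; substitute any word list), and for K the third-letter count typically comes out higher:

```python
# Count words starting with "k" vs. words with "k" as their third letter.
with open("/usr/share/dict/words") as f:
    words = [w.strip().lower() for w in f]

starts_with_k = sum(w.startswith("k") for w in words)
third_letter_k = sum(len(w) >= 3 and w[2] == "k" for w in words)
print(f"start with k: {starts_with_k}, third letter k: {third_letter_k}")
```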

In terms of the second type of memory, the same duo ran an experiment in 1983, ten years later, in which half the participants were asked to guess the likelihood that a massive flood would occur somewhere in North America, and the other half had to guess the likelihood of a flood occurring due to an earthquake in California.

Although the latter is much less likely, participants still said that this would be much more common because they could recall specific, emotionally charged events of earthquakes hitting California, largely due to the news coverage they receive.

Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.

Inattentional Blindness

A final popular form of cognitive bias is inattentional blindness. This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.

For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.

Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.

Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.

The conspicuity theory holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, so stimuli that don’t fit into one of these two categories might be missed.

The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.

Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.

In other words, an experienced driver might be able to see that car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and unable to process that car swerving in.

A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).

Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.

The most famous study demonstrating the inattentional blindness phenomenon is the invisible gorilla study (Most et al., 2001). Participants were asked to watch a video of two groups passing a basketball and to count how many times the white team passed the ball.

Participants were generally able to report the number of passes accurately, but many failed to notice a gorilla walking directly through the middle of the scene.

Because a gorilla is not expected, and because the brain's resources are consumed by counting passes, we can completely fail to process something right before our eyes.

A real-world example of inattentional blindness occurred in 1995, when Boston police officer Kenny Conley was chasing a suspect and ran past a group of officers who were mistakenly holding down an undercover cop.

Conley was convicted of perjury and obstruction of justice on the theory that he must have seen the fight between the undercover cop and the other officers and lied about it to protect them. He stood by his word that he really hadn't seen it (due to inattentional blindness) and was ultimately exonerated (Pickel, 2015).

The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. It is also important to account for what other people might not notice (if you are that driver, don't assume that others can see you).

By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.

Preventing Cognitive Bias

As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.

From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.

An individual can evaluate his or her own thought process, a practice known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).

This multifactorial process involves (Croskerry, 2003):

(a) acknowledging the limitations of memory, (b) seeking perspective while making decisions, (c) being able to self-critique, and (d) choosing strategies to prevent cognitive error.

Many of the bias-avoidance strategies described here are also known as cognitive forcing strategies: deliberate mental tools used to force unbiased decision-making.

The History of Cognitive Bias

The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).

Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).

As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.

Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.

To do so, the two researchers relied on a research paradigm that presented participants with a reasoning problem that had a computed normative answer (derived from probability theory and statistics).

Participants' responses were then compared with the predetermined solution to reveal systematic deviations in human judgment.

After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).
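
To make the paradigm concrete, here is a minimal sketch of the comparison step under assumed, illustrative numbers (not drawn from any specific experiment): the normative answer is computed with Bayes' rule, and a participant's intuitive estimate is measured against it.

```python
# Compare an intuitive estimate against the normative (Bayesian) answer.
# All numbers below are illustrative assumptions, not data from a study.

def posterior(prior, hit_rate, false_alarm_rate):
    """P(hypothesis | positive evidence) via Bayes' rule."""
    p_evidence = prior * hit_rate + (1 - prior) * false_alarm_rate
    return prior * hit_rate / p_evidence

# A condition with a 1% base rate and a test with a 90% hit rate
# and a 10% false-alarm rate.
normative = posterior(prior=0.01, hit_rate=0.90, false_alarm_rate=0.10)
intuitive = 0.90  # a typical base-rate-neglecting guess

print(f"Normative answer: {normative:.1%}")           # about 8.3%
print(f"Systematic deviation: {intuitive - normative:+.1%}")
```

The gap between the two numbers is the kind of systematic norm violation the heuristics and biases program set out to catalogue.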

Key Takeaways

  • Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
  • These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
  • Confirmation bias, hindsight bias, the mere exposure effect, self-serving bias, the base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, the false consensus effect, and the ecological fallacy are some of the most common examples of cognitive bias.
  • Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
  • Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.

Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.

Casad, B. (2019). Confirmation bias . Retrieved from https://www.britannica.com/science/confirmation-bias

Cherry, K. (2019). How the availability heuristic affects your decision-making . Retrieved from https://www.verywellmind.com/availability-heuristic-2794824

Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you . Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020

Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.

Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once-future things. Organizational Behavior and Human Performance, 13 (1), 1-16.

Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12 (4), 335-352.

Heider, F. (1982). The psychology of interpersonal relations . Psychology Press.

Inman, M. (2016). Hindsight bias . Retrieved from https://www.britannica.com/topic/hindsight-bias

Lang, R. (2019). What is the difference between conscious and unconscious bias? : Faqs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/

Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias . Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/

Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction?. Psychological Bulletin, 82 (2), 213.

Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.

Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.

Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.

Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.

Pickel, K. L. (2015). Eyewitness memory. The handbook of attention , 485-502.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90 (4), 293.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.

Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12 (3), 129-140.

Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.

Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test). From Harvard University.
  • Implicit Association Test. From the Social Psychology Network.
  • Test Yourself for Hidden Bias. From Teaching Tolerance.
  • How The Concept Of Implicit Bias Came Into Being. With Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People. 5:28 minutes; includes transcript.
  • Understanding Your Racial Biases. With John Dovidio, PhD, Yale University. From the American Psychological Association. 11:09 minutes; includes transcript.
  • Talking Implicit Bias in Policing. With Jack Glaser, Goldman School of Public Policy, University of California Berkeley. 21:59 minutes.
  • Implicit Bias: A Factor in Health Communication. With Dr. Winston Wong, Kaiser Permanente. 19:58 minutes.
  • Bias, Black Lives and Academic Medicine. Dr. David Ansell on Your Health Radio (August 1, 2015). 21:42 minutes.
  • Uncovering Hidden Biases. Google talk with Dr. Mahzarin Banaji, Harvard University.
  • Impact of Implicit Bias on the Justice System. 9:14 minutes.
  • Students Speak Up: What Bias Means to Them. 2:17 minutes.
  • Weight Bias in Health Care. From Yale University. 16:56 minutes.
  • Gender and Racial Bias In Facial Recognition Technology. 4:43 minutes.

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27-59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105 (12), e60-e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22 (6), 882-887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49 (4), 210-227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28 (3), 160-174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122 (2), 670-688.


2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection

By Nathan Smith et al.

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. It also involves taking a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

CONNECTIONS

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
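
The dynamic is easy to see in a small sketch (an illustration of the task's logic, not Wason's actual procedure). The hidden rule accepts any ascending triple, while the subject's hypothesis is "each number doubles"; confirming tests all pass, and only a deliberately falsifying test exposes the error.

```python
# Wason's 2-4-6 task in miniature: confirming tests cannot distinguish
# the subject's hypothesis from the real rule; a falsifying test can.

def hidden_rule(a, b, c):
    return a < b < c  # the real rule: any ascending sequence

def my_hypothesis(a, b, c):
    return b == 2 * a and c == 2 * b  # "each number doubles"

confirming_tests = [(1, 2, 4), (3, 6, 12), (5, 10, 20)]
falsifying_test = (1, 2, 3)  # fits the real rule but not the hypothesis

for t in confirming_tests:
    # Both functions return True here, so the hypothesis looks "confirmed".
    print(t, hidden_rule(*t), my_hypothesis(*t))

# The falsifying probe: the rule says True while the hypothesis says False,
# proving that doubling was never the rule.
print(falsifying_test, hidden_rule(*falsifying_test), my_hypothesis(*falsifying_test))
```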

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.


Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is reasoning that assigns something in which you have already invested resources a greater value than it actually has today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
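
The statistical point is easy to verify. Here is a minimal simulation (an editorial illustration, not from the text): even conditioned on a run of five heads, the next flip comes up heads about half the time.

```python
# Simulate fair coin flips and estimate P(heads | five heads in a row).
# Independence means the streak should not move the probability off 0.5.

import random

def prob_heads_after_streak(streak_len=5, trials=2_000_000, seed=1):
    rng = random.Random(seed)
    streak = 0          # current run of consecutive heads
    after = heads = 0   # flips observed immediately after a full streak
    for _ in range(trials):
        flip = rng.random() < 0.5
        if streak >= streak_len:
            after += 1
            heads += flip
        streak = streak + 1 if flip else 0
    return heads / after

p = prob_heads_after_streak()
print(f"P(heads | 5 heads in a row) ~= {p:.3f}")  # close to 0.500
```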

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.


Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?

13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Amy Morin, LCSW, is a psychotherapist and international bestselling author. Her books, including "13 Things Mentally Strong People Don't Do," have been translated into more than 40 languages. Her TEDx talk,  "The Secret of Becoming Mentally Strong," is one of the most viewed talks of all time.


Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.

The Confirmation Bias

The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validate their existing point of view. This often indicates that confirmation bias is "biasing" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, actively seeking out and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see if it is a reliable source.

The Hindsight Bias

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The Anchoring Bias

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The Misinformation Effect

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”  

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The Actor-Observer Bias

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The False Consensus Effect

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem . It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The Halo Effect

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, we are either influenced by or use the halo to influence others almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance , which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The Self-Serving Bias

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The Availability Heuristic

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The Optimism Bias

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events simply won't affect us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK.  An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations .  PLoS ONE . 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review .  BMC Med Inform Decis Mak . 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A., Boo HC. A literature review of anchoring bias .  The Journal of Socio-Economics.  2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF.  Leading questions and the eyewitness report .  Cognitive Psychology . 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect .  J Exp Anal Behav . 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y.  Gender differences of brain activity in the conflicts based on implicit self-esteem .  PLoS ONE . 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM.  Resistance of personal risk perceptions to debiasing interventions .  Health Psychol . 1995;14(2):132–140. doi:10.1037//0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future . Psychophysiology . 2018;55(3). doi:10.1111/psyp.13016

By Kendra Cherry, MSEd

Hindsight Bias

Hindsight bias describes the tendency that people have – once an outcome is known – to believe that they predicted (or could have predicted) an outcome that they did not (or could not) predict. Sometimes referred to as the “knew-it-all-along” effect, it describes times when people conflate an outcome with what they knew at the time. People experiencing hindsight bias “think that they should have known something, or did know something, that would have led them to act differently had they paid more attention to it”, and it is particularly common in survivors of trauma. The Hindsight Bias information handout forms part of the cognitive distortions series, designed to help clients and therapists to work more effectively with common thinking biases.



Related resources

  • Cognitive Distortions – Unhelpful Thinking Styles (Extended)
  • Cognitive Distortions – Unhelpful Thinking Styles (Common)
  • Before I Blame Myself And Feel Guilty

Languages this resource is available in:

  • English (GB)
  • English (US)
  • Portuguese (Brazilian)

Problems this resource might be used to address

  • Grief, loss & bereavement
  • Self-esteem & self-criticism

Techniques associated with this resource

  • Cognitive restructuring
  • Psychoeducation

Mechanisms associated with this resource

  • Cognitive distortion

Introduction & Theoretical Background

A brief introduction to cognitive distortions

Cognitive distortions, cognitive biases, or ‘unhelpful thinking styles’ are the characteristic ways our thoughts become biased (Beck, 1963). We are always interpreting the world around us, trying to make sense of what is happening. Sometimes our brains take ‘shortcuts’ and we think things that are not completely accurate. Different cognitive shortcuts result in different kinds of bias or distortions in our thinking. Sometimes we might jump to the worst possible conclusion (“this rough patch of skin is cancer!”), at other times we might blame ourselves for things that are not our fault (“If I hadn’t made him mad he wouldn’t have hit me”), and at other times we might rely on intuition and jump to conclusions (“I know that they all hate me even though they’re being nice”). These biases are often maintained by characteristic unhelpful assumptions (Beck et al., 1979).

Different cognitive biases are associated with different clinical presentations. For example, catastrophizing is associated with anxiety disorders (Noël et al., 2012), dichotomous thinking has been linked to emotional instability (Veen & Arntz, 2000), and thought-action fusion is associated with obsessive compulsive disorder (Shafran et al., 1996).

Catching automatic thoughts and (re)appraising them is a core component of traditional cognitive therapy (Beck et al, 1979; Beck, 1995; Kennerley, Kirk, Westbrook, 2007). Identifying the presence and nature of cognitive biases is often a helpful way of introducing this concept – clients are usually quick to appreciate and identify with the concept of ‘unhelpful thinking styles’, and can easily be trained to notice the presence of biases in their own automatic thoughts. Once biases have been identified, clients can be taught to appraise the accuracy of these automatic thoughts and draw new conclusions.

Hindsight bias

Hindsight bias is sometimes referred to as the “knew-it-all-along” effect. Once an outcome is known, people with this bias are likely to believe that they predicted (or could have predicted) an outcome that they did not (or could not) predict (Fischhoff, 1975). In other words, people often have a tendency to conflate an outcome with what they knew at the time. People experiencing hindsight bias “think that they should have known something, or did know something, that would have led them to act differently had they paid more attention to it” (Young et al., 2021).

Examples of hindsight bias include:

  • An individual who was on a train that was attacked by suicide bombers states, “I knew I should have got on a different train that morning, I had a funny feeling about it.”
  • A parent whose child died from a rare infection, and who (at the time) had no reason to suspect that their symptoms were anything other than a sore throat, says, “I knew something was wrong that day. If I had done something about it my child would have survived”.
  • A woman whose husband subjected her to domestic violence asserts that, “I knew I shouldn’t have married him. I should have run the moment I met him”.
  • A man who was bullied at work states, “I should never have taken the job – I should have stayed where I was”.

Basic psychological research (e.g., Nestler et al., 2010) suggests that there are three kinds of hindsight bias, which Roese and Vohs (2012) conceptualize as a hierarchy. At the bottom level sits memory distortion, which causes earlier judgements to be misremembered. An intermediate ‘inevitability’ level involves beliefs about the state of the world and the predetermination of events (e.g., “Under the circumstances, no different outcome was possible”). At the top level, ‘foreseeability’ describes beliefs about one’s own knowledge and abilities (e.g., “I knew it would happen”). Clinical approaches for working with hindsight bias might address one or all of these levels.

People who experience hindsight bias may have ‘blind spots’ when it comes to:

  • Identifying alternative causes for an event or a chain of events.
  • Acknowledging or tolerating the feelings of uncertainty when recalling the time which preceded an event.
  • Accepting the doubt inherent in the judgements that people make.
  • Assessing how much influence they had and how responsible they were for events and outcomes.
  • Self-compassion and understanding when things go badly.

As with other cognitive biases, it can be helpful to consider the function of hindsight bias. Some authors propose that hindsight bias is a by-product of the human capacity for adaptive learning (Pohl et al., 2002). Others suggest that hindsight bias results from a ‘need for closure’, arguing that “people have a need to see the world as predictable and find it threatening to believe that many outcomes are at the mercy of unknown, random chance” (Roese & Vohs, 2012). Furthermore, there may be individual differences in people’s predisposition to hindsight bias. For example, evidence suggests that people with dispositionally greater ‘need for control’ or ‘need for closure’ show greater hindsight bias (Campbell et al., 2003; Tykocinski, 2001).

Hindsight bias is associated with a wide range of clinical problems, including:

  • Complicated grief (Fleming & Robinson, 2001; Simon et al., 2017).
  • Depression (Gross et al., 2017).
  • Guilt (Kubany, 1997).
  • Problem gambling (Toneatto, 1999; Toneatto & Gunaratne, 2009).
  • Psychosis (Woodward et al., 2006).
  • Post-traumatic stress disorder (PTSD) (Kubany, 1994).
  • Regret (Blank & Peter, 2010; Gross et al., 2017).
  • Self-criticism (Kubany & Manke, 1995).
  • Survivor guilt (Murray et al., 2021).

Therapist Guidance

Many people struggle with hindsight bias. It sounds as though this might also be relevant to you. Would you be willing to explore it with me?

Clinicians may consider giving clients helpful psychoeducation about automatic thoughts more generally and hindsight bias in particular. Consider sharing some of these important details:

  • Automatic thoughts spring up spontaneously in your mind in the form of words or images.
  • They are often on the ‘sidelines’ of our awareness. With practice, we can become more aware of them. It is a bit like a theatre – we can bring our automatic thoughts ‘centre stage’.  
  • Automatic thoughts are not always accurate: just because you think something, it doesn’t make it true.
  • Hindsight bias is a common type of bias that can show up in our automatic thoughts.
  • Signs that hindsight bias is present might include feelings of guilt, shame, regret, or self-blame. The thoughts that accompany these feelings often contain judgmental descriptions, such as “I knew it…”, “Why didn’t I…”, “I should have…”.
  • Hindsight bias can happen for different reasons. Sometimes it arises when we misremember what we knew or how we felt when we made a decision, but there are other motivations for believing something that is factually untrue. For instance, we might prefer to blame ourselves for events because it increases our sense of control.

Many treatment techniques are helpful for working with hindsight bias:

  • Decentering. Meta-cognitive awareness, or decentering, describes the ability to stand back and view a thought as a cognitive event: as an opinion, and not necessarily a fact (Flavell, 1979). Help clients practice labeling the thinking process rather than engaging with its content. For instance, saying to themselves, “That sounds like hindsight bias again”, whenever they notice this style of thinking.
  • Clarifying what was known and when. Explore what the client thinks they should have known, or should have paid more attention to, at the time of the event, which they believe might have prevented it from occurring.
  • Ask the client to recall the exact moment when they made the decision to act/respond in the way they did. Clarify who was there, what was happening, and sensory details (e.g., what they could see, hear, smell, touch, feel). Ask them to concentrate on the moment they made the decision and state what they thought would happen at that moment.
  • Summarize this information as a “what I thought at the time” statement or position that was understandable given the circumstances of the decision.
  • Creating a clear narrative and filling gaps in memory. Research indicates that some forms of hindsight bias are linked to memory distortions, which result in misrecollection. Clients who have experienced trauma often have gaps in their memory or struggle to recall the particular sequence of events leading up to a key decision or action. They may experience frequent involuntary memories of points where they chose to act (or not act) but fail to recall the sequence of events that led to that choice or action (Hellawell & Brewin, 2004). In this case, ask the client to relive the events in chronological order while helping them recall what they knew at different points in the sequence. Explore how sure they were of different predictions and inferences at each point (e.g., “At the moment when your partner got mad and left the room, what did you think he would do?”, “At that moment, how certain were you that he would die later?”, “What happened next?”).
  • Clarifying beliefs and reasons – saying the unsaid. Guilt and self-blame can be ‘slippery’. It can be helpful to ask the client to make a clear statement about what they feel guilty about, why they feel guilty, and to rate the strength of their belief. Young and colleagues (2021) describe how there is a big difference between the statements “I should not have decided to keep quiet, I should have told my parents after the first time he raped me” and “I should not have decided to keep quiet, I should have told my parents after the first time he raped me, and because I did not say anything, I am responsible for him raping me again”.
  • Assessing responsibility with pie charts. Hindsight bias can lead to excessive responsibility-taking and self-blame. If this is the case, a responsibility pie chart can be used to distribute responsibility more fairly and help clients appreciate that most events have multiple causes (see the first sketch after this list).
  • Discussing issues related to knowledge and blame. Young and colleagues (2021) use a story-telling approach to discuss the roles of knowledge and blame: “Imagine that you have someone staying in your house who has never seen electrical equipment before, perhaps they have always lived very remotely or are an alien from another planet. They are an adult, of normal intelligence, and with normal memory capacity. They come down for breakfast on the first morning that they are staying with you and see you are ironing. They are intrigued by the shiny metal object with a red light on it and touch it with their hand. Would you blame them for the burn they get from touching the iron? Imagine that the next morning the same person comes downstairs, with a bandage on their hand from the day before. Remember, they do not have a memory problem and are of normal intelligence, and they touch the iron again. Would you blame them for the second burn? Why would you blame them for the second burn and not the first? It seems that you are making a judgement about the relationship between what you know and whether you are to blame. Can you tell me more, what do you think is the relationship between knowledge and blame?”
  • Using analogies (such as how courts allocate blame and responsibility). If a client holds themselves to an excessively high standard of accountability, they may find it helpful to reflect on how courts judge responsibility. When courts judge an individual’s intentions or state of mind, they will often consider what the defendant knew at the time. They look to see whether an action happened purposefully (the defendant consciously desired the result), knowingly (the defendant was ‘practically certain’ that the result would happen), recklessly (the defendant consciously disregarded a substantial and unjustifiable risk), or negligently (a ‘reasonable person’ ought to have been aware that there was a substantial and unjustifiable risk). In terms of the burden of proof, a jury must be certain (often ‘beyond reasonable doubt’) before deciding to convict. Accordingly, if the defendant does not meet the thresholds for having acted ‘purposefully’, ‘knowingly’, ‘recklessly’, or ‘negligently’, they are likely to be found not guilty.
  • Acknowledging ‘impossible choices’. When clients blame themselves unfairly, they often believe that there was a better way of acting which they chose not to take. Young and colleagues (2021) describe these as ‘impossible choices’, while Kubany (1994) and Norman et al. (2019) refer to them as ‘Catch 22 guilt’ – the idea that when people are faced with two bad choices (e.g., leaving someone to die or dying oneself), they will usually choose the least-bad outcome. Kubany (1994) recommends asking the client to recall the moment they made their decision as well as all the alternative choices they could have made. Ask the client to consider what they thought were the advantages and disadvantages of each choice at the time (instead of with hindsight), both in the short- and long-term. Waltman (2020) gives an example of helping a client whose daughter was killed to explore – in detail, and using maps drawn on a whiteboard – a number of counterfactual scenarios in which he acted differently. He reported that “this helped him to see that there was literally nothing he could have done in the situation and that in reality he was lucky to have escaped with his life”.
  • Psychoeducation. Hindsight bias may arise from a failure to recall the feelings of uncertainty that accompanied decisions made under stress. Schauer and Elbert (2010) have described the automatic processes that often occur in threatening situations. For example, freezing, fleeing, fighting, pleading, or dissociating can all take place automatically – and for good reason given our evolutionary history and instinct to survive. Trauma survivors may fail to recall how they felt when a critical decision was made. Unfortunately, this failure to recall (or avoidance) increases the likelihood of judging an action with hindsight. Helping clients engage with how they felt emotionally and physiologically when making a critical decision can help them appreciate how severe stress may have impacted their judgment.
  • Surveys. Surveys are helpful when a client is judging themselves according to a strict or harsh standard, and has not sought the opinions of other people regarding their culpability. Survey formats can vary but usually include a short vignette describing the event or decision in question, followed by questions such as “How much do you think this person is to blame for this event?”. Clients should predict how other people will respond to these questions prior to viewing the results (see the second sketch after this list).
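
As a concrete illustration of the responsibility pie chart mentioned in the list above, here is a minimal Python sketch. The causes, percentages, and matplotlib rendering are hypothetical assumptions chosen for illustration, not a prescribed protocol; in session, the client generates the causes and allocates the shares, conventionally rating their own contribution last.

```python
# Minimal sketch of a responsibility pie chart (hypothetical values).
# The client lists every contributing cause and allocates percentage
# shares, rating their own share last so other causes get considered
# first.
import matplotlib.pyplot as plt

contributions = {
    "Rarity of the illness (hard to detect)": 35,
    "Symptoms mimicked a minor complaint": 30,
    "Professional advice to watch and wait": 20,
    "Chance and timing": 10,
    "My own actions": 5,  # the client's share, rated last
}

assert sum(contributions.values()) == 100  # shares must account for 100%

plt.pie(contributions.values(), labels=contributions.keys(), autopct="%d%%")
plt.title("What contributed to the outcome?")
plt.show()
```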
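
Similarly, once survey responses have been collected, comparing them with the client's prediction is simple arithmetic. The sketch below assumes hypothetical data on a 0-10 blame scale; the point of the exercise is the gap between the predicted and observed ratings.

```python
# Minimal sketch: summarizing blame ratings from a survey
# (hypothetical data; 0 = no blame at all, 10 = entirely to blame).
from statistics import mean, median

client_prediction = 8.0  # blame the client expects others to assign
responses = [1, 0, 2, 1, 3, 0, 1, 2, 1, 0]  # respondents' ratings

print(f"Client predicted: {client_prediction:.1f}")
print(f"Observed: mean {mean(responses):.1f}, median {median(responses)}, "
      f"range {min(responses)}-{max(responses)}")
# Output: mean 1.1, median 1.0, range 0-3. The gap between the
# prediction (8.0) and the observed ratings is the discussion point.
```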

References And Further Reading

  • Beck, A. T. (1963). Thinking and depression: I. Idiosyncratic content and cognitive distortions. Archives of General Psychiatry, 9, 324-333. DOI: 10.1001/archpsyc.1963.01720160014002.
  • Beck, A. T., Rush, A. J., Shaw, B. F., & Emery, G. (1979). Cognitive therapy of depression. Guilford Press.
  • Beck, J. S. (1995). Cognitive behavior therapy: Basics and beyond. Guilford Press.
  • Blank, H., & Peters, J. H. (2010). Controllability and hindsight components: Understanding opposite hindsight biases for self-relevant negative event outcomes. Memory & Cognition, 38, 356-365. DOI: 10.3758/MC.38.3.356.
  • Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1, 288-299. DOI: 10.1037/0096-1523.1.3.288.
  • Fleming, S., & Robinson, P. (2001). Grief and cognitive-behavioral therapy: The reconstruction of meaning. In M. S. Stroebe, R. O. Hansson, W. Stroebe, & H. Schut (Eds.), Handbook of bereavement research: Consequences, coping, and care (pp. 647-669). American Psychological Association.
  • Groß, J., Blank, H., & Bayen, U. J. (2017). Hindsight bias in depression. Clinical Psychological Science, 5, 771-788. DOI: 10.1177/2167702617712262.
  • Kubany, E. S. (1994). A cognitive model of guilt typology in combat-related PTSD. Journal of Traumatic Stress, 7, 3-19. DOI: 10.1002/jts.2490070103.
  • Kubany, E. S. (1997). Thinking errors, faulty conclusions, and cognitive therapy for trauma-related guilt. National Center for Post-Traumatic Stress Disorder Clinical Quarterly, 7, 1-4.
  • Kubany, E. S. (1997). Application of cognitive therapy for trauma-related guilt (CT-TRG) with a Vietnam veteran troubled by multiple sources of guilt. Cognitive and Behavioral Practice, 4, 213-244. DOI: 10.1016/S1077-7229(97)80002-8.
  • Kubany, E. S., & Manke, F. P. (1995). Cognitive therapy for trauma-related guilt: Conceptual bases and treatment outlines. Cognitive and Behavioral Practice, 2, 27-61. DOI: 10.1016/S1077-7229(05)80004-5.
  • Murray, H., Kerr, A., Warnock-Parkes, E., Wild, J., Grey, N., Clark, D. M., & Ehlers, A. (2022). What do others think? The why, when and how of using surveys in CBT. The Cognitive Behaviour Therapist, 15, e42. DOI: 10.1017/S1754470X22000393.
  • Murray, H., Pethania, Y., & Medin, E. (2021). Survivor guilt: A cognitive approach. The Cognitive Behaviour Therapist, 14, e28. DOI: 10.1017/S1754470X21000246.
  • Noël, V. A., Francis, S. E., Williams-Outerbridge, K., & Fung, S. L. (2012). Catastrophizing as a predictor of depressive and anxious symptoms in children. Cognitive Therapy and Research, 36, 311-320. DOI: 10.1007/s10608-011-9370-2.
  • Norman, S., Allard, C., Browne, K., Capone, C., Davis, B., & Kubany, E. (2019). Trauma informed guilt reduction therapy: Treating guilt and shame resulting from trauma and moral injury. Academic Press.
  • Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7, 411-426. DOI: 10.1177/1745691612454303.
  • Schauer, M., & Elbert, T. (2010). Dissociation following traumatic stress: Etiology and treatment. Zeitschrift für Psychologie / Journal of Psychology, 218, 109-127.
  • Shafran, R., Thordarson, D. S., & Rachman, S. (1996). Thought-action fusion in obsessive compulsive disorder. Journal of Anxiety Disorders, 10, 379-391. DOI: 10.1016/0887-6185(96)00018-7.
  • Simon, N. M., O’Day, E. B., Hellberg, S. N., Hoeppner, S. S., Charney, M. E., Robinaugh, D. J., … Rauch, S. A. M. (2017). The loss of a fellow service member: Complicated grief in post-9/11 service members and veterans with combat-related posttraumatic stress disorder. Journal of Neuroscience Research, 96, 5-15. DOI: 10.1002/jnr.24094.
  • Toneatto, T. (1999). Cognitive psychopathology of problem gambling. Substance Use and Misuse, 34, 1593-1604. DOI: 10.3109/10826089909039417.
  • Toneatto, T., & Gunaratne, M. (2009). Does the treatment of cognitive distortions improve clinical outcomes for problem gambling? Journal of Contemporary Psychotherapy, 39, 221-229. DOI: 10.1007/s10879-009-9119-3.
  • Veen, G., & Arntz, A. (2000). Multidimensional dichotomous thinking characterizes borderline personality disorder. Cognitive Therapy and Research, 24, 23-45. DOI: 10.1023/A:1005498824175.
  • Waltman, S. H. (2020). Introduction: Why use Socratic questioning? In S. H. Waltman et al., Socratic questioning for therapists and counselors (pp. 1-7). Routledge.
  • Westbrook, D., Kennerley, H., & Kirk, J. (2011). An introduction to cognitive behaviour therapy: Skills and applications (2nd ed.). Sage.
  • Woodward, T. S., Moritz, S., Arnold, M. M., Cuttler, C., Whitman, J. C., & Lindsay, D. S. (2006). Increased hindsight bias in schizophrenia. Neuropsychology, 20, 461-467. DOI: 10.1037/0894-4105.20.4.461.
  • Young, K., Chessell, Z. J., Chisholm, A., Brady, F., Akbar, S., Vann, M., ... & Dixon, L. (2021). A cognitive behavioural therapy (CBT) approach for working with strong feelings of guilt after traumatic events. The Cognitive Behaviour Therapist, 14, e26. DOI: 10.1017/S1754470X21000192.

  • Clearer Thinking Team
  • Mar 30, 2023

A List of Common Cognitive Biases (With Examples)

Updated: Jun 13, 2023

Cognitive biases are patterns of thinking that distort or skew information processing, often leading to errors. These biases often occur when we make a quick decision using intuition or heuristics, which are simple rules or shortcuts that we use to make decisions and solve problems quickly without necessarily considering all available information.

While human intuition is extremely useful for many things, and should not simply be ignored, there are also plenty of known situations in which using our intuition or "going with our gut" systematically leads us to inaccurate conclusions and unhelpful behaviors.

In the early 1970s, cognitive psychologists Amos Tversky and Daniel Kahneman introduced the term 'cognitive bias' after studying how people's reliance on heuristics distorts judgment and problem-solving. Since then, cognitive psychology has demonstrated that cognitive biases occur systematically and universally and are involuntary: no one is totally immune to them.

List of the most common cognitive biases

Here, we list many of the most common cognitive biases. We strongly recommend reading the second part of this article, where we answer popular questions and clarify common misunderstandings about the topic.

Ambiguity Effect

The Ambiguity Effect is a cognitive bias whereby people who are faced with a decision tend to pick an option for which they know the probability of a good outcome, rather than an option for which the probability of a good outcome is unknown or ambiguous. This may occur even if the known probability is low and picking it isn't the best strategy.
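
To make the "isn't the best strategy" claim concrete, here is a minimal expected-value sketch in Python. The payoffs and probabilities are hypothetical assumptions chosen for illustration: under genuine ignorance of the ambiguous option's probability, treating every value as equally likely can make it the better bet on average.

```python
# Minimal sketch of the Ambiguity Effect (hypothetical numbers).
# Option A: known 30% chance of winning $100.
# Option B: unknown chance of winning $100, equally likely to be
# anything from 0% to 100% (uniform ignorance).

ev_known = 0.30 * 100

scenarios = [i / 100 for i in range(101)]  # candidate probabilities for B
ev_ambiguous = sum(p * 100 for p in scenarios) / len(scenarios)

print(f"EV of known option:     ${ev_known:.2f}")      # $30.00
print(f"EV of ambiguous option: ${ev_ambiguous:.2f}")  # $50.00
# On average the ambiguous option is worth more, yet the Ambiguity
# Effect predicts many people still pick the known 30% option.
```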

Anchoring Bias

Anchoring Bias occurs when a person's expectation about one thing is affected by something mostly or entirely irrelevant they saw, heard, or thought before, such as an irrelevant number. In other words, it occurs when a person's beliefs or behaviors are influenced by a specific piece of information far more than they should be given how much evidence that information actually provides.

Attention Bias

Attention Bias occurs when some information or evidence holds a disproportionate amount of a person's attention because of that person's environment or history, or because of people's natural instincts.

Availability Bias

The Availability Bias occurs when someone's prediction about an event's frequency or probability is unduly influenced by how easily they can recall examples of that event. We have a whole mini-course about combating availability bias.

Bias Blind Spot

A Bias Blind Spot is a tendency to see oneself as being less biased or less susceptible to biases (such as those listed in this article) than others in the population.

Choice-Supportive Bias

Choice-Supportive Bias is a cognitive bias whereby someone who has chosen between different options later remembers the option that they chose as having more positive attributes than it did at the time (while they remember options they did not choose as having more negative attributes than they'd had at the time).

Confirmation Bias

Confirmation Bias refers to a tendency for people to seek out, favor, or give more weight to information that confirms their preconceptions or hypotheses (even if the information isn't true) than information that contradicts their prior beliefs.

Denomination Effect

The Denomination Effect is a cognitive bias whereby people tend to be more likely to spend a given amount of money if it is composed of smaller individual sums than if it is composed of larger individual sums.

Hindsight Bias

Hindsight Bias refers to a tendency to perceive past events as being more predictable than they were before they took place.

Optimism Bias

Optimism Bias is the tendency to be unduly optimistic about the probability of future good and bad events, overestimating the probability of positive ones while underestimating the probability of negative ones.

Motivated Reasoning

Motivated reasoning occurs when you are disposed to interpret new evidence in ways that support your existing beliefs, or that lead to the outcome you wish was true, even when that evidence doesn't truly support your beliefs.

Frequently Asked Questions (FAQ) about cognitive biases

What are the types of bias?

There are three main types of bias.

1. Explicit biases are prejudiced beliefs regarding a group of people or ways of living. Racism, sexism, religious intolerance, and LGBTQ-phobias are examples of explicit biases. If you think that all people of group X are inferior, then you have an explicit bias against people of group X.

2. Implicit biases are unconscious beliefs that lead people to form opinions or judgments, often without being fully aware they hold the unconscious beliefs. If you subtly distrust people of group X without even realizing you're doing it, then you have an implicit bias against people of group X.

3. Cognitive biases differ from explicit and implicit biases: they are a group of systematic patterns in how our beliefs, judgments, and actions differ from what they would be if we were completely rational. If most people systematically misjudge a certain type of information in such a way that they come to false conclusions, then people have a cognitive bias related to that type of information.

How many cognitive biases are there?

There is no consensus among academics regarding how many cognitive biases exist. Some have found ~40, others find >100, and Wikipedia lists over 180.

What are the common causes of cognitive bias?

As we’ve seen above, cognitive biases often appear when one is faced with a decision and has limited resources (such as time, understanding, and cognitive capacity).

For instance, when buying a banana, you can't consider every single possible other use of that money to determine whether a banana is truly the single best use. You are limited in both how much time you have to think and how much total cognitive capacity you have.

Using fast heuristics or relying on our intuition is often an effective way of coming to conclusions in these situations because such approaches require fewer resources than careful thinking. While our intuition is often reliable, there are certain cases where our intuitions systematically produce inaccurate beliefs and unhelpful behaviors; these are what we refer to as "cognitive biases".

Even when we have plenty of time to think and aren't hitting a limit on our cognitive resources, people can still be prone to cognitive biases. For instance, there are certain automatic rules of thumb that our minds evolved to use since they worked quite well for the survival of our ancestors. Unfortunately, these rules of thumb can sometimes lead us to false conclusions and unhelpful behaviors in the modern world.

Is cognitive bias a good or bad thing?

Cognitive biases are not good or bad in themselves. They are an unavoidable effect of not having infinite intelligence and infinite time to think, and hence the need to rely on heuristics and intuition. We call a tendency a cognitive bias when it leads to systematic inaccuracies in our beliefs or unhelpful behaviors. In that sense, by definition, cognitive biases cause systematic problems.

However, cognitive biases do not always lead to negative outcomes in every instance. For instance, overconfidence may cause a person to try something very difficult that they ultimately succeed at. On the other hand, for every one person who succeeds due to overconfidence, there may be multiple other people who try something unrealistic due to overconfidence and end up failing.

How do you identify cognitive biases?

Recognizing that you, like everyone else, are susceptible to cognitive biases is a great first step to identifying them in yourself, though this awareness alone is often not sufficient. Once you've accepted that, it can be helpful to get to know the most common cognitive biases (such as the ones presented above) so that you can look out for them in your own thinking.

Can you avoid cognitive bias?

Yes and no. It is possible to reduce the influence of cognitive biases on your thinking (and this can be very beneficial!). So you may be able to avoid a cognitive bias in many particular instances. But it's not possible to completely remove all of your cognitive biases.

How do you overcome cognitive biases?

Unfortunately, it’s impossible to overcome all of your cognitive biases completely. However, that doesn’t mean you can’t do anything. A good first step on the path to getting your cognitive biases under control is familiarizing yourself with them.

Here are a few of our interactive tools that might help:

The Planning Fallacy

The Sunk Cost Fallacy

Improve Your Frequency Predictions

Political Bias Test

Rhetorical Fallacies

Are You Overconfident?

Calibrate Your Judgement

How Rational Are You, Really?

Mental Traps

However, just knowing about your cognitive biases isn’t enough. You need to take action! Here are some practical steps we recommend:

Biases such as overconfidence, confirmation bias, and the illusion of control can be reduced or avoided by having multiple points of view. Surrounding yourself with, and listening to, people who have diverse experiences, belief systems, and expertise reduces the chances of falling into one of these biases. The same goes for sources of information: you are less likely to fall into a cognitive bias if you seek out additional, and even conflicting, data sources.

Actively seeking evidence against your current point of view (on important decisions) can be a helpful way to combat biases like overconfidence, confirmation bias, and motivated reasoning.

Another strategy, recommended by researchers who studied cognitive biases in physicians, is to consciously consider the options you dismissed at first, so you can reach a more considered answer.

What is a cognitive vs. an emotional bias?

Emotional biases can be considered a subcategory of cognitive biases. What separates them from other cognitive biases is that they are based on emotions such as anger, disgust, fear, happiness, sadness, and surprise. When we're experiencing emotions, we may act in a biased way that is concordant with that emotion. For instance, anxiety may cause us to overestimate the chance of something being dangerous.

Emotional biases are linked to emotional dispositions (commonly known as ‘temperament’). Different emotional dispositions may even lead to different emotional reactions to the same events.

Emotional biases may help us explain optimism and pessimism biases.

How do cognitive biases affect critical thinking?

Cognitive biases interfere with impartiality, and they can negatively impact critical thinking in a myriad of different ways. Here are several:

Motivated reasoning leads us to underestimate the arguments for conclusions we don’t believe in and overestimate the arguments for conclusions we want to believe;

Availability bias messes with our critical thinking because it leads us to assess risk by how readily examples come to mind, rather than by considering all of the relevant examples;

We are also prone to blind spot bias, meaning that we are less likely to identify biases in our own judgment than in other people's.

How do cognitive biases affect decision-making?

Cognitive biases affect decision-making in at least two ways: they help decision-making by speeding it up and cutting corners when we have limited time or cognitive power, but they also hinder decision-making by causing us to come to false conclusions or take unhelpful actions in certain cases.

Is gender a factor for cognitive biases?

Research has shown some correlation between gender or sex and specific biases. For instance, researchers found that male investors tend to show greater overconfidence and optimism biases, while female investors tend to exhibit more anchoring and hindsight biases. The research makes no claims about what causes such gendered differences - e.g., socialization or biology or a mix of both.

Are gender stereotypes cognitive bias?

Gender stereotypes are explicit biases, which means they are not cognitive biases. However, there are many cognitive biases that involve gender stereotypes. For example, masculine bias is the tendency to assume a person is male after hearing gender-neutral information about them, and the tendency to use gender as a descriptor only when describing women.

Gender stereotypes are also a sign of binary thinking .

Do cognitive biases cause depression?

Research has shown some cognitive biases are correlated with depression . This has been found to be the case for negative interpretation bias (the tendency to interpret ambiguous scenarios as negative) and pessimistic biases, which lead people to predict future situations as unrealistically negative.

Cognitive behavioral therapy is based on the assumption that individuals with depression have distorted negative beliefs about themselves or the world (known in CBT as "cognitive distortions").

Are cognitive biases scientific (is their existence scientifically proven)?

Yes. They have been studied since the early 1970s by cognitive psychologists, sociologists, and behavioral economists.

Do scientists exhibit cognitive biases?

Just like every other human being, scientists can exhibit cognitive biases. They may exhibit overconfidence bias or fall prey to selection biases, for example. This has been researched as it relates to the replication crisis social psychology faces today.

There is even research on the presence of cognitive biases in scientific contexts and occurring within academic publications. Nobody, not even scientists, is immune to cognitive biases!

Are cognitive biases learned? Or are we born with cognitive biases?

Both. We are born with a tendency for some cognitive biases, but we can also learn specific aspects of these biases. Our brains have evolved to be prone to all sorts of cognitive biases because those biases have been helpful in the survival of our ancestors in the environment (and under the constraints) in which they lived.

But the details of some specific cognitive biases are learned as we move through the world. For example, humans have evolved a tendency to engage in motivated reasoning, but which conclusions motivate your reasoning is something you aren’t born with; it is shaped by your experiences and learning.

Keep learning by trying our mini-course on Mental Traps

Want to understand cognitive biases on a deeper level? Learn about a few of the mind's mistakes with our interactive introduction to cognitive biases!

Christopher Dwyer Ph.D.

12 Common Biases That Affect How We Make Everyday Decisions

Make sure that the decisions that matter are not made based on bias.

Posted September 7, 2018 | Reviewed by Matt Huston

  • Confirmation bias means that people favor ideas that confirm their existing beliefs.
  • People overestimate the likelihood of positive outcomes if they're in a good mood, which is optimism bias.
  • Declinism refers to a bias in favor of the past, due to a resistance to change.

Though the concept of illusory superiority arguably dates back to Confucius and Socrates, it may come as a shock that its discussion in the form of the Dunning-Kruger Effect is almost 20 years old; and though it may simply be a result of an echo chamber created through my own social media, it seems to be popping up quite frequently in the news and posts that I’ve been reading lately—even through memes. For those of you unfamiliar with the phenomenon, the Dunning-Kruger Effect refers to a cognitive bias in which individuals with a low level of knowledge in a particular subject mistakenly assess their knowledge or ability as greater than it is. Similarly, it also refers to experts underestimating their own level of knowledge or ability.

But, then again, maybe it’s not my echo chamber—maybe it is part and parcel of our new knowledge economy (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014) and the manner in which we quickly and effortlessly process information (right or wrong) with the help of the internet. In any case, given the frequency with which I seem to have encountered mention of this cognitive bias lately, coupled with the interest in my previous blog post “18 Common Logical Fallacies and Persuasion Techniques,” I decided it might be interesting to compile a similar list—this time, one of cognitive biases.

A cognitive bias refers to a "systematic error" in the thinking process. Such biases are often connected to a heuristic, which is essentially a mental shortcut—heuristics allow one to make an inference without extensive deliberation and/or reflective judgment, given that they are essentially schemas for such solutions (West, Toplak, & Stanovich, 2008). Though there are many interesting heuristics out there, the following list deals exclusively with cognitive biases. Furthermore, these are not the only cognitive biases out there (e.g., there’s also the halo effect and the just world phenomenon); rather, they are 12 common biases that affect how we make everyday decisions, from my experience.

1. The Dunning-Kruger Effect

Building on the explanation of this effect above: experts are often aware of what they don’t know and (hopefully) engage their intellectual honesty and humility in this fashion. In this sense, the more you know, the less confident you're likely to be—not out of lacking knowledge, but due to caution. On the other hand, if you know only a little about something, you see it simplistically—biasing you to believe that the concept is easier to comprehend than it may actually be.

2. Confirmation Bias

Just because I put the Dunning-Kruger Effect in the number one spot does not mean I consider it the most commonly engaged bias—it is an interesting effect, sure; but in my critical thinking classes, the confirmation bias is the one I constantly warn students about. We all favour ideas that confirm our existing beliefs and what we think we know. Likewise, when we conduct research, we all suffer from trying to find sources that justify what we believe about the subject. This bias brings to light the importance of, as I discussed in my previous post on " 5 Tips for Critical Thinking ," playing devil’s advocate . That is, we must overcome confirmation bias and consider both sides (or, if there are more than two, all sides) of the story. Remember, we are cognitively lazy—we don’t like changing our knowledge (schema) structures and how we think about things.

3. Self-Serving Bias

Ever fail an exam because your teacher hates you? Ever go in the following week and ace the next one because you studied extra hard despite that teacher? Congratulations, you’ve engaged the self-serving bias. We attribute successes and positive outcomes to our doing, basking in our own glory when things go right; but, when we face failure and negative outcomes, we tend to attribute these events to other people or contextual factors outside ourselves.

4. The Curse of Knowledge and Hindsight Bias

Similar in ways to the availability heuristic (Tversky & Kahneman, 1974) and to some extent, the false consensus effect , once you (truly) understand a new piece of information, that piece of information is now available to you and often becomes seemingly obvious. It might be easy to forget that there was ever a time you didn’t know this information and so, you assume that others, like yourself, also know this information: the curse of knowledge . However, it is often an unfair assumption that others share the same knowledge. The hindsight bias is similar to the curse of knowledge in that once we have information about an event, it then seems obvious that it was going to happen all along. I should have seen it coming!

5. Optimism/Pessimism Bias

As you probably guessed from the name, we have a tendency to overestimate the likelihood of positive outcomes, particularly if we are in good humour, and to overestimate the likelihood of negative outcomes if we are feeling down or have a pessimistic attitude. In either the case of optimism or pessimism , be aware that emotions can make thinking irrational. Remember one of my " 5 Tips for Critical Thinking ": Leave emotion at the door.

6. The Sunk Cost Fallacy

Though labeled a fallacy, I see "sunk cost" as just as much in tune with bias as faulty thinking, given the manner in which we think in terms of winning, losing, and breaking even. For example, we generally believe that when we put something in, we should get something out—whether it’s effort, time, or money. With that, sometimes we lose… and that’s it—we get nothing in return. A sunk cost refers to something lost that cannot be recovered. Our aversion to losing (Kahneman, 2011) makes us irrationally cling to the idea of regaining even though it has already been lost (known in gambling as chasing the pot —when we make a bet and chase after it, perhaps making another bet to recoup the original [and hopefully more] even though, rationally, we should consider the initial bet as out-and-out lost). The appropriate advice of cutting your losses is applicable here.
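
A small expected-value sketch (hypothetical numbers, not drawn from the author) shows why the lost stake should drop out of the next decision:

```python
# Minimal sketch (hypothetical numbers): a sunk cost should not
# change which choice is better next.
sunk = 100.0                      # already lost under either choice
p_win, payout, stake = 0.4, 50.0, 25.0

ev_stop = -sunk                                # walk away
ev_chase = -sunk + (p_win * payout - stake)    # place another bet

print(f"Stop: {ev_stop:.0f}  Chase: {ev_chase:.0f}")  # -100 vs -105
# The 100 appears in both totals, so the comparison depends only on
# the new bet's own expected value (0.4 * 50 - 25 = -5). Chasing the
# pot is justified only if the new bet is positive-EV on its own.
```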

7. Negativity Bias

Negativity bias is not totally separate from pessimism bias , but it is subtly and importantly distinct. In fact, it works according to similar mechanics as the sunk cost fallacy in that it reflects our profound aversion to losing. We like to win, but we hate to lose even more. So, when we make a decision, we generally think in terms of outcomes—either positive or negative. The bias comes into play when we irrationally weigh the potential for a negative outcome as more important than that of a positive outcome.

8. The Decline Bias (a.k.a. Declinism)

You may have heard the complaint that the internet will be the downfall of information dissemination; but, Socrates reportedly said the same thing about the written word. Declinism refers to a bias in favour of the past over and above "how things are going." Similarly, you might know a member of an older generation who prefaces grievances with, "Well, back in my day" before following up with how things are supposedly getting worse. The decline bias may result from something I’ve mentioned repeatedly in my posts—we don’t like change. People like their worlds to make sense, they like things wrapped up in nice, neat little packages. Our world is easier to engage in when things make sense to us. When things change, so must the way in which we think about them; and because we are cognitively lazy (Kahneman, 2011; Simon, 1957), we try our best to avoid changing our thought processes.

9. The Backfire Effect

The backfire effect refers to the strengthening of a belief even after it has been challenged. Cook and Lewandowsky (2011) explain it very well in the context of changing people’s minds in their Debunking Handbook. The backfire effect may work based on the same foundation as Declinism, in that we do not like change. It is also similar to negativity bias, in that we wish to avoid losing and other negative outcomes—in this case, one’s idea is being challenged or rejected (i.e. perceived as being made out to be "wrong") and thus, they may hold on tighter to the idea than they had before. However, there are caveats to the backfire effect—for example, we also tend to abandon a belief if there's enough evidence against it with regard to specific facts.

10. The Fundamental Attribution Error

The fundamental attribution error is similar to the self-serving bias , in that we look for contextual excuses for our failures, but generally blame other people or their characteristics for their failures. It also may stem from the availability heuristic in that we make judgments based only on the information we have available at hand.

One of the best textbook examples of this integrates stereotyping: Imagine you are driving behind another car. The other driver is swerving a bit and unpredictably starts speeding up and slowing down. You decide to overtake them (so as to no longer be stuck behind such a dangerous driver) and as you look over, you see a female behind the wheel. The fundamental attribution error kicks in when you make the judgment that their driving is poor because they’re a woman (also tying on to an unfounded stereotype). But what you probably don’t know is that the other driver has three children yelling and goofing around in the backseat, while she’s trying to get one to soccer, one to dance, and the other to a piano lesson. She’s had a particularly tough day and now she’s running late with all of the kids because she couldn’t leave work at the normal time. If we were that driver, we’d judge ourselves as driving poorly because of these reasons, not because of who we are. Tangentially, my wife is a much better driver than I am.

11. In-Group Bias

As we have seen through consideration of the self-serving bias and the fundamental attribution error , we have a tendency to be relatively kind when making judgments about ourselves. Simply, in-group bias refers to the unfair favouring of someone from one’s own group. You might think that you’re unbiased, impartial, and fair, but we all succumb to this bias, having evolved to be this way. That is, from an evolutionary perspective, this bias can be considered an advantage—favouring and protecting those similar to you, particularly with respect to kinship and the promotion of one’s own line.

12. The Forer Effect (a.k.a. The Barnum Effect)

As in the case of Declinism , to better understand the Forer effect (commonly known as the Barnum Effect ), it’s helpful to acknowledge that people like their world to make sense. If it didn’t, we would have no pre-existing routine to fall back on and we’d have to think harder to contextualise new information. With that, if there are gaps in our thinking of how we understand things, we will try to fill those gaps in with what we intuitively think makes sense, subsequently reinforcing our existing schema(s). As our minds make such connections to consolidate our own personal understanding of the world, it is easy to see how people can tend to process vague information and interpret it in a manner that makes it seem personal and specific to them. Given our egocentric nature (along with our desire for nice, neat little packages and patterns), when we process vague information, we hold on to what we deem meaningful to us and discard what is not. Simply, we better process information we think is specifically tailored to us, regardless of ambiguity. Specifically, the Forer effect refers to the tendency for people to accept vague and general personality descriptions as uniquely applicable to themselves without realizing that the same description could be applied to just about everyone else (Forer, 1949). For example, when people read their horoscope, even vague, general information can seem like it’s advising something relevant and specific to them.

While heuristics are generally useful for making inferences by providing us with cognitive shortcuts that help us stave off decision fatigue, some forms of heuristics can make our judgments irrational. Though various cognitive biases were covered in this post, these are by no means the only biases out there—just the most commonly engaged, in my experience, with respect to everyday decision-making. If you’re interested in learning more about these and other cognitive biases, I recommend checking out yourbias.is. Remember, we make thousands of decisions every day, some more important than others. Make sure that the ones that do matter are not made based on bias, but rather on reflective judgment and critical thinking.

Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with foreword by former APA President, Dr. Diane F. Halpern.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.

Forer, B. R. (1949) "The Fallacy of Personal Validation: A classroom Demonstration of Gullibility," Journal of Abnormal Psychology, 44, 118-121.

Kahneman, D. (2011). Thinking fast and slow. Penguin: Great Britain.

Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121–1134.

Simon, H. A. (1957). Models of man. New York: Wiley.

Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.

West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930–941.

Christopher Dwyer Ph.D.

Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.

Hindsight Bias: Understanding The Feeling & Using It Productively For Future Relationships

There is no such thing as water under the bridge when it comes to romance. A relationship entails a significant degree of investment on all fronts, which is why we tend to give it more credit than it merits both before and after a breakup. Reflecting on where a relationship went wrong helps us understand ourselves more and make better decisions moving forward. But looking back isn't always the healthiest thing to do post-breakup.

After exiting a relationship, many of us fall into the destructive habit of rummaging through our memories to find all the red flags we think we missed, and we are not satisfied until we get our hands on details that are even remotely indicative of the romance's doomed fate. These include "he's always playing chess. Of course, he has the checkerboard mind of a manipulative narcissist," or "he's a Scorpio and I'm a Gemini. It's written in the stars that we wouldn't last."

All at once, every memory becomes a missed red flag under the breakup goggles. You convince yourself your relationship was doomed to fail and you knew it. Then, you start blaming yourself for not seeing it sooner and convince yourself you'll do better next time because you've developed a sixth sense for predicting if a relationship will work. If this is you, you're dealing with a psychological phenomenon known as hindsight bias. Here's what to know about it.

Hindsight Bias Is Choosing To See The Old In A New Light

According to a study published by the American Psychological Association , hindsight bias is a psychological phenomenon that "occurs when outcome information distorts people's memories of past beliefs or exaggerates perceptions of outcomes' foreseeability or inevitability." In other words, people who engage in hindsight bias, when reflecting on an event in the past, convince themselves they knew something all along even though they couldn't. This "I knew it wouldn't last" mindset causes them to become overconfident in their ability to recognize patterns and predict the future, putting themselves at risk of making wrong decisions.

Hindsight bias is seen in many contexts, from relationships to investments to natural disasters. When an unfortunate event occurs abruptly, the natural human instinct is to find coherence and meaning in the happenstance in an effort to regain a sense of control and avoid going through the same problem again. People search high and low in their memory bank, cherry-picking details that match the event, connecting only the dots that make sense to them, thereby skewing their perception of the experience. As a result, they exaggerate the event's foreseeability. Although it's understandable for us to try to make sense of what happened, hindsight bias can cause us to have a negative, if not flawed, perception of our current or future relationships.

How To Avoid Hindsight Bias

People who are most likely to engage in hindsight bias are those who evaluate everything through the lens of cause and effect and who find comfort in thinking everything is predictable. "He did this because I did that first," we think. The "you reap what you sow" principle doesn't always apply in relationships, though.

For instance, if you are a patient and loving person, how could you have predicted your partner would cheat on you? Then, you start convincing yourself that people walk all over you because you're too nice to them, and, as a result, you change 180 degrees when you enter a new relationship, thinking it should work this time. The truth is, life can be hit-and-miss, and there's no way to tell what's ahead.

One way to deal with hindsight bias is to consider alternative scenarios to acquire a more realistic, balanced picture of the situation. A big part of hindsight is drawing a conclusion from selective memories. You can prevent all the second-guessing and revising by keeping a journal with a regular record of the good, the bad, and the ugly memories, so you can revisit the entries and ensure you're seeing an accurate reflection of a past event. The idea is similar to that of a serendipity journal. This will help you learn from your mistakes and make better decisions moving forward.
