J Grad Med Educ. 2021 Apr;13(2).

Defeating Unconscious Bias: The Role of a Structured, Reflective, and Interactive Workshop

Dotun Ogunyemi

Dotun Ogunyemi, MD, FACOG, MFM, is Chief Diversity Officer and Professor of Medical Education, Obstetrics & Gynecology, California University of Science and Medicine, and Designated Institutional Official and Associate Chief Medical Officer, Arrowhead Regional Medical Center

Abstract

Background

Unconscious or implicit biases are universal and detrimental to health care and the learning environment but can be corrected. Historical interventions have used the Implicit Association Test (IAT), which may have limitations.

Objective

We determined the efficacy of an implicit bias training that does not use the IAT.

Methods

From April 2019 to June 2020, a 90-minute educational workshop was attended by students, residents, and faculty. The curriculum included an interactive presentation on unconscious biases, video clips of vignettes demonstrating the workplace impact of unconscious biases along with strategies to counter them, and reflective group discussions. The evaluation included pre- and postintervention surveys. To unmask implicit bias, participants were shown images of 5 individuals and recorded first impressions regarding trustworthiness and presumed profession.

Results

Of approximately 273 participants, 181 were given the survey, of which 103 (57%) completed it, with significant increases from pre- to postintervention assessments for perception scores (28.87 [SEM 0.585] vs 32.73 [0.576], P < .001) and knowledge scores (5.68 [0.191] vs 7.22 [0.157], P < .001). For a White male physician covered in tattoos, only 2% correctly identified him as a physician, and 60% felt he was untrustworthy. For a smiling Black female astronaut, only 13% correctly identified her as an astronaut. For a brooding White male serial killer, 50% found him trustworthy.

Conclusions

An interactive unconscious bias workshop, performed without the use of an IAT, was associated with increases in perceptions and knowledge regarding implicit biases. The findings also confirmed inaccurate first impression stereotypical assumptions based on ethnicity, outward appearances, couture, and media influences.

We determined whether implicit bias training without the Implicit Association Test (IAT) is feasible.

A brief interactive workshop without using IAT can increase knowledge and perceptions of implicit bias and introduce the principle of intersectionality.

Limitations

External generalizability was limited by selection and participation bias.

Bottom Line

A brief interactive implicit bias workshop intervention can be used to train residents, other learners, faculty, and coordinators in the medical education continuum.

Introduction

Unconscious or implicit biases are attitudes or stereotypes that arise from preformed mental associations, which influence our understanding, actions, and decisions in an unconscious manner. 1 Unconscious biases are universal and have adverse consequences for the workplace, health care, and the learning environment. 2–4 Studies show that clinicians' negative implicit biases correlate with poorer quality of care, inadequate clinician-patient communication, and health care disparities and inequities. 3–8 Unconscious biases adversely affect faculty recruitment and promotion, including the persistent underrepresentation of Black Americans and other minorities in medicine, further exacerbating racial health care disparities. 9,10 Unconscious bias has been shown to be malleable and correctable with training. 2,10 Consequently, strategies to mitigate unconscious bias are needed in medical education. In previously reported unconscious bias trainings, the Implicit Association Test (IAT) is ubiquitous. 10 Studies have shown that IATs may induce defensiveness, triggering denial of bias and of the existence of health disparities. 11 Critics suggest that, instead of reflecting authentic negative attitudes, IAT scores may stem from other associations such as victimization, maltreatment, and oppression. 11,12 Authors of the IATs have noted that the tool may not reflect actual biases or acts of discrimination related to identified preferences. 4 The objectives of this study were therefore to determine (1) whether a brief educational workshop can increase knowledge and perceptions regarding unconscious bias, and (2) whether inaccurate first impressions can be elicited without the IAT.

Methods

This was a retrospective study of an educational workshop presented from April 2019 to June 2020. The workshop was developed from the knowledge gained by the author on completing the Association of American Medical Colleges Healthcare Executive Diversity and Inclusion Certificate (provided as online supplementary data). Kern's 6-step approach for curriculum development was used. 12 The conceptual framework utilized was “situated learning-guided participation,” in which didactic and interactive activities facilitate independent learning. 13

The 90-minute educational workshop included an interactive presentation on unconscious bias. To briefly demonstrate implicit bias, participants were rapidly shown images of 5 individuals in succession and recorded their first impressions of each person regarding trustworthiness and presumed profession. The workshop also taught intersectionality, a theoretical framework conceptualizing that multiple social categories (eg, race, gender, sexual orientation, poverty) intersect to reflect multiple interlocking systems of privilege and oppression at the social-structural level (eg, racism, sexism, heterosexism). 14

The workshop utilized video clips of situational vignettes to demonstrate the impact of unconscious bias. Participants reflected on experiences of unconscious bias and mitigating strategies in small groups (Table 1). The workshop was presented at the 2019 CREOG & APGO Annual Meeting in New Orleans. Subsequently, it was presented in multiple voluntary sessions to medical students, residents, and faculty in the internal medicine, family medicine, psychiatry, and obstetrics and gynecology departments at California University of Science and Medicine and Arrowhead Regional Medical Center.

Table 1. Agenda for the Unconscious Bias Reflective and Interactive Workshop

A survey consisting of 9 perception and 11 knowledge questions on implicit bias was assessed for clarity and reliability by content experts and repeat testing. The survey was completed pre- and posteducational workshop to assess short-term learning (provided as online supplementary data). The survey was not offered to the incoming class of 92 medical students because of time constraints of the orientation schedule.

Statistical analysis was performed using SPSS 21.0 (IBM Corp, Armonk, NY). Student's t tests were performed with calculation of 95% confidence intervals and odds ratios, with a P value of .05 considered significant. The first-impressions data were tabulated, and percentages of correct responses reported.
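
As a rough illustration only (the study used SPSS, and the scores below are made-up values, not the study data), the paired pre/post comparison described above can be sketched in a few lines of Python:

```python
# Minimal sketch of a pre/post paired analysis, assuming one pre- and one
# post-workshop score per participant. Values are hypothetical.
import numpy as np
from scipy import stats

pre = np.array([27, 30, 25, 33, 29, 28, 31, 26])    # hypothetical perception scores
post = np.array([31, 34, 30, 36, 33, 31, 35, 29])

t_stat, p_value = stats.ttest_rel(post, pre)         # paired Student's t test

diff = post - pre
ci_low, ci_high = stats.t.interval(
    0.95, len(diff) - 1, loc=diff.mean(), scale=stats.sem(diff)
)  # 95% confidence interval for the mean change

print(f"mean change = {diff.mean():.2f}, t = {t_stat:.2f}, P = {p_value:.4f}")
print(f"95% CI for the change: [{ci_low:.2f}, {ci_high:.2f}]")
```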

The study was approved by the Institutional Review Board of California University of Science and Medicine.

Results

Of the 181 participants who were given the survey, 103 (57%) completed it, including 28 (36%) females, 49 (64%) males, and 26 with missing gender data. Twenty-three (22%) had previously taken the IATs, while 24 (22%) had previous implicit bias training. There were 61 (59%) physician faculty, 24 (23%) residents, 4 (4%) program coordinators, and 2 (2%) students. Medical specialties included 33 (38%) obstetrics and gynecology, 33 (38%) family medicine, 9 (10%) internal medicine, and 11 (13%) psychiatry. Sixty-three (61%) participants attended workshops in San Bernardino, California, while 40 participated at the CREOG & APGO conference.

The results of testing for first impressions revealed that for a White male physician and community advocate covered in tattoos and dressed in jeans, 2% correctly identified him as a physician. For a smiling Black woman astronaut, 13% correctly identified her as an astronaut. For a brooding White male serial killer, 50% found him trustworthy. For a Cameroonian attorney, many incorrectly assumed she was Maya Angelou and thus labeled her a writer (Table 2).

Table 2. Participants' First Impressions Regarding Trustworthiness and Likely Profession of Images of 5 Individuals Shown in Rapid Succession

Note: Results are percentages of participants who completed the first-impression surveys (N = 91).

There were significant increases from pre- to postintervention assessments for the total perception scores (28.87 [SEM 0.585] vs 32.73 [0.576], P < .001) and total knowledge scores (5.68 [0.191] vs 7.22 [0.157], P < .001). All 9 perception questions but only 4 of the 11 knowledge questions increased significantly after the intervention (Table 3). Significant subgroup differences are reported as online supplementary data.

Table 3. Preintervention and Postintervention Scores of the Unconscious Bias Workshop a

Abbreviation: NS, not significant.

Discussion

This study demonstrates that a 90-minute interactive workshop significantly increased perceptions and knowledge regarding unconscious bias. Implicit bias may contribute to health care disparities by influencing physician behavior, resulting in differences in medical treatment along lines of race, gender, or other characteristics. 1,15 Thus, curricular activities allowing physicians to become aware of their biases may facilitate the provision of patient-centered care.

This intervention can be utilized for residents, other learners, faculty, and coordinators across the medical education continuum. Furthermore, a literature review of implicit bias training revealed reports only on medical student training, with none on GME training. 16–24 The current study adds to the literature by reporting an educational workshop, aimed at all of GME, that detected biases in real time without a formal IAT.

In contrast to previous reports that utilized IATs, this study's participants recorded first impressions after brief exposures to images of real individuals with multiple identities, which highlighted the principle of intersectionality. For example, a lesbian Black woman in African garb (4 oppressed identities) was not identified as a lawyer, while a young Black female astronaut (3 oppressed identities) was identified as an actress. A White male (2 privileged identities) serial killer was trusted by 50% and identified as a professor, while a tattooed and informally dressed White man (2 privileged and 2 oppressed identities) was not recognized as a doctor. These findings confirm inaccurate first-impression stereotypical assumptions based on ethnicity, outward appearances, couture, and media influences, and demonstrate that biases can be detected without relying on a formal IAT and its limitations.

Limitations of this study include the likelihood of participation bias, since only approximately 57% of participants completed the surveys, and selection bias, since participants self-selected. Ethnicity data were not collected. Barriers to implementation include the time needed to identify and train facilitators; institutions and departments would have to prioritize implicit bias training and provide protected time for both faculty and residents. The workshop is relatively inexpensive, acceptable, and feasible, with faculty time commitment as the major cost. Organizing and planning the program requires about 4 hours, and presenting the workshop requires approximately 2 hours.

Conclusion

This study demonstrated that a brief interactive workshop without the IAT can be implemented to increase knowledge and perceptions of unconscious bias.

Supplementary Material

Funding: The author reports no external funding source for this study.

Conflict of interest: The author declares no competing interests.

The abstract was presented at CREOG and APGO Annual Meeting, New Orleans, Louisiana, February 27–March 2, 2019.

Implicit Bias (Unconscious Bias): Definition & Examples

Charlotte Ruhl

Research Assistant & Psychology Graduate

BA (Hons) Psychology, Harvard University

Charlotte Ruhl, a psychology graduate from Harvard College, boasts over six years of research experience in clinical and social psychology. During her tenure at Harvard, she contributed to the Decision Science Lab, administering numerous studies in behavioral economics and social psychology.


Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Implicit bias refers to the beliefs and attitudes that affect our understanding, actions and decisions in an unconscious way.

Take-home Messages

  • Implicit biases are unconscious attitudes and stereotypes that can manifest in the criminal justice system, workplace, school setting, and in the healthcare system.
  • Implicit bias is also known as unconscious bias or implicit social cognition.
  • There are many different examples of implicit biases, ranging from categories of race, gender, and sexuality.
  • These biases often arise from trying to find patterns and navigate the overwhelming stimuli in this complicated world. Culture, media, and upbringing can also contribute to the development of such biases.
  • Removing these biases is a challenge, especially because we often don’t even know they exist, but research reveals potential interventions and provides hope that levels of implicit biases in the United States are decreasing.


The term implicit bias was coined in 1995 by psychologists Mahzarin Banaji and Anthony Greenwald, who argued that social behavior is largely influenced by unconscious associations and judgments (Greenwald & Banaji, 1995).

So, what is implicit bias?

Specifically, implicit bias refers to attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious way, making them difficult to control.

Since the mid-90s, psychologists have extensively researched implicit biases, revealing that, without even knowing it, we all possess our own implicit biases.

System 1 and System 2 Thinking

Kahneman (2011) distinguishes between two types of thinking: system 1 and system 2.
  • System 1 is the brain’s fast, emotional, unconscious thinking mode. This type of thinking requires little effort, but it is often error-prone. Most everyday activities (like driving, talking, and cleaning) rely heavily on system 1.
  • System 2 is slow, logical, effortful, conscious thought, where reason dominates.


Implicit Bias vs. Explicit Bias

What is meant by implicit bias?

Implicit bias (unconscious bias) refers to attitudes and beliefs that lie outside our conscious awareness and control. Implicit biases are an example of system 1 thinking, so we are unaware they exist (Greenwald & Krieger, 2006).

An implicit bias may run counter to a person’s conscious beliefs without the person realizing it. For example, it is possible to express explicit liking of a certain social group or approval of a certain action while simultaneously being biased against that group or action on an unconscious level.

Therefore, implicit and explicit biases might differ for the same person.

It is important to understand that implicit biases can become explicit biases. This occurs when you become consciously aware of your prejudices and beliefs. They surface in your mind, leading you to choose whether to act on or against them.

What is meant by explicit bias?

Explicit biases are biases we are aware of on a conscious level (for example, feeling threatened by another group and delivering hate speech as a result). They are an example of system 2 thinking.

It is also possible that your implicit and explicit biases differ from those of your neighbor, friend, or family member. Many factors can influence how such biases develop.

What Are the Implications of Unconscious Bias?

Implicit biases become evident in many different domains of society. On an interpersonal level, they can manifest in simple daily interactions.

This occurs when certain actions (or microaggressions) make others feel uncomfortable or aware of the specific prejudices you may hold against them.

Implicit Prejudice

Implicit prejudice is the automatic, unconscious attitudes or stereotypes that influence our understanding, actions, and decisions. Unlike explicit prejudice, which is consciously controlled, implicit prejudice can occur even in individuals who consciously reject prejudice and strive for impartiality.

Unconscious racial stereotypes are a major example of implicit prejudice. In other words, this is having an automatic preference for one race over another without being aware of the bias.

This bias can manifest in small interpersonal interactions and has broader implications in society’s legal system and many other important sectors.

Examples may include holding an implicit stereotype that associates Black individuals with violence. As a result, you may cross the street at night when you see a Black man walking in your direction, without even realizing why you are crossing the street.

The action taken here is an example of a microaggression. A microaggression is a subtle, automatic, and often nonverbal exchange that communicates hostile, derogatory, or negative prejudicial slights and insults toward any group (Pierce, 1970). Crossing the street communicates an implicit prejudice, even though you might not be aware of it.

Another example of an implicit racial bias is a teacher complimenting a Latino student for speaking perfect English when he is a native English speaker. Here, the teacher assumed that English would not be his first language simply because he is Latino.

Gender Stereotypes

Gender biases are another common form of implicit bias. Gender biases are the ways in which we judge men and women based on traditional feminine and masculine assigned traits.

For example, a greater assignment of fame to male than female names (Banaji & Greenwald, 1995) reveals a subconscious bias that holds men at a higher level than their female counterparts. Whether you voice the opinion that men are more famous than women is independent of this implicit gender bias.

Another common implicit gender bias regards women in STEM (science, technology, engineering, and mathematics).

In school, girls are more likely to be associated with language over math. In contrast, males are more likely to be associated with math over language (Steffens & Jelenec, 2011), revealing clear gender-related implicit biases that can ultimately go so far as to dictate future career paths.

Even if you outwardly say men and women are equally good at math, it is possible you subconsciously associate math more strongly with men without even being aware of this association.

Health Care

Healthcare is another setting where implicit biases are very present. Racial and ethnic minorities and women are subject to less accurate diagnoses, curtailed treatment options, less pain management, and worse clinical outcomes (Chapman, Kaatz, & Carnes, 2013).

Additionally, Black children are often not treated as children or given the same compassion or level of care provided for White children (Johnson et al., 2017).

It becomes evident that implicit biases infiltrate the most common sectors of society, making it all the more important to question how we can remove these biases.

LGBTQ+ Community Bias

Similar to implicit racial and gender biases, individuals may hold implicit biases against members of the LGBTQ+ community. Again, that does not necessarily mean that these opinions are voiced outwardly or even consciously recognized by the beholder, for that matter.

Rather, these biases are unconscious. A simple example is asking a female friend if she has a boyfriend, which assumes her sexuality and treats heterosexuality as the norm or default.

Instead, you could ask your friend if she is seeing someone. Several other forms of implicit bias, in categories ranging from weight to ethnicity to ability, come into play in our everyday lives.

Legal System

Both law enforcement and the legal system shed light on implicit biases. An example of implicit bias functioning in law enforcement is the shooter bias – the tendency among the police to shoot Black civilians more often than White civilians, even when they are unarmed (Mekawi & Bresin, 2015).

This bias has been repeatedly tested in the laboratory setting, revealing an implicit bias against Black individuals. Black people are also disproportionately arrested and given harsher sentences, and Black juveniles are tried as adults more often than their White peers.

Black boys are also seen as less childlike, less innocent, more culpable, more responsible for their actions, and as more appropriate targets for police violence (Goff et al., 2014).

Together, these unconscious stereotypes, which are not rooted in truth, form an array of implicit biases that are extremely dangerous and utterly unjust.

Workplace

Implicit biases are also visible in the workplace. One experiment that tracked the success of White and Black job applicants found that applicants with stereotypically White names received 50% more callbacks than those with stereotypically Black names, regardless of the industry or occupation (Bertrand & Mullainathan, 2004).

This reveals another form of implicit bias: the hiring bias – Anglicized-named applicants receiving more favorable pre-interview impressions than other ethnic-named applicants (Watson, Appiah, & Thornton, 2011).

We’re susceptible to bias because of these tendencies:

We tend to seek out patterns

A key reason we develop such biases is that our brains have a natural tendency to look for patterns and associations to make sense of a very complicated world.

Research shows that even before kindergarten, children already use their group membership (e.g., racial group, gender group, age group, etc.) to guide inferences about psychological and behavioral traits.

At such a young age, they have already begun seeking patterns and recognizing what distinguishes them from other groups (Baron, Dunham, Banaji, & Carey, 2014).

And not only do children recognize what sets them apart from other groups, they believe “what is similar to me is good, and what is different from me is bad” (Cameron, Alvarez, Ruble, & Fuligni, 2001).

Children aren’t just noticing how similar or dissimilar they are to others; dissimilar people are actively disliked (Aboud, 1988).

Recognizing what sets you apart from others and then forming negative opinions about those outgroups (a social group with which an individual does not identify) contributes to the development of implicit biases.

We like to take shortcuts

Another explanation is that the development of these biases is a result of the brain’s tendency to try to simplify the world.

Mental shortcuts make it faster and easier for the brain to sort through all of the overwhelming data and stimuli we are met with every second of the day. And we take mental shortcuts all the time. Rules of thumb, educated guesses, and using “common sense” are all forms of mental shortcuts.

Implicit bias is a result of taking one of these cognitive shortcuts inaccurately (Rynders, 2019). As a result, we incorrectly rely on these unconscious stereotypes to provide guidance in a very complex world.

And especially when we are under high levels of stress, we are more likely to rely on these biases than to examine all of the relevant, surrounding information (Wigboldus, Sherman, Franzese, & Knippenberg, 2004).

Social and Cultural influences

Influences from media, culture, and your individual upbringing can also contribute to the rise of implicit associations that people form about the members of social outgroups. Media has become increasingly accessible, and while that has many benefits, it can also lead to implicit biases.

The way TV portrays individuals or the language journal articles use can ingrain specific biases in our minds.

For example, they can lead us to associate Black people with crime or women with nursing or teaching. The way you are raised can also play a huge role. One research study found that parental racial attitudes can influence children’s implicit prejudice (Sinclair, Dunn, & Lowery, 2005).

And parents are not the only figures who can influence such attitudes. Siblings, the school setting, and the culture in which you grow up can also shape your explicit beliefs and implicit biases.

Implicit Association Test (IAT)

What sets implicit biases apart from other forms is that they are subconscious – we don’t know if we have them.

However, researchers have developed the Implicit Association Test (IAT) tool to help reveal such biases.

The Implicit Association Test (IAT) is a psychological assessment that measures an individual’s unconscious biases and associations. The test measures how quickly a person associates concepts or groups (such as race or gender) with positive or negative attributes, revealing biases that may not be consciously acknowledged.

The IAT requires participants to categorize negative and positive words together with either images or words (Greenwald, McGhee, & Schwartz, 1998).

Tests are taken online and must be performed as quickly as possible; the faster you categorize certain words or faces of a category, the stronger the bias you hold about that category.

For example, the Race IAT requires participants to categorize White faces and Black faces together with negative and positive words. The relative speed of associating Black faces with negative words is used as an indication of the level of anti-Black bias.
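
To make the scoring idea concrete, here is a heavily simplified sketch of the IAT’s D score: the difference in mean response latency between the incongruent and congruent blocks, divided by the pooled standard deviation. The latencies are invented for illustration, and this is not Project Implicit’s production algorithm, which also trims extreme latencies and penalizes error trials.

```python
# Simplified IAT D-score sketch. A larger positive D means responses were
# slower in the incongruent block, suggesting a stronger implicit association.
# Latencies are hypothetical, in milliseconds.
import numpy as np

congruent = np.array([612, 655, 590, 701, 640, 618])    # e.g., White + positive pairing
incongruent = np.array([734, 802, 765, 690, 811, 748])  # e.g., Black + positive pairing

pooled_sd = np.std(np.concatenate([congruent, incongruent]), ddof=1)
d_score = (incongruent.mean() - congruent.mean()) / pooled_sd

# Conventional rough cutoffs: ~0.15 slight, ~0.35 moderate, ~0.65 strong.
print(f"D = {d_score:.2f}")
```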


Professor Brian Nosek and colleagues tested more than 700,000 subjects. They found that more than 70% of White subjects more easily associated White faces with positive words and Black faces with negative words, concluding that this was evidence of implicit racial bias (Nosek, Greenwald, & Banaji, 2007).

Outside of lab testing, it is very difficult to know if we do, in fact, possess these biases. The fact that they are so hard to detect is in the very nature of this form of bias, making them very dangerous in various real-world settings.

How to Reduce Implicit Bias

Because of the harmful nature of implicit biases, it is critical to examine how we can begin to remove them.

Practicing mindfulness is one potential way, as it reduces the stress and cognitive load that otherwise leads to relying on such biases.

A 2016 study found that brief meditation decreased unconscious bias against Black people and elderly people (Lueke & Gibson, 2016), providing initial insight into the usefulness of this approach and paving the way for future research on this intervention.

Adjust your perspective

Another method is perspective-taking – looking beyond your own point of view so that you can consider how someone else may think or feel about something.

Researcher Belinda Gutierrez implemented a videogame called “Fair Play,” in which players assume the role of a Black graduate student named Jamal Davis.

As Jamal, players experience subtle race bias while completing “quests” to obtain a science degree.

Gutierrez hypothesized that participants who were randomly assigned to play the game would have greater empathy for Jamal and lower implicit race bias than participants randomized to read a narrative text (not perspective-taking) describing Jamal’s experience (Gutierrez, 2014). Her hypothesis was supported, illustrating the benefits of perspective-taking in increasing empathy toward outgroup members.

Specific implicit bias training has been incorporated in different educational and law enforcement settings. Research has found that diversity training improved attitudes toward women in STEM among the men who received it (Jackson, Hillard, & Schneider, 2014).

Training programs designed to target and help overcome implicit biases may also be beneficial for police officers (Plant & Peruche, 2005), but there is not enough conclusive evidence to completely support this claim. One pitfall of such training is a potential rebound effect.

Actively trying to inhibit stereotyping can actually cause the bias to rebound, eventually increasing more than if it had not been suppressed in the first place (Macrae, Bodenhausen, Milne, & Jetten, 1994). This is very similar to the white bear problem discussed in many psychology curricula.

This concept refers to the psychological process whereby deliberate attempts to suppress certain thoughts make them more likely to surface (Wegner & Schneider, 2003).

Education is crucial. Understanding what implicit biases are, how they can arise, and how to recognize them in yourself and others are all incredibly important in working toward overcoming such biases.

Learning about other cultures or outgroups and what language and behaviors may come off as offensive is critical as well. Education is a powerful tool that can extend beyond the classroom through books, media, and conversations.

On the bright side, implicit biases in the United States have been improving.

From 2007 to 2016, implicit biases have changed towards neutrality for sexual orientation, race, and skin-tone attitudes (Charlesworth & Banaji, 2019), demonstrating that it is possible to overcome these biases.

Books for further reading

As mentioned, education is extremely important. Here are a few places to get started in learning more about implicit biases:

  • Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer Eberhardt
  • Blindspot by Anthony Greenwald and Mahzarin Banaji
  • Implicit Racial Bias Across the Law by Justin Levinson and Robert Smith


Is unconscious bias the same as implicit bias?

Yes, unconscious bias is the same as implicit bias. Both terms refer to the biases we carry without awareness or conscious control, which can affect our attitudes and actions toward others.

In what ways can implicit bias impact our interactions with others?

Implicit bias can impact our interactions with others by unconsciously influencing our attitudes, behaviors, and decisions. This can lead to stereotyping, prejudice, and discrimination, even when we consciously believe in equality and fairness.

It can affect various domains of life, including workplace dynamics, healthcare provision, law enforcement, and everyday social interactions.

What are some implicit bias examples?

Some examples of implicit biases include assuming a woman is less competent than a man in a leadership role, associating certain ethnicities with criminal behavior, or believing that older people are not technologically savvy.

Other examples include perceiving individuals with disabilities as less capable or assuming that someone who is overweight is lazy or unmotivated.

References

Aboud, F. E. (1988). Children and prejudice. B. Blackwell.

Banaji, M. R., & Greenwald, A. G. (1995). Implicit gender stereotyping in judgments of fame. Journal of Personality and Social Psychology, 68(2), 181.

Baron, A. S., Dunham, Y., Banaji, M., & Carey, S. (2014). Constraints on the acquisition of social category concepts. Journal of Cognition and Development, 15(2), 238-268.

Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American Economic Review, 94(4), 991-1013.

Cameron, J. A., Alvarez, J. M., Ruble, D. N., & Fuligni, A. J. (2001). Children’s lay theories about ingroups and outgroups: Reconceptualizing research on prejudice. Personality and Social Psychology Review, 5(2), 118-128.

Chapman, E. N., Kaatz, A., & Carnes, M. (2013). Physicians and implicit bias: How doctors may unwittingly perpetuate health care disparities. Journal of General Internal Medicine, 28(11), 1504-1510.

Charlesworth, T. E., & Banaji, M. R. (2019). Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychological Science, 30(2), 174-192.

Goff, P. A., Jackson, M. C., Di Leone, B. A. L., Culotta, C. M., & DiTomasso, N. A. (2014). The essence of innocence: Consequences of dehumanizing Black children. Journal of Personality and Social Psychology, 106(4), 526.

Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: Attitudes, self-esteem, and stereotypes. Psychological Review, 102(1), 4.

Greenwald, A. G., McGhee, D. E., & Schwartz, J. L. (1998). Measuring individual differences in implicit cognition: The implicit association test. Journal of Personality and Social Psychology, 74(6), 1464.

Greenwald, A. G., & Krieger, L. H. (2006). Implicit bias: Scientific foundations. California Law Review, 94(4), 945-967.

Gutierrez, B., Kaatz, A., Chu, S., Ramirez, D., Samson-Samuel, C., & Carnes, M. (2014). “Fair Play”: A videogame designed to address implicit race bias through active perspective taking. Games for Health Journal, 3(6), 371-378.

Jackson, S. M., Hillard, A. L., & Schneider, T. R. (2014). Using implicit bias training to improve attitudes toward women in STEM. Social Psychology of Education, 17(3), 419-438.

Johnson, T. J., Winger, D. G., Hickey, R. W., Switzer, G. E., Miller, E., Nguyen, M. B., … & Hausmann, L. R. (2017). Comparison of physician implicit racial bias toward adults versus children. Academic Pediatrics, 17(2), 120-126.

Kahneman, D. (2011). Thinking, fast and slow. Macmillan.

Lueke, A., & Gibson, B. (2016). Brief mindfulness meditation reduces discrimination. Psychology of Consciousness: Theory, Research, and Practice, 3(1), 34.

Macrae, C. N., Bodenhausen, G. V., Milne, A. B., & Jetten, J. (1994). Out of mind but back in sight: Stereotypes on the rebound. Journal of Personality and Social Psychology, 67(5), 808.

Mekawi, Y., & Bresin, K. (2015). Is the evidence from racial bias shooting task studies a smoking gun? Results from a meta-analysis. Journal of Experimental Social Psychology, 61, 120-130.

Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic processes in social thinking and behavior, 4, 265-292.

Pierce, C. (1970). Offensive mechanisms. The Black Seventies, 265-282.

Plant, E. A., & Peruche, B. M. (2005). The consequences of race for police officers’ responses to criminal suspects. Psychological Science, 16(3), 180-183.

Rynders, D. (2019). Battling implicit bias in the IDEA to advocate for African American students with disabilities. Touro Law Review, 35, 461.

Sinclair, S., Dunn, E., & Lowery, B. (2005). The relationship between parental racial attitudes and children’s implicit prejudice. Journal of Experimental Social Psychology, 41(3), 283-289.

Steffens, M. C., & Jelenec, P. (2011). Separating implicit gender stereotypes regarding math and language: Implicit ability stereotypes are self-serving for boys and men, but not for girls and women. Sex Roles, 64(5-6), 324-335.

Watson, S., Appiah, O., & Thornton, C. G. (2011). The effect of name on pre-interview impressions and occupational stereotypes: The case of black sales job applicants. Journal of Applied Social Psychology, 41(10), 2405-2420.

Wegner, D. M., & Schneider, D. J. (2003). The white bear story. Psychological Inquiry, 14(3-4), 326-329.

Wigboldus, D. H., Sherman, J. W., Franzese, H. L., & Knippenberg, A. V. (2004). Capacity and comprehension: Spontaneous stereotyping under cognitive load. Social Cognition, 22(3), 292-309.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test) from Harvard University
  • Implicit Association Test from the Social Psychology Network
  • Test Yourself for Hidden Bias from Teaching Tolerance
  • How The Concept Of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes a transcript)
  • Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes a transcript)
  • Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59 minutes)
  • Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
  • Bias, Black Lives and Academic Medicine, Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
  • Uncovering Hidden Biases, Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System (9:14 minutes)
  • Students Speak Up: What Bias Means to Them (2:17 minutes)
  • Weight Bias in Health Care, from Yale University (16:56 minutes)
  • Gender and Racial Bias In Facial Recognition Technology (4:43 minutes)

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27-59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic processes in social thinking and behavior, 4, 265-292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105(12), e60-e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22(6), 882-887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49(4), 210-227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28(3), 160-174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122(2), 670-688.


Robert Evans Wilson Jr.

Cognitive Bias Is the Loose Screw in Critical Thinking

Recognizing your biases enhances understanding and communication.

Posted May 17, 2021 | Reviewed by Jessica Schrader

  • People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality.
  • Cognitive biases are mental shortcuts people take in order to process the mass of information they receive daily.
  • Cognitive biases include confirmation bias, anchoring bias, bandwagon effect, and negativity bias.

When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.

When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90% of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors: 11 of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I. I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.

It's Likely You're Unaware of These Habits

Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you replied “yes” to any of these, then you may be guilty of relying on a cognitive bias.

In my last post, I wrote about the importance of critical thinking, and how in today’s information age, no one has an excuse for living in ignorance. Since then, I recalled a huge impediment to critical thinking: cognitive bias. We are all culpable of leaning on these mental crutches, even though we don’t do it intentionally.

What Are Cognitive Biases?

The Cambridge English Dictionary defines cognitive bias as the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.

PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.

PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”

And, according to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.

In brief, a cognitive bias is a shortcut to thinking. And, it’s completely understandable; the onslaught of information that we are exposed to every day necessitates some kind of time-saving method. It is simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they are told. Nevertheless, as understandable as depending on biases may be, it is still a severe deterrent to critical thinking.

Here's What to Watch Out For

Wikipedia lists 197 different cognitive biases. I am going to share with you a few of the more common ones so that in the future, you will be aware of the ones you may be using.

Confirmation bias is when you prefer media and information sources that align with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports their knowledge set. Exposing oneself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my post: Surprise: Creativity Is a Skill, Not a Gift!).

Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my post: Sometimes You Have to Rip the Cover Off). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will be the same in some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.


Hindsight bias is the inclination to see some events as more predictable than they are; it is also known as the “I knew it all along” reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.

Misinformation effect is when your memories of an event become affected or influenced by information you receive after the event occurred. Researchers have shown that memory is inaccurate because it is vulnerable to revision when you receive new information.

Actor-observer bias is when you attribute your actions to external influences and other people's actions to internal ones. You might think you missed a business opportunity because your car broke down, but your colleague failed to get a promotion because of incompetence.

False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.

Availability bias occurs when you believe the information you possess is more important than it actually is. This happens when you watch or listen to media news sources that tend to run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume they occur with greater frequency than they actually do.

Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them as well. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging with others, and fear rejection if they hold opposing beliefs. Most people will not think through an opinion and will assume it is correct because so many others agree with it.

Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and made the right conclusion. And, because they are an authority in their field, you grant more credibility to their viewpoint than you would for anyone else. This is especially true in medicine where experts are frequently seen as infallible. An example would be an advertiser showing a doctor, wearing a lab coat, touting their product.

Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days, when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias is not as necessary (see my post: Fear: Lifesaver or Manipulator).

Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.

Understand More and Communicate Better

Learning these biases, and being on the alert for them when you make a decision to accept a belief or opinion, will help you become more effective at critical thinking.

Source: Cognitive Bias Codex by John Manoogian III/Wikimedia Commons

Robert Wilson is a writer and humorist based in Atlanta, Georgia.


Taking steps to recognize and correct unconscious assumptions toward groups can promote health equity.

JENNIFER EDGOOSE, MD, MPH, MICHELLE QUIOGUE, MD, FAAFP, AND KARTIK SIDHAR, MD

Fam Pract Manag. 2019;26(4):29-33

Author disclosures: no relevant financial affiliations disclosed.


Jamie is a 38-year-old woman and the attending physician on a busy inpatient teaching service. On rounds, she notices several patients tending to look at the male medical student when asking a question and seeming to disregard her.

Alex is a 55-year-old black man who has a history of diabetic polyneuropathy with significant neuropathic pain. His last A1C was 7.8. He reports worsening lower extremity pain and is frustrated that, despite his bringing this up repeatedly to different clinicians, no one has addressed it. Alex has been on gabapentin 100 mg before bed for 18 months without change, and his physicians haven't increased or changed his medication to help with pain relief.

Alisha is a 27-year-old Asian family medicine resident who overhears labor and delivery nurses and the attending complain that Indian women are resistant to cervical exams.

These scenarios reflect the unconscious assumptions that pervade our everyday lives, not only as practicing clinicians but also as private citizens. Some of Jamie's patients assume the male member of the team is the attending physician. Alex's physicians perceive him to be a “drug-seeking” patient and miss opportunities to improve his care. Alisha is exposed to stereotypes about a particular ethnic group.

Although assumptions like these may not be directly ill-intentioned, they can have serious consequences. In medical practice, these unconscious beliefs and stereotypes influence medical decision-making. In the classic Institute of Medicine report “Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care,” the authors concluded that “bias, stereotyping, and clinical uncertainty on the part of health care providers may contribute to racial and ethnic disparities in health care” often despite providers' best intentions. 1 For example, studies show that discrimination and bias at both the individual and institutional levels contribute to shocking disparities for African-American patients in terms of receiving certain procedures less often or experiencing much higher infant mortality rates when compared with non-Hispanic whites. 2 , 3 As racial and ethnic diversity increases across our nation, it is imperative that we as physicians intentionally confront and find ways to mitigate our biases.

Implicit bias is the unconscious collection of stereotypes and attitudes that we develop toward certain groups of people, which can affect our patient relationships and care decisions.

You can overcome implicit bias by first discovering your blind spots and then actively working to dismiss stereotypes and attitudes that affect your interactions.

While individual action is helpful, organizations and institutions must also work to eliminate systemic problems.

DEFINING AND REDUCING IMPLICIT BIAS

For the last 30 years, science has demonstrated that automatic cognitive processes shape human behavior, beliefs, and attitudes. Implicit or unconscious bias derives from our ability to rapidly find patterns in small bits of information. Some of these patterns emerge from positive or negative attitudes and stereotypes that we develop about certain groups of people and form outside our own consciousness from a very young age. Although such cognitive processes help us efficiently sort and filter our perceptions, these reflexive biases also promote inconsistent decision making and, at worst, systematic errors in judgment.

Cognitive processes lead us to associate unconscious attributes with social identities. The literature explores how this influences our views on race, ethnicity, age, gender, sexual orientation, and weight, and studies show many people are biased in favor of people who are white, young, male, heterosexual, and thin. 4 Unconsciously, we not only learn to associate certain attributes with certain social groupings (e.g., men with strength, women with nurturing) but also develop preferential ranking of such groups (e.g., preference for whites over blacks). This unconscious grouping and ranking takes root early in development and is shaped by many outside factors such as media messages, institutional policies, and family beliefs. Studies show that health care professionals have the same level of implicit bias as the general population and that higher levels are associated with lower quality care. 5 Providers with higher levels of bias are more likely to demonstrate unequal treatment recommendations, disparities in pain management, and even lack of empathy toward minority patients. 6 In addition, stressful, time-pressured, and overloaded clinical practices can actually exacerbate unconscious negative attitudes. Although the potential impact of our biases can feel overwhelming, research demonstrates that these biases are malleable and can be overcome by conscious mitigation strategies. 7

We recommend three overarching strategies to mitigate implicit bias – educate, expose, and approach – which we will discuss in greater detail. We have further broken down these strategies into eight evidence-based tactics you can incorporate into any quality improvement project, diagnostic dilemma, or new patient encounter. Together, these eight tactics spell out the mnemonic IMPLICIT. (See “Strategies to combat our implicit biases.”)

EDUCATE

When we fail to learn about our blind spots, we miss opportunities to avoid harm. Educating ourselves about the reflexive cognitive processes that unconsciously affect our clinical decisions is the first step. The following tactics can help:

Introspection. It is not enough to just acknowledge that implicit bias exists. As clinicians, we must directly confront and explore our own personal implicit biases. As the writer Anais Nin is often credited with saying, “We don't see things as they are, we see them as we are.” To shed light on your potential blind spots and unconscious “sorting protocols,” we encourage you to take one or more implicit association tests. Discovering a moderate to strong bias in favor of or against certain social identities can help you begin this critical step in self-exploration and understanding. 8 You can also complete this activity with your clinic staff and fellow physicians to uncover implicit biases as a group and set the stage for addressing them. For instance, many of us may be surprised to learn after taking an implicit association test that we follow the typical bias of associating males with science — an awareness that may explain why the patient in our first case example addressed questions to the male medical student instead of the female attending.

Mindfulness. It should come as no surprise that we are more likely to use cognitive shortcuts inappropriately when we are under pressure. Evidence suggests that increasing mindfulness improves our coping ability and modifies biological reactions that influence attention, emotional regulation, and habit formation. 9 There are many ways to increase mindfulness, including meditation, yoga, or listening to inspirational texts. In one study, individuals who listened to a 10-minute meditative audiotape that focused them and made them more aware of their sensations and thoughts in a nonjudgmental way relied less on instinct and showed less implicit bias against black people and the aged. 10

EXPOSE

It is also helpful to expose ourselves to counter-stereotypes and to focus on the unique individuals we interact with. Similarity bias is the tendency to favor ourselves and those like us. When our brains label someone as being within our same group, we empathize better and use our actions, words, and body language to signal this relatedness. Experience bias can lead us to overestimate how much others see things the same way we do, to believe that we are less vulnerable to bias than others, and to assume that our intentions are clear and obvious to others. Gaining exposure to other groups and ways of thinking can mitigate both of these types of bias. The following tactics can help:

Perspective-taking. This tactic involves taking the first-person perspective of a member of a stereotyped group, which can increase psychological closeness to that group. 8 Reading novels, watching documentaries, and listening to podcasts are accessible ways to reach beyond our comfort zone. To authentically perceive another person's perspective, however, you should engage in positive interactions with stereotyped group members in real life. Increased face-to-face contact with people who seem different from you on the surface undermines implicit bias.

Learn to slow down. To recognize our reflexive biases, we must pause and think. For example, the next time you interact with someone in a stereotyped group or observe societal stereotyping, such as through the media, recognize what responses are based on stereotypes, label those responses as stereotypical, and reflect on why the responses occurred. You might then consider how the biased response could be avoided in the future and replace it with an unbiased response. The physician treating Alex in the introduction could use this technique by slowing down and reassessing his medical care. By acknowledging the potential for bias, the physician may recognize that safe options remain for managing Alex's neuropathic pain.

Additionally, research strongly supports the use of counter-stereotypic imaging to replace automatic responses. 11 For example, when seeking to contradict a prevailing stereotype, substitute highly defined images, which can be abstract (e.g., modern Native Americans), famous (e.g., minority celebrities like Oprah Winfrey or Lin-Manuel Miranda), or personal (e.g., your child's teacher). As positive exemplars become more salient in your mind, they become cognitively accessible and challenge your stereotypic biases.

Individuation. This tactic relies on gathering specific information about the person interacting with you to prevent group-based stereotypic inferences. Family physicians are trained to build and maintain relationships with each individual patient under their care. Our own social identities intersect with multiple social groupings, for example, related to sexual orientation, ethnicity, and gender. Within these multiplicities, we can find shared identities that bring us closer to people, including shared experiences (e.g., parenting), common interests (e.g., sports teams), or mutual purpose (e.g., surviving cancer). Individuation could have helped the health care workers in Alisha's labor and delivery unit to avoid making judgments based on stereotypes. We can use this tactic to help inform clinical decisions by using what we know about a person's specific, individual, and unique attributes. 11

Like any habit, it is difficult to change biased behaviors with a “one shot” educational approach or awareness campaign. Taking a systematic approach at both the individual and institutional levels, and incorporating a continuous process of improvement, practice, and reflection, is critical to improving health equity.

Check your messaging. Using very specific messages designed to create a more inclusive environment and mitigate implicit bias can make a real difference. As opposed to claiming “we don't see color” or using other colorblind messaging, statements that welcome and embrace multiculturalism can have more success at decreasing racial bias.

Institutionalize fairness. Organizations have a responsibility to support a culture of diversity and inclusion because individual action is not enough to deconstruct systemic inequities. To overcome implicit bias throughout an organization, consider implementing an equity lens – a checklist that helps you consider your blind spots and biases and ensures that great ideas and interventions are not only effective but also equitable (an example is included in the table above). Another example would be to find opportunities to display images in your clinic's waiting room that counter stereotypes. You could also survey your institution to make sure it is embracing multicultural (and not colorblind) messaging.

Take two. Resisting implicit bias is lifelong work. The strategies introduced here require constant revision and reflection as you work toward cultural humility. Examining your own assumptions is just a starting point. Talking about implicit bias can trigger conflict, doubt, fear, and defensiveness. It can feel threatening to acknowledge that you participate in and benefit from systems that work better for some than others. This kind of work can mean taking a close look at the relationships you have and the institutions of which you are a part.

MOVING FORWARD

Education, exposure, and a systematic approach to understanding implicit bias may bring us closer to our aspirational goal to care for all our patients in the best possible way and move us toward a path of achieving health equity throughout the communities we serve. The mnemonic IMPLICIT can help us to remember the eight tactics we all need to practice. While disparities in social determinants of health are often beyond the control of an individual physician, we can still lead the fight for health equity for our own patients, both from within and outside the walls of health care. With our specialty-defining goal of getting to know each patient as a unique individual in the context of his or her community, family physicians are well suited to lead inclusively by being humble, respecting the dignity of each person, and expressing appreciation for how hard everyone works to overcome bias.

1. Smedley BD, Stith AY, Nelson AR, eds. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: Institute of Medicine, National Academy Press; 2003.

2. Hannan EL, van Ryn M, Burke J, et al. Access to coronary artery bypass surgery by race/ethnicity and gender among patients who are appropriate for surgery. Med Care. 1999;37(1):68-77.

3. Infant mortality and African Americans. U.S. Department of Health and Human Services Office of Minority Health website. https://minorityhealth.hhs.gov/omh/browse.aspx?lvl=4&lvlid=23. Updated Nov. 9, 2017. Accessed June 10, 2019.

4. Nosek BA, Smyth FL, Hansen JJ, et al. Pervasiveness and correlates of implicit attitudes and stereotypes. Eur Rev Soc Psychol. 2007;18(1):36-88.

5. FitzGerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics. 2017;18(1):19.

6. Maina IW, Belton TD, Ginzberg S, Singh A, Johnson TJ. A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test. Soc Sci Med. 2018;199:219-229.

7. Charlesworth TES, Banaji MR. Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychol Sci. 2019;30(2):174-192.

8. Sukhera J, Wodzinski M, Teunissen PW, Lingard L, Watling C. Striving while accepting: exploring the relationship between identity and implicit bias recognition and management. Acad Med. 2018;93(11S):S82-S88.

9. Burgess DJ, Beach MC, Saha S. Mindfulness practice: a promising approach to reducing the effects of clinician implicit bias on patients. Patient Educ Couns. 2017;100(2):372-376.

10. Lueke A, Gibson B. Mindfulness meditation reduces implicit age and race bias: the role of reduced automaticity of responding. Soc Psychol Personal Sci. 2015;6(3):284-291.

11. Devine PG, Forscher PS, Austin AJ, Cox WTL. Long-term reduction in implicit race bias: a prejudice habit-breaking intervention. J Exp Soc Psychol. 2012;48(6):1267-1278.


2.2 Overcoming Cognitive Biases and Engaging in Critical Reflection

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions, and to take a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

Connections

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose you hear the patter of raindrops on your roof or window and conclude that it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
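
To make the falsification point concrete, here is a minimal Python sketch of a Wason-style task. The hidden rule, the subject's “doubling” hypothesis, and the test triples are illustrative choices, not Wason's exact materials: a positive-test strategy never exposes the error, while a single disconfirming probe does.

```python
# Illustrative Wason-style task: the experimenter's hidden rule accepts any
# strictly ascending triple, but a subject who guesses "each number doubles"
# and tests only confirming examples never discovers the mistake.

def hidden_rule(triple):
    """The experimenter's actual rule: strictly ascending numbers."""
    a, b, c = triple
    return a < b < c

def doubling_hypothesis(triple):
    """The subject's guess after seeing a triple like (2, 4, 8)."""
    a, b, c = triple
    return b == 2 * a and c == 2 * b

# Positive-test strategy: only try triples the hypothesis already predicts.
for t in [(1, 2, 4), (3, 6, 12), (5, 10, 20)]:
    assert doubling_hypothesis(t) and hidden_rule(t)  # every test "confirms"

# Falsification: deliberately try a triple the hypothesis says should fail.
probe = (1, 2, 3)
print(doubling_hypothesis(probe))  # False -- hypothesis predicts rejection
print(hidden_rule(probe))          # True  -- but the rule accepts it
# The mismatch shows the "doubling" hypothesis was too narrow.
```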

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let's say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.
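
As a rough, illustrative simulation (not a model from Tversky and Kahneman's paper), anchoring can be mimicked by pulling each subject's noisy private guess partway toward the anchor. The anchor weight and “true value” below are arbitrary assumptions, chosen only to reproduce the direction of the effect.

```python
# Toy anchoring model: estimate = weighted blend of a private guess and the
# (irrelevant) anchor. The 0.4 weight and true value of 54 are assumptions.
import random
import statistics

random.seed(0)

def anchored_estimate(anchor, true_value=54, anchor_weight=0.4):
    private_guess = random.gauss(true_value, 15)  # noisy independent guess
    return (1 - anchor_weight) * private_guess + anchor_weight * anchor

for anchor in (10, 65):
    estimates = [anchored_estimate(anchor) for _ in range(1000)]
    print(f"anchor {anchor}: median estimate {statistics.median(estimates):.0f}")
# The low-anchor group's median lands well below the high-anchor group's,
# mirroring the direction of the 25-versus-45 medians reported above.
```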

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn't make it true.

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is the tendency to attach greater value to things you have already invested resources in than those things are worth today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don't want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
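
A quick simulation (a generic illustration, not drawn from the text) makes the independence point vivid: even immediately after a run of five heads, the next flip is still heads about half the time.

```python
# Simulate fair coin flips and check the outcome that follows a streak of
# five heads; independence means the streak carries no predictive power.
import random

random.seed(1)
flips = [random.choice("HT") for _ in range(200_000)]

after_streak = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if flips[i:i + 5] == list("HHHHH")
]
print(sum(f == "H" for f in after_streak) / len(after_streak))  # ~0.5
```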

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Think Like a Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?



How unconscious bias shapes your thinking (and what you can do about it)

Most Australians believe in values such as fairness and equality – but these can be harder to act on than you would think.

All applicants for permanent visas in Australia sign off on values that include “freedom and dignity of the individual”, “equality of men and women and a spirit of egalitarianism that embraces mutual respect, tolerance, fair play and compassion for those in need”, and “equality of opportunity for individuals, regardless of their race, religion or ethnic background”.

Yet women are structurally paid less than men; research shows that people with Chinese names must send in 68 per cent more CVs than Anglo-Saxon applicants to land a job interview (people with Middle Eastern names must send 64 per cent more); and Aboriginal and LGBTIQ community members are at least twice as likely to end their lives as people in the wider community.

To better understand these statistics we need to talk about “unconscious biases”.

Snap judgments

Unconscious biases are thought patterns; mental shortcuts. Everybody has them. We learn these tendencies over our lifetime because they help us.

When walking home from the train on a dark, rainy night, our past experiences come together as we judge the appearance of an approaching stranger in a snap instant. Can we trust them? Did you picture the stranger as a man?

This is a case in point. We can do a complex activity, like riding a bike, without consciously thinking about it. In a very similar way, biases help us navigate a complex social world.

Unfortunately, biases also have negative effects. We make snap judgments about others all the time: on the street, online, or when interviewing for a job. We use stereotypes to judge people from other groups.

If you feel uncomfortable reading this, you are experiencing what it feels like to be confronted with your own biases. Don’t run away – embrace them and commit to keeping them in check.

Dr Tim Soutphommasane, Australia’s Race Discrimination Commissioner, speaks about how Asians are generally seen as inoffensive, diligent, and productive.

Yet these traits are too easily interpreted as passivity, acquiescence and subservience, for example when promotion time comes around. It is also too easy to only socialise with people who are similar to us – those with the same interests, language, and problems.

Take Facebook: who does it recommend as your possible friends? Which pages does it suggest you might “like”? Research shows that we don’t know “different others” as well, that we don’t trust and respect them as much.

What do you really know about your neighbours, colleagues, homeless people, or the Aboriginal people of the Kimberley? And what do you assume? We all are victims and perpetrators of these unconscious biases.


Bias checklist

So what can we do to keep our biases in check?

1. Check your distance.

Distance is everything that stands between you and others. Perhaps you are really close to your partner or immediate family but further removed from relatives, colleagues, or the refugees at Manus Island.

Languages, technology, country borders, and generations (age) all create distance and make interaction harder. Do you have cliques at work of people from the same university, expertise, or nationality? Do you reach out beyond your cliques’ boundaries?

Go see your relatives, ask how your colleagues are, read newspapers, surf beyond what Facebook recommends. Be curious and get to know what you are now only assuming.

2. Check yourself.

Discover your biases by taking Harvard University's Implicit Association Tests. Several of these tests can help you identify your biases related to skin tone, religion, age, weight, sexuality, disability and many more.

As you start becoming aware of your biases, you might catch yourself acting on them. Embarrassment and shame are common with this realisation, but remember that everyone has biases. Apologise, rephrase, and move on.

Keep challenging your stereotypes. If you are biased against considering women for leadership positions, find female role models who challenge that bias or put yourself in an applicant’s shoes for a moment.

3. Check others.

Biases are everywhere, in your family, work team, even in your book club. We need to call others on their biases without embarrassing them or yourself.

Primarily, raise awareness about biases. Many organisations have bias trainings; Google makes the company’s training publicly available.

Once aware, everyone can commit to checking each other’s biases. In teams, for example, members can gently knock on the table to call out bias in a meeting without disrupting.

Simple processes can help teams and organisations alike. Company policies around the use of unbiased language in job advertisements, blind CVs, and structured interviews are the first steps to unbiased selection and hiring practices.
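
As a concrete sketch of the “blind CVs” step, an organisation might strip identifying fields before reviewers ever see an application. The field names and the candidate record below are hypothetical, not drawn from any particular system.

```python
# Minimal "blind CV" screening sketch: remove fields that can trigger
# affinity bias so reviewers score only job-relevant information.

IDENTIFYING_FIELDS = {"name", "age", "gender", "photo_url", "nationality"}

def blind_application(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Wei Zhang",
    "age": 41,
    "gender": "F",
    "education": "BCom, CPA",
    "years_experience": 12,
    "skills": ["financial reporting", "audit", "team leadership"],
}
print(blind_application(candidate))
# {'education': 'BCom, CPA', 'years_experience': 12, 'skills': [...]}
```

Pairing blinded records with a structured interview, scored against the same rubric for every candidate, covers the remaining steps the paragraph above recommends.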


Sweet serenity – for some

For many people living in Australia, the serenity is pretty sweet. We are lucky to live in a country that has the opportunity to deliver comfort and prosperity for all its residents.

Yet every day, women, the elderly, gay people, immigrants, Aboriginal people, the homeless, refugees, and other groups are excluded from larger society as a result of unconscious biases.

We can’t get rid of bias, but if we carefully check our own and others’ biases, will all Australians one day experience the serenity?

Published on 24 Nov 2017




Understanding unconscious bias

Barry Oulton

BDJ In Practice volume 34, pages 26–27 (2021)


What if you could improve the way you communicate with your patients, peers and team by recognising your unconscious biases and challenging them? Dr Barry Oulton explores.

Most of us don't like to think of ourselves as discriminating against others or having biases towards or against certain groups, but unconscious bias or 'implicit bias', as it is also called, is innate to human nature.


Author information

Authors and affiliations.

The Confident Dentist, Vine Cottage, Hindhead, UK

Barry Oulton


About this article

Cite this article.

Oulton, B. Understanding unconscious bias. BDJ In Pract 34, 26–27 (2021). https://doi.org/10.1038/s41404-021-0685-8

Published: 08 March 2021

Issue Date: March 2021

DOI: https://doi.org/10.1038/s41404-021-0685-8



Unconscious Bias Training That Works

  • Francesca Gino
  • Katherine Coffman


To become more diverse, equitable, and inclusive, many companies have turned to unconscious bias (UB) training. By raising awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character, it strives to make hiring and promotion fairer and improve interactions with customers and among colleagues. But most UB training is ineffective, research shows. The problem is, increasing awareness is not enough—and can even backfire—because sending the message that bias is involuntary and widespread may make it seem unavoidable.

UB training that gets results, in contrast, teaches attendees to manage their biases, practice new behaviors, and track their progress. It gives them information that contradicts stereotypes and allows them to connect with colleagues whose experiences are different from theirs. And it’s not a onetime session; it entails a longer journey and structural organizational changes.

In this article the authors describe how rigorous UB programs at Microsoft, Starbucks, and other organizations help employees overcome denial and act on their awareness, develop the empathy that combats bias, diversify their networks, and commit to improvement.

Increasing awareness isn’t enough. Teach people to manage their biases, change their behavior, and track their progress.

Idea in Brief

The Problem

Conventional training to combat unconscious bias and make the workplace more diverse, equitable, and inclusive isn’t working.

This training aims to raise employees’ awareness of biases based on race or gender. But by also sending the message that such biases are involuntary and widespread, it can make people feel that they’re unavoidable.

The Solution

Companies must go beyond raising awareness and teach people to manage biases and change behavior. Firms should also collect data on diversity, employees’ perceptions, and training effectiveness; introduce behavioral “nudges”; and rethink policies.

Across the globe, in response to public outcry over racist incidents in the workplace and mounting evidence of the cost of employees’ feeling excluded, leaders are striving to make their companies more diverse, equitable, and inclusive. Unconscious bias training has played a major role in their efforts. UB training seeks to raise awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character. Its goal is to reduce bias in attitudes and behaviors at work, from hiring and promotion decisions to interactions with customers and colleagues.

  • Francesca Gino is a behavioral scientist and the Tandon Family Professor of Business Administration at Harvard Business School. She is the author of Rebel Talent and Sidetracked.
  • Katherine Coffman is an associate professor of business administration at Harvard Business School. Her research focuses on how stereotypes affect beliefs and behavior.



Unconscious bias: what it is and how to avoid it in the workplace

Callum Hughson | September 23rd, 2019


An unconscious bias is a thinking error that can cloud judgment and lead to poor decisions.

As a leader, it’s important to look for and process a broad range of information from many perspectives. It’s equally important to be open to alternatives not previously considered. The more perspectives and strategies you have to choose from, the more likely it is you will make the best decisions for your team and organization as a whole.

But a powerful, yet subtle obstacle can stand in the way of open-mindedness in leadership: unconscious bias.

What is unconscious bias?

For most of human history, people experienced very little new information during their lifetimes. Decisions were based on the need for survival. In our modern world, we are constantly receiving new information and have to make numerous complicated choices each day. As many researchers have explained, our minds are ill-equipped to handle the modern world's decision-making demands. Evaluating evidence (especially when it is complex or ambiguous) requires a great deal of mental energy. To save us from becoming overwhelmed, our brains have a natural tendency to take shortcuts. Unconscious bias – also known as cognitive bias – refers to how our mind can take shortcuts when processing information. This saves time when making decisions, which is especially helpful when we're under pressure and need to meet deadlines. While these shortcuts may save time, an unconscious bias is a systematic thinking error that can cloud our judgment, and as a result, impact our decisions.

See if you can answer this riddle: A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?


Did you answer 10 cents? Most people do. Although this response intuitively comes to mind, it is incorrect. If the ball costs 10 cents and the bat costs $1.00 more than the ball, then the bat would cost $1.10 for a grand total of $1.20 for the bat and the ball. The correct answer to this problem is that the ball costs five cents and the bat costs (at $1.00 more) $1.05, for a grand total of $1.10.

If you answered 10 cents to the example above, your mind took a shortcut by unconsciously substituting the “more than” statement in the problem (the bat costs $1.00 more than the ball) with an “absolute” statement – the bat costs $1.00. It makes the equation easier to process: if a ball and bat together cost $1.10 and the bat costs $1.00, then the ball must cost 10 cents.
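
The riddle reduces to two linear equations, which a few lines of Python can verify exactly (fractions avoid any floating-point rounding):

```python
# ball + bat = 1.10 and bat = ball + 1.00, so 2*ball + 1.00 = 1.10.
from fractions import Fraction

total = Fraction(110, 100)       # $1.10 for bat and ball together
difference = Fraction(100, 100)  # the bat costs $1.00 more than the ball

ball = (total - difference) / 2
bat = ball + difference
print(float(ball), float(bat))   # 0.05 1.05
assert ball + bat == total and bat - ball == difference
```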

Our unconscious mind uses embedded, unconscious beliefs formed from our cultural environment and personal experiences to make immediate decisions about everything around us. The problem is that these shortcuts result in wrong decisions much of the time – especially when rational, logical thinking is required. We all have unconscious bias, and it influences our decisions without us even realizing it.

Common types of unconscious bias

An article in The Atlantic states there are at least 100 distinctive cognitive biases, while Wikipedia's List of Cognitive Biases contains more than 185 entries. Many of the unconscious biases listed, such as the IKEA effect (the tendency to place disproportionately high value on products you helped create), don't present themselves often in the workplace. The following unconscious biases are the most common in the workplace and have the potential to derail your decision-making ability as a leader:

Sunk cost bias

You irrationally cling to things that have already cost you something. When you've invested time, money, or emotion into something, it can be difficult to let it go – even when it is clear it's no longer viable. The aversion to this pain can distort your judgment and cause you to make ill-advised investments.

To combat this bias: ask yourself, if you hadn't already invested time, money, effort, or emotion into something, would you still invest now? What advice would you give to a friend in the same situation?

Halo effect

The halo effect occurs when you allow your personal perception of someone (how attractive they are, how much you like them, how much they remind you of yourself) to influence your judgments about them, especially performance. In sociology, this is known as homophily – people like people who are like themselves.

To combat this bias: If you notice you are giving consistently high (or low) performance grades across the board to particular individuals, it’s worth considering your judgment may be compromised by the halo effect. Focus on the performance and not on the person.

The Dunning-Kruger effect

The Dunning-Kruger effect describes what happens when people mistakenly overestimate their own ability because of a lack of self-awareness. Have you ever heard the phrase “you don't know what you don't know”? It's easy to be overconfident when you only have a rudimentary perspective of how things are.

It also works the other way. Because experts are keenly aware of how much they don’t know, they can drastically underestimate their own ability and lose confidence in themselves and their decision-making ability. This bias is also known as “imposter syndrome.”

To combat this bias: acknowledge the thoughts you have about yourself and put them in perspective. Learn to value constructive criticism, and understand that you’re slowing your team down when you don’t ask for help.

If you’re feeling like an imposter, it can be helpful to share what you’re feeling with trusted friends or mentors. People who have more experience can reassure you that what you’re feeling is normal. Knowing that others have been in your position can make it seem less scary.

Availability heuristic

This unconscious bias influences your judgments by favouring the ideas that come most easily to mind. Similar to the recency effect, the more recent and emotionally powerful a memory is, the more relevant it can seem. This can cause you to place an inordinate amount of importance on recent memories and apply them to decisions too readily.

To combat this bias: use metrics and statistical information rather than relying on first instincts and emotional influences when making a decision.

Groupthink

The desire for conformity and harmony within a group results in an irrational or dysfunctional decision-making outcome.

To combat this bias: seek to facilitate objective means of evaluating situations and encourage critical thinking practices as a group activity.

Confirmation or Implicit Bias

Confirmation bias causes us to look for evidence confirming what we already think or believe in and to discount or ignore any information that may support an alternate view. It’s the most pervasive unconscious bias in the workplace and the most damaging.

“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” — Warren Buffett

Accepting information that confirms our beliefs is easy and requires little mental energy. With confirmation bias, when we encounter contradicting information we avoid processing it and find a reason to ignore it. In “The Case for Motivated Reasoning,” social psychologist Ziva Kunda wrote, “we give special weight to information that allows us to come to the conclusion we want to reach.” In fact, neuroscientists have demonstrated that our brain reacts differently to information that confirms our previously held beliefs than it does to evidence that contradicts our current beliefs.

If a leader’s view is limited by confirmation bias, they may not pay enough attention to information that could be crucial to their work. Leaders need to be aware of how their biases might impact the people that work for them and with them. For example, direct reports may not share all available information, or may only tell a leader what they think their leader wants to hear. This can lead to poor decision-making, missed opportunities, and negative outcomes.

To combat this bias: think of your ideas and belief system as a piece of software you're trying to debug, rather than a list of things to be defended. Ask yourself the following questions and be mindful of your thought process when answering them:

  • Where do I get information about the issues I care about?
  • Do my most common sources of information confirm or challenge my perspective?
  • How much time do I spend listening to or reading opposing points of view?
  • When I make decisions, am I likely to choose the option that the people closest to me will agree with?

Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information and make decisions.

How unconscious bias can impact inclusion and diversity in an organization

The correlation between diversity and financial performance is clear across different industries and regions: more diverse teams translate directly into stronger financial performance. Between 2011 and 2015, the most gender-diverse companies were 20 per cent more likely than the least diverse to have above-average financial performance.

For organizations to attract the most talented people and ensure a vibrant and diverse workforce, they need to select from a wide-ranging and diverse talent pool. Unfortunately, when hiring, assessing, or promoting employees, we often evaluate people against our unconscious assumptions of what top talent looks like. These assumptions can favour one group over others, even if members of each group are equally likely to be successful.

During the hiring process, hiring managers gather a wide array of information about job candidates. Through interviews, candidates will share their educational background, work and personal experiences, and how they would behave in hypothetical situations. But most of the time hiring managers are measuring this information against their own personal belief of what the successful candidate “should” look like. Did they go to the right school? Would they behave in the same manner as I would in the same situation? Is their personality a close match to mine (see halo effect above) and the rest of my team?

Most hiring managers will select candidates who best match their unconscious template of what a successful candidate looks and sounds like. This approach can give preference to the “safe” choice. For example, a hiring manager may believe that only MBA graduates from elite business schools are suitable to fill leadership roles. And if that criterion were applied to all vacancies, you would soon develop a leadership team of predominantly white males, as most MBA graduates are male and white. Because diversity spurs innovation, the organization would then be at a competitive disadvantage.

Innovation is not just a nice-to-have benefit of having diverse work teams. It is an integral part of any revenue-generating organization. A Boston Consulting Group study found that organizations with more diverse management teams have 19% higher revenues from innovation alone. 

How unconscious bias can be avoided

Although unconscious bias can't be cured, there are many steps that can be taken to mitigate it. Leaders who can recognize their unconscious biases and make adjustments to overcome them are more likely to make better decisions. To be ever-mindful of unconscious bias, it's important to practice self-awareness and slow down decision making to consider what is driving you. Are your decisions data-driven and evidence-based, or do you rely on gut instinct? Have you asked for and considered different perspectives? It can be helpful to discuss your decisions and behaviour at work with an Ivey Academy executive coach. An executive coach can provide a sounding board, a neutral perspective, and applicable strategies to help you overcome your unique unconscious biases.

Promoting inclusion and diversity

To promote inclusion and diversity in your organization's hiring practices, appropriate procedures and processes need to be put in place. To eliminate bias in hiring decisions, make promotions fairer, and increase diversity, organizations are using data-driven talent assessments .

Organizations that use robust assessment tools have improved hiring success rates, lowered employee turnover, increased employee engagement and productivity, and fostered a resilient corporate culture. Assessments provide organizations with a consistent definition of what leadership potential looks like, regardless of race, gender, or ethnicity. With the help of assessment tools, leaders are able to find “ hidden gems ” — employees who have low visibility or who previously were not seen to have leadership potential. Most importantly, talent assessment tools help to educate leaders about the difference between an employee’s experience and his or her capability to take on new and more challenging responsibilities. With the help of talent assessments, you can be confident in knowing your organization is taking a needed step in removing unconscious bias from the hiring process.

The Ivey Academy’s talent assessment tools enable your organization to identify the best candidates for vacant roles and professional development. With our help, your organization can create and maintain a competitive edge in the recruitment, development, and retention of top talent. Learn more about our talent assessments here .

About The Ivey Academy at Ivey Business School

The Ivey Academy at Ivey Business School is the home for executive Learning and Development (L&D) in Canada. It is Canada's only full-service L&D house, blending Financial Times top-ranked university-based executive education with talent assessment, instructional design and strategy, and behaviour change sustainment.

Rooted in Ivey Business School’s real-world leadership approach, The Ivey Academy is a place where professionals come to get better, to break old habits and establish new ones, to practice, to change, to obtain coaching and support, and to join a powerful peer network. Follow The Ivey Academy on LinkedIn, Twitter, Facebook, and Instagram.


Equality, Diversity and Inclusion


Introduction to unconscious bias

What is unconscious bias?

We may not be aware of it, but we often place people into categories based on age, religion, race, gender and politics. This is unconscious bias - sometimes called implicit bias. We all have unconscious bias and it’s a natural and necessary survival mechanism that allows us to quickly assess situations, and decide promptly what action we should take, based on beliefs and previous experiences. It would be difficult to function if we approached every situation as if it were entirely new and had never been encountered before.

However, problems can arise when we make incorrect decisions based on flawed assumptions, beliefs and experiences. Being aware of our own biases and making changes in our work can help us minimise the risk of making poor decisions. There are three main types of unconscious bias:

  • Affinity bias - could cause us to recruit or promote people who look similar to us, or have similar language, names or culture. We might be more likely to view people like us (or who we like) positively, and more likely to notice negative traits of people unlike us, or who we don't like.
  • Confirmation bias - we might unconsciously look for evidence that supports our pre-existing beliefs, and ignore evidence that contradicts it.
  • Social comparison bias - we may be more critical of people who we are in competition with, or who we see as potentially better at things than we are. We might not be aware of this, and it might affect how fairly we treat others who we feel threatened by.


In the short video Understanding Unconscious Bias, The Royal Society presents how the human brain works and how our natural propensity to make quick, unconscious decisions and judgments in the moment can lead to us making incorrect decisions and assumptions. It covers in-groups and out-groups, where some groups of people favour others who they feel are like them.

Other videos in the further information section below explore other perspectives on unconscious bias, including assumptions about gender, recruitment and employment, and how embracing diversity in our workforce helps us to provide better products and services.

You can identify your own biases using the Harvard Implicit Association Test - a series of short, fun and engaging online tests designed to help identify where we might have unconscious biases on age, religion, sexual orientation, disability, trans people and race. A consideration of the limits and interpretations of such tests, in particular to measure implicit sexual attitudes in heterosexual, gay and bisexual individuals, can be found in this journal article by Anselmi et al (2013).

Reflecting on unconscious bias might help us to explore:

  • Why 60% of biology undergraduates are female, yet only 30% of biology academics at York - and only 20% nationally - are female
  • How we can support more women, and Black, Asian and Minority Ethnic (BAME*) staff with career progression, and address our gender and ethnicity pay gaps at York 
  • Why there is a need to diversify and decolonise curricula
  • Why some voices are louder, and some people are more visible than others

The impact of unconscious bias at the University might mean that we don't recruit, promote, nurture or include people fairly and consistently. We might see fewer women in senior academic roles, even though they are more numerous at undergraduate level, and fewer Black, Asian and Minority Ethnic staff in senior leader roles. A lack of role models may affect student recruitment and staff retention, with staff seeking career development elsewhere. It might also affect how staff feel about working at the University and with their colleagues.

For example, take a look at our Gender and Ethnicity pay gap report for 2022, which highlights pay and progression disparities between groups. A couple of illustrations are provided below, showing that we have fewer female than male staff in senior positions, and the pay disparity between staff from minority ethnic backgrounds and white staff.

[Bar chart: gender pay gap for male and female staff by grade]

Table: University BAME pay gap 2022

The mean pay gap between a BAME member of staff and a white member of staff was 14.8% in 2021 and 14.5% in 2022. The median was 18.6% in 2021 and 20.9% in 2022.
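As a worked illustration of the arithmetic behind figures like these, here is a minimal Python sketch. The pay data is invented for the example - it is not taken from the University's report - and the gap is expressed as the percentage by which one group's pay falls below the other's.

```python
from statistics import mean, median

# Hypothetical pay data -- illustrative only, not figures from the report.
white_staff = [14.0, 18.5, 22.0, 30.0, 45.0, 52.0]
bame_staff = [13.5, 16.0, 21.0, 26.0, 33.0]

def pay_gap(reference, comparison, stat):
    """Percentage by which the comparison group's pay falls below the reference group's."""
    return (stat(reference) - stat(comparison)) / stat(reference) * 100

print(f"Mean pay gap:   {pay_gap(white_staff, bame_staff, mean):.1f}%")
print(f"Median pay gap: {pay_gap(white_staff, bame_staff, median):.1f}%")
```

Because the mean is sensitive to a handful of very high or very low salaries while the median is not, the two measures can move in opposite directions between years, exactly as in the figures above.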

We might be biased in our recruitment processes, and appoint people who we think are like us, who we see as less threatening to us, or who have views that coincide with our own.

We might unconsciously run social events that only attract part of our staff community, making them feel welcome and included, but making others feel excluded and less welcome, and preventing some people from developing strong personal and professional relationships, networks and associations.

In meetings we might not listen to some groups of people as much as we listen to others, and some people might not be present at all. For example, younger people may be less likely to be asked for their opinion on issues, or brought into the conversation if they are not contributing as much, and our meetings might not reflect the diversity of staff across the University. We may be unaware this is happening.  

People with particular characteristics might not be as visible as others, especially in senior leadership roles, which may unconsciously and negatively affect how staff and students view the suitability of people, including themselves, for particular roles (see Equality Challenge Unit, 2013, Unconscious Bias and Higher Education, for an exploration of the research on this).

* When using the acronym/term ‘BAME’ in our work we recognise that this does not fully capture the nuance and experiences of different ethnic identities, including the individual and cultural challenges faced by people in these communities. 

For pay gap analysis, data is aggregated into a broad group including Black, Asian and minority ethnic identities to enable analysis of the difference in experience compared to the White majority. Aggregated groups do not fully reflect the complex and nuanced experiences of individuals included in these data groups.   

For more information on language use please see the University's Using appropriate language when referring to race and ethnicity guidance.

Katie Oates: the University's approach

Katie Oates, Development Partner for Talent, People and Organisational Development in Human Resources at the University of York, explains the University’s dual approach to tackling unconscious bias: encouraging awareness of unconscious bias within individuals, and encouraging systemic change in our working practices, in light of critiques suggesting that individual awareness on its own is insufficient to lead to significant change. We support the sharing of learning, so that departments can learn from the excellent work that others have done, and think about ways to improve their own processes.

Staff are encouraged to ask themselves some critical questions about bias, including:

  • Are you vigilant and curious about the objectivity of your decision making?
  • Confirmation bias - are you influenced by information that supports existing beliefs? Do you make efforts to listen to arguments and opinions that differ from your own?
  • How does your emotional state affect your decision making?  Can your critical decision making be scheduled so that it isn’t overwhelmed by other priorities? 
  • In-groups and out-groups - are you making efforts to include all people in events and activities? 
  • What systems and processes in your departments could be changed to counteract bias?  For example, how can you slow down decision making and give people more time to make judgements? How can you reduce mental fatigue?
  • Challenging stereotypes - how can we influence what people see and hear in departments, and counteract stereotypes?  
  • How can we make decisions objectively, rather than on subjective feelings? 
  • What steps can we build into recruitment to ensure that sound decisions are made, such as providing sufficient time for discussion and reasoned judgments, and can we evidence that we judge people fairly against criteria agreed in advance?

Katie suggests we should also be proactive in recognising, and respectfully challenging, biases we observe in others, which we might be more likely to notice than our own.  

Katie has a long standing background and interest in tackling bias in employment:

“I have long been fascinated by human behaviour, emotions and thought processes, and as a chartered organisational psychologist I have a keen interest in the application of behavioural science to improve the fairness and objectivity of decision making in the workplace. It’s so important that we are respected, included, valued and treated fairly at work. During my time at the College of Policing I worked to design and deliver national selection processes for police recruitment and promotions, and saw first-hand the benefits of both supporting assessors to spot and explore their own biases (it can be quite a shift for people to understand that we are not as rational as we like to believe!) and the impact of designing environments, systems and processes that help mitigate bias in decision making. At York, I enjoy supporting the development of staff, including models and skills for objective assessment during staff selection processes.”

Katie highly recommends Daniel Kahneman's 'Thinking, Fast and Slow' to gain insight into biases, errors in judgment and how to make better decisions. Written in an engaging way, it's available in the University library (see our information sources section for more information).

At the University we encourage staff to be aware of, limit and mitigate their unconscious biases.

In the past, we have provided training modules on unconscious bias awareness and unconscious bias in the recruitment and selection process.

However, one criticism of traditional unconscious bias training and awareness raising is that it concentrates on eliciting change in the individual in the belief that being aware of our biases is enough to enable us to overcome them. But critics suggest that biases can be deeply ingrained in our systems and processes so that being aware of our biases is not sufficient to enable change. That’s why we are taking a dual approach to tackling unconscious bias.

We do actively encourage individuals to be aware of their biases, but we also encourage systemic change by showcasing good practice across the University, so that departments can learn from the great work that others have undertaken. Taking this dual approach, what are some of the steps that we can take?

To build awareness of your biases, be vigilant and curious about the objectivity of your decision making.

When you’re gathering information, are you being swayed by content that supports your existing beliefs?

Are you aware of your emotional state and is it affecting your decision making?

Can you remind yourself to pay close attention to counter arguments and opinions that differ to your own?

And if you are organising a networking event, is it catering more to people in your IN group than your OUT group?

In terms of systemic changes, behavioural science research indicates that we can best mitigate bias by altering the environment in which we make decisions, making ourselves less susceptible to the bias.

Think about how you could alter systems and processes in your department to do this.

How can you slow down decision making and give people more time to make judgements?

How can you reduce mental fatigue?

How can you alter what people see and hear in the workplace so that inaccurate stereotypes are not reinforced but are countered?

How can we help ourselves and others to make decisions based on objective evidence rather than subjective feelings or inaccurate or incomplete memories of events?

For example, when planning meeting agendas, try to plan in sufficient time and place more complex issues at the start of the meeting.

If you’re recruiting, create a timetable and structure that gives assessors sufficient time to make reasoned logical judgements rather than snap decisions.

Try to schedule critical decision making when you are more energised and not overwhelmed by other decisions that you’ve been making earlier in the day.

Similarly, research shows that it is really important to manage your blood sugar to avoid mental fatigue, another reason not to power through lunch or miss breaks.

And if you are making any kind of assessment or judgement of another’s performance, clearly set out your assessment criteria in advance and make thorough accurate notes of your performance observations.

Evidence suggests that we are much better at spotting others’ biases than our own. So help build a culture where we support each other by calling out biases when we see them and are open to learning about our own.

There are many more steps that we can take to mitigate bias, so do share learning and insight from your department with others. 

Katie Oates (University of York, HR Development Partner) provides a great overview of the ORCE method of assessment (Observe, Record, Classify, Evaluate) in recruitment and selection, and how we can overcome typical biases that we may encounter during the recruitment process.

Bodies beyond sex – what being intersex taught me about the world - Ronja Ziesel, TEDx Talks (15:03). A longer but captivating and insightful TEDx talk from an intersex person called Ronja Ziesel. It covers how people are socialised to make assumptions about people based on stereotypical expectations of sex, gender, and appearance. If you don’t have time to watch the video, the section from 02:23 to 04:00 gives a humorous and enlightening illustration of this process.

What is unconscious bias? - The Employers Network for Equality and Inclusion (03:20). Introduces unconscious bias and the neuroscience behind it. This covers: how people place individuals into social categories based on visual cues such as gender, age and cultural background, and also on other grounds such as social background and job role; how we become wired, or trained, by cultural stereotypes, such as only seeing women or men in certain types of job; and how affinity bias means managers may be more likely to favour certain individuals in recruitment and promotion where there is some sort of affinity based on personal characteristics, and to behave negatively towards those where there is little affinity.

Unconscious Bias at Work - Making the Unconscious Conscious - Life at Google (03:58). Covers how we have a tendency to design our services, systems, and products to favour some individuals without being aware of it, because of our personal frame of reference and assumptions. This can affect the accessibility of resources, where we may assume others know the things we know, or can do the things we can do. Ultimately, being aware of our biases and taking a wider perspective can help us develop more effective and successful initiatives that meet people’s needs better.

Unconscious Bias and Higher Education, Equality Challenge Unit (2013, now AdvanceHE) provides a broad literature review of a wide range of studies into unconscious bias, much of which is applicable to the Higher Education context, including recruitment biases (Steinpreis, 1999; Wood et al, 2009; Moss-Racusin et al, 2012), and how we might tackle bias by challenging stereotypes, providing broader representation in all that we do, and increasing contact between diverse groups of people.

Complete your learning

Now that you've learnt more about unconscious bias, you can go to the LMS and complete the quiz to demonstrate you have completed this learning. You might also like to complete an action plan to record anything that you intend to take back to your department or service area for discussion.


Kendall College of Art & Design

Critical Thinking & Evaluating Information


What is Bias?


Biases also play a role in how you approach all information. The short video below provides definitions of 12 types of cognitive biases.

There are two forms of bias of particular importance given today's information-laden landscape: implicit bias and confirmation bias.

Implicit Bias & Confirmation Bias

Implicit / Unconscious Bias 

"Original definition (neutral) - Any personal preference, attitude, or expectation that unconsciously affects a person's outlook or behaviour.

Current definition (negative) - Unconscious favouritism towards or prejudice against people of a particular race, gender, or group that influences one's actions or perceptions; an instance of this."

"unconscious bias, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/88686003 .

"Thoughts and feelings are “implicit” if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge." 

https://perception.org/research/implicit-bias/

Confirmation Bias – "Originating in the field of psychology; the tendency to seek or favour new information which supports one’s existing theories or beliefs, while avoiding or rejecting that which disrupts them." 

This definition was added to the Oxford English Dictionary in 2019.

"confirmation, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/38852. 

Simply put, confirmation bias is the tendency to seek out and/or interpret new information as confirmation of one's existing beliefs or theories, and to exclude contradictory or opposing information or points of view.

Put Bias in Check!


Now that you are aware of bias - both your personal biases and the bias that can be found in sources of information - you can put it in check. Approach information objectively and neutrally, and evaluate it critically. Numerous tools included in this course can help you do this, like the critical thinking cheat sheet in the previous module.


19 unconscious biases to overcome and help promote inclusivity


Unconscious biases are learned assumptions, beliefs, or attitudes that we aren’t necessarily aware of. While bias is a normal part of human brain function, it can often reinforce stereotypes. To combat unconscious bias, learn about different types of biases, how they might surface at work, and how to avoid them so you can build a more inclusive and diverse workplace.

That being said, these biases can lead to skewed judgments and reinforce stereotypes, doing more harm than good for companies when it comes to recruitment and decision-making. 

It’s especially important to be aware of these biases during the hiring process since they can impact the success of your future team.  

To help you recognize and combat unconscious bias in the workplace, we cover 19 unconscious bias examples and prevention strategies. Taking the steps to reduce biases will help you improve inclusivity, trust, and productivity within your company. 

What is unconscious bias?

Unconscious bias, also known as implicit bias, is a learned assumption, belief, or attitude that exists in the subconscious. Everyone has these biases and uses them as mental shortcuts for faster information-processing.

Implicit biases are developed over time as we accumulate life experiences and get exposed to different stereotypes. 

According to the Kirwan Institute for the Study of Race and Ethnicity, “These biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control.”


As a result, unconscious biases can have a big influence on our limiting beliefs and behaviors. When this translates to our professional lives, it can affect the way we hire, interact with colleagues, and make business decisions. 

If not properly addressed, these biases can negatively impact a company’s workplace culture and team dynamics. 

Although these biases are pervasive, you can reduce their impact with deliberate attention and effort. Being aware of and understanding the different types of biases that exist can help you find ways to combat them. 


Types of unconscious bias

Unconscious biases manifest in different ways and have varying consequences. Some biases arise from judging people’s appearances, some are derived from preconceived notions, and others are born of logical fallacies. We explore these common biases in detail below.

1. Gender bias


Gender bias, the favoring of one gender over another, is also often referred to as sexism. This bias occurs when someone unconsciously associates certain stereotypes with different genders.  

This type of bias may affect recruitment practices and relationship dynamics within the company. An example of this bias during hiring is if the hiring panel favors male candidates over female candidates even though they have similar skills and job experience. 

Another well-known example is the gender pay gap. As of 2021, the median salary for men is about 18% higher than the median salary for women.

Gender bias may reduce job and career advancement opportunities for certain populations.

How to avoid gender bias

Here are some ways to create a more gender-diverse workplace: 

Set gender-neutral recruitment standards: Define the ideal candidate profile ahead of time and evaluate all candidates against those standards. 

Create diversity goals: Set qualitative gender diversity goals to create a more gender-balanced team. Support and provide resources for women to take on leadership roles. 

2. Ageism

Ageism refers to stereotyping or discriminating against others based on their age, often happening to older team members.

Although workers ages 40 and older are protected from workplace discrimination under the Age Discrimination in Employment Act, filing a lawsuit against an employer can be a lengthy and costly process.

Because not everyone files a complaint, ageism is still a prevalent issue. An AARP survey found that about 60% of workers age 45 and older have seen or experienced age discrimination in the workplace.

An example of ageism is if an older team member was passed over for a promotion, which ended up going to a younger team member with less seniority and experience. 

Companies that discriminate based on age may lose out on the valuable knowledge and experience that older workers bring. There may also be serious legal consequences if a team member decides to file a job discrimination lawsuit. 

How to avoid ageism bias

Preventing ageism involves combatting age-related stereotypes as well as engaging older team members in the workplace. Here are some ways to do that:

Don’t make assumptions based on age: For example, don’t automatically presume that older workers don’t know how to use technology or aren’t open to learning new skills. Provide equal learning opportunities for everyone. 

Foster cross-generational collaboration: Create two-way mentorship programs where a senior team member is paired with a new hire. This kind of collaboration facilitates communication between team members at different career stages, which can help break down misconceptions about age.

3. Name bias

Name bias is the tendency to prefer certain names over others, usually Anglo-sounding names.

Name bias is most prevalent in recruitment. If a recruiter tends to offer interviews to candidates with Anglo-sounding names over equally qualified candidates with non-Anglo names, this bias is present.  

Name bias can have a negative impact on diversity hiring and result in companies missing out on talented candidates. 

How to avoid name bias

A simple solution to avoid name bias is to omit candidates' names when screening; a toy sketch of this follows the list below. To do this, you can:

Use software: Use blind hiring software to block out candidates’ personal details on resumes.

Do it manually: Designate a team member to remove personal information on resumes for the hiring team. 
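Here is that toy sketch in Python. The resume structure, field names, and the redact_resume helper are all hypothetical; real blind-hiring software handles many more formats and identifying signals.

```python
# Hypothetical example -- real blind-hiring tools are far more thorough.
PII_FIELDS = {"name", "email", "phone", "address"}

def redact_resume(resume: dict) -> dict:
    """Return a copy of a parsed resume with identifying fields masked."""
    return {
        field: "[REDACTED]" if field in PII_FIELDS else value
        for field, value in resume.items()
    }

candidate = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "education": "BSc Computer Science",
    "experience": "5 years of software development",
}
print(redact_resume(candidate))
# Identifying fields are masked; qualifications remain visible to screeners.
```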

4. Beauty bias

Beauty bias refers to the favorable treatment and positive stereotyping of individuals who are considered more attractive. This has also given rise to the term “lookism,” which is discrimination based on physical appearance.

An example of beauty bias is a hiring manager who is more inclined to hire candidates they think are good-looking. 

Hiring decisions should be based on skills, experience, and culture fit rather than physical appearance.

How to avoid beauty bias

Here are some ways to avoid beauty bias when screening job applicants:

Omit pictures from resumes: Focus on an applicant’s qualifications and experience when screening resumes.

Conduct telephone screening: Before scheduling an interview, consider doing a short telephone interview to get to know the applicant better without being influenced by their appearance. 

5. Halo effect

The halo effect, a term coined by psychologist Edward Thorndike in the 1920s, occurs when we develop an overall positive impression of someone because of one of their qualities or traits. 

This effect may lead us to inadvertently put people on a pedestal since we’re constructing an image of a person based on limited information. 

An example of this effect in recruitment is when a hiring manager sees that a candidate graduated from a prestigious school and assumes that they excel at their job. 

This halo is based on the hiring manager’s academic preferences. However, the school that someone went to doesn’t necessarily determine their level of job competency.  

By focusing too much on one positive trait, we may overlook negative behavior that could end up harming the company—for example, if a candidate was fired for misconduct in a previous job. 

How to avoid the halo effect

To reduce the impact of the halo effect, you could try out different interviewing strategies:

Conduct multiple interviews: Set up several rounds of interviews for candidates with different levels of management. That way, a candidate can be evaluated from various perspectives. 

Diversify your interview team: Getting someone from another team to interview the candidate may help since they’ll have less reason to “halo” them as they won’t be working with them directly. 

6. Horns effect

The horns effect is the opposite of the halo effect. This bias causes us to have a negative impression of someone based on one trait or experience. 

Putting too much weight on a single trait or interaction with someone can lead to inaccurate and unfair judgments of their character. 

For example, a new team member thinks the constructive criticism they received from their manager is harsh and assumes that their manager is a critical and stern person. 

If left unchecked, the horns effect can damage the cohesiveness and trust between team members. 

How to avoid the horns effect

In order to reduce the horns effect when interacting with others, try to: 

Challenge your first impressions: Take the time to get to know someone so you can develop a more concrete impression of that person as a whole.

Make judgments based on evidence: Ask yourself how you developed your first impression of someone and find evidence to support or refute that impression based on additional interactions. 

7. Confirmation bias


Confirmation bias is the tendency to seek out and use information that confirms one’s views and expectations. In other words, cherry-picking information to validate certain points. 

This affects our ability to think critically and objectively, which can lead to skewed interpretations of information and overlooking information with opposing views. 

For example, a product developer comes up with a product idea for the athletic market. Although market research shows little interest in the product, they try to validate the idea by reaching out to athlete friends who they know will support the idea. 

Although there’s gratification in validating a current idea, it’s important to consider the potential consequences of following through with the idea. 

How to avoid confirmation bias

Here are some ways to reduce confirmation bias:

Gather multiple sources: Whenever you’re testing a hypothesis or conducting research, gather information from a wide variety of sources to get a balanced perspective. 

Standardize interview questions: When recruiting new talent, come up with a list of standard interview questions to prevent asking off-topic or pointed questions that may or may not confirm your beliefs about a candidate.

8. Conformity bias

Conformity bias is similar to groupthink, which occurs when we change our opinions or behaviors to match that of the bigger group, even if it doesn’t reflect our own opinions. 

This bias may occur when we encounter peer pressure or are trying to fit into a certain social group or professional environment. 

For example, a team is deciding between two proposals. One person thinks proposal A is better, but the rest of the team is leaning towards proposal B. Swayed by the others' opinions, that person ends up voting for proposal B because everyone else did.

Although conformity can help prevent conflicts, it may also limit creativity, open discussion, and the availability of other perspectives.

How to avoid conformity bias

Here are some ways to help encourage honest opinions in the workplace:

Use anonymous votes or surveys: The option to give feedback anonymously allows the freedom to express opinions without worrying about others’ preferences (a toy tallying sketch follows this list).

Ask for opinions in advance: Before going into a meeting, have a private conversation with each team member to get their opinions. This gives everyone plenty of time to think about a topic and express their thoughts without the pressure of presenting in front of colleagues. 
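Here is that sketch - a minimal Python illustration with invented ballots. The design point is simply that only the choice is recorded, never the voter:

```python
from collections import Counter

# Hypothetical anonymous ballots -- only the choice is stored, never who cast it.
ballots = ["proposal A", "proposal B", "proposal A", "proposal A", "proposal B"]

tally = Counter(ballots)
winner, votes = tally.most_common(1)[0]
print(f"Tally: {dict(tally)} -> winner: {winner} with {votes} votes")
```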

9. Affinity bias

Affinity bias is also known as the similarity bias and refers to the tendency to favor people who share similar interests, backgrounds, and experiences. We tend to feel more comfortable around people who are like us. 

This bias may affect hiring decisions. For example, a hiring manager gravitates towards a job applicant because they share the same alma mater.

Over time, the affinity bias in hiring can hamper a company’s diversity and inclusion efforts. 

How to avoid affinity bias

While eliminating affinity bias entirely may not be possible, there are ways to reduce its effects:

Create a diverse hiring panel: Different people with varying perspectives and interests that conduct interviews can help reduce the affinity bias of one individual.

Go beyond hiring for “culture fit”: The more hiring managers have in common with candidates, the more likely they are to evaluate them as a good “culture fit.” But the term “culture fit” is vague, and it can mean different things to different people. To assess candidates fairly, use specific language and examples when sharing feedback about them. Describe how well they embody company values or align with company missions.

10. Contrast effect

We often make judgments by making comparisons. As a result, our judgments may be altered depending on what standard we’re comparing something to. This is known as the contrast effect.  

For instance, a team member is happy to receive a “meets expectations” on their performance review. However, they start to feel inadequate after finding out most of their colleagues got “exceeds expectations” on their reviews. 

Even though they got a decent review, the team member judges themselves more critically since their comparison standard is their colleagues’ results. 

There can also be positive contrast effects, which occur when something is perceived to be better than usual because it’s being compared to something worse. 

How to avoid the contrast effect

Here are some strategies to try when using comparisons to make decisions:

Make multiple comparisons: Instead of coming to a conclusion after making one comparison, compare something against different standards to broaden your perspective. 

Talk it out: Explain how you came to a given conclusion to your colleagues so they can understand your point of view. 

11. Status quo bias

This bias describes our preference for the way things are or for things to remain as they are, which can result in resistance to change. 

Following the status quo is a safe option and takes less effort, but it also results in becoming stagnant. As the business landscape continues to shift, change is necessary for business longevity and innovation. 

An example of the status quo bias in a company is continuing to hire team members from the same demographic group, making no effort to move forward with diversity goals. 

By repeatedly engaging in the same hiring practices, you may miss out on great candidates who can bring fresh ideas and perspectives to your company. 

How to avoid the status quo bias

Here are some ways you can challenge the status quo:

Use the framing effect: We often follow the status quo to avoid a loss, because we weight losses more heavily than gains. The framing effect involves reframing the default option as a loss, which encourages exploring alternative options as gains.

Encourage outside-the-box thinking: Create an environment that celebrates creativity and innovation. Adopt an open mindset to change so that your team can continue to push the status quo.

12. Anchor bias

Anchor bias occurs when we overly rely on the first piece of information we receive as an anchor to base our decision-making upon. This causes us to see things from a narrow perspective. 

For example, the first thing a recruiter finds out about a candidate they’re interviewing is that they were unemployed for the past year. The recruiter focuses on this fact rather than the candidate’s solid qualifications and skills.

Instead of relying on one piece of information to make a decision, it’s important to look at the whole picture. 

How to avoid anchor bias

It takes time to make a thoughtful decision. Here are some tips to keep in mind:

Conduct thorough research: The first option may not always be the best one. Explore various possible options and their pros and cons before deciding.

Brainstorm with your team: Discussing a given decision with your teammates can help reveal the strengths and weaknesses of a plan. 

13. Authority bias


Authority bias refers to the tendency to believe in authority figures and follow their instructions. 

Generally, following a trusted authority figure with relevant expertise is a good idea. However, blindly following a leader’s direction without your own critical thinking may cause future issues.

For example, if a team member unquestioningly follows their manager’s instructions to write a report in a way that matches the manager’s opinions, this could jeopardize the integrity of the report.

When receiving instructions on an area outside of your manager’s expertise, it can be worthwhile to seek additional information or expertise to minimize potential issues that may arise.

How to avoid authority bias

As with many unconscious biases, developing awareness of the bias is a good first step to countering it. 

Here is how to avoid being influenced by authority bias:

Ask questions: Don’t be afraid to ask your manager or company leader questions. The level of detail they provide may be an indicator of whether an idea was well thought-out or if it’s their authority coming into play. 

Do your research: Conduct your own research on a given topic to identify other credible sources or experts and see whether their suggestions align with your manager’s suggestions. 

14. Overconfidence bias

Overconfidence bias is the tendency for people to think they are better at certain abilities and skills than they actually are. 

This false assessment of our skill levels, stemming from an illusion of knowledge or control, can lead us to make rash decisions. 

For instance, an overconfident CEO decides to acquire a startup that they see high potential in and believe will bring high returns, even though the startup’s performance indicates otherwise.

Previous success or accomplishments may lead to an inflated ego. While leading with confidence is a good thing, it’s important to not let it get in the way of logical thinking and decision-making. 

How to avoid overconfidence bias

Here are tips to follow when you’re making decisions:

Consider the consequences: The decisions you make can have an impact on your company. Before committing to a decision, determine all the possible outcomes to ensure you’re prepared for them.

Ask for feedback: Getting feedback from your team can help you identify areas of improvement, whether it’s related to your performance or your ideas. Constructive criticism can keep egos in check.  

15. Perception bias

Perception bias occurs when we judge or treat others based on often inaccurate, overly simplistic stereotypes and assumptions about the group they belong to. It may involve other biases such as gender, age, and appearance.

This type of bias may result in social exclusion, discrimination, and an overall reduction of a company’s diversity goals.

Say, for example, a team member doesn’t invite a teammate to an after-work social event because they assume the teammate wouldn’t share the group’s interests.

Perception bias can make it difficult to have an objective understanding about members from diverse groups.

How to avoid perception bias

Reducing the impact of perception bias requires recognizing your biases:

Challenge your assumptions: Ask yourself, “How well do I really know that person or the group they belong to?” Don’t let preconceived notions prevent you from meeting or including new people. 

Think about the accuracy of statements: When you find yourself using strong words like “all,” “always,” and “never” to describe a certain group, pause and take a moment to ask yourself how accurate the description is. 

16. Illusory correlation

Illusory correlation is when we associate two variables, events, or actions together even though they’re unrelated to each other. 

For example, a hiring manager asks a candidate interview questions that are meant to give insight into their personality but are unrelated to the job itself. Since the candidate struggles to come up with answers, the hiring manager decides they would not be a good fit.

These illusions can lead us to make decisions based on inaccurate correlations.

How to avoid illusory correlation bias

We may be more prone to see false correlations in circumstances that we’re unfamiliar with or have little knowledge of. 

Here are tips to avoid making illusory correlations:

Get informed: Learning more about the areas you’re not familiar with can help you find evidence to support or refute the correlation. 

Consider all possibilities: When you associate two things, consider the likelihood of the cause and effect. You can also use a contingency table to visualize the relationship between the cause and effect, as in the sketch below.
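This is a minimal Python sketch of that idea. The observations are invented: each record notes whether a candidate struggled with the off-topic interview questions and whether they went on to perform well, so the table shows whether the two actually co-occur.

```python
from collections import Counter

# Hypothetical records: (struggled_with_questions, performed_well_on_the_job)
observations = [
    (True, True), (True, False), (False, True),
    (False, True), (True, True), (False, False),
]

counts = Counter(observations)

# Print a 2x2 contingency table of the co-occurrences.
print(f"{'':18}{'performed well':>16}{'did not':>10}")
print(f"{'struggled':18}{counts[(True, True)]:>16}{counts[(True, False)]:>10}")
print(f"{'did not struggle':18}{counts[(False, True)]:>16}{counts[(False, False)]:>10}")
```

In this made-up data, candidates who struggled performed well at exactly the same rate as those who did not, so the apparent correlation would be illusory.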

17. Affect heuristic

Heuristics are mental shortcuts that help us make decisions more efficiently. The affect heuristic occurs when we rely on our emotions to make decisions. This may help us reach a conclusion more quickly, though it may not always be accurate or fair. 

For example, an interview candidate makes an off-hand comment that offends a recruiter, though that wasn’t the candidate’s intention. Vexed by the comment, the recruiter rejects the candidate even though they were the most qualified for the role.

Since emotions may cloud your judgment, it’s important not to make decisions in the heat of a moment. 

How to avoid the affect heuristic bias

Here are ways to lower the influence of emotions in different circumstances: 

Be aware of your emotions: Simply being aware of our level of emotions in a situation can help us step back from the situation and evaluate it more logically. 

Take time to reflect: Reflect on an event some time after it occurs. Your emotions likely won’t be as strong as they were during the event, so you’ll be able to come to a more objective conclusion. 

18. Recency bias

Recency bias occurs when we attribute greater importance to recent events over past events because they’re easier to remember. 

This bias is more likely to occur when we have to process a large amount of information. For example, since hiring managers often review a high volume of job applications in a day, it may be harder to recall candidates screened earlier during the day. 

Recency bias can also manifest during the interview process when a hiring manager becomes more inclined to make hiring decisions based on the most recent candidate they interviewed. 

To overcome this bias, using techniques to strengthen your memory can be helpful. 

How to avoid recency bias

Here are some tips to prevent recency bias when interviewing candidates: 

Take notes: Take detailed notes during each interview and review them afterward. This can help you keep track of notable candidates regardless of when you interviewed them. 

Give yourself mental breaks: Doing back-to-back interviews can be mentally draining. When your working memory is overtaxed, you’re more likely to be affected by recency bias. Stay mentally alert by taking breaks in between interviews so your brain has time to absorb and remember the information.

19. Idiosyncratic rater bias

Idiosyncratic rater bias affects the way we evaluate the performance of others. We often rate others based on our subjective interpretations of the assessment criteria and our own definition of what “success” looks like. 

In other words, we’re generally unreliable when it comes to rating other people. Research has found that about 60% of a manager’s rating is a reflection of the manager rather than the team member they’re rating. 

For example, a manager who excels at project management has higher standards for this skill and gives harsher ratings to team members for this skill. On the other hand, the manager is more lenient when rating team members’ marketing skills because they are less familiar with that area. 

Sources of rater bias may come from other biases, such as the halo effect, affinity bias, and confirmation bias. 

How to avoid idiosyncratic rater bias

Here are some strategies to avoid this bias when doing performance reviews: 

Set specific and clear assessment criteria: Create a rubric or a specific set of standards for evaluating performance. This prompts managers to provide supporting evidence based on a team member’s performance or achievements to determine how well they did.  

Conduct multi-rater reviews: This process involves a team member getting feedback from their colleagues and managers in addition to doing a self-evaluation. Having multiple reviews to draw from can help managers gain a more holistic view of a team member’s performance and identify potential areas for growth. 
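As a small numeric illustration of why multi-rater reviews help, here is a minimal Python sketch; the raters, skills, and scores are all invented. Averaging across raters dampens any single rater's idiosyncratic standard:

```python
from statistics import mean

# Hypothetical 1-5 ratings of one team member. Rater A holds project
# management to an unusually harsh personal standard.
ratings = {
    "project management": {"rater_a": 2, "rater_b": 4, "rater_c": 4, "self": 4},
    "marketing":          {"rater_a": 5, "rater_b": 3, "rater_c": 3, "self": 3},
}

for skill, scores in ratings.items():
    print(f"{skill}: rater A alone = {scores['rater_a']}, "
          f"multi-rater mean = {mean(scores.values()):.2f}")
```

In this made-up data, rater A alone would rate the two skills very differently (2 versus 5), while the multi-rater means land close together: the idiosyncratic standard washes out.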

Why it’s important to tackle unconscious biases

As these examples show, unconscious biases can hinder decision-making, impact team dynamics and leadership styles, and limit company diversity. This, in turn, can reduce equal opportunities for team members and job applicants.

Tackling unconscious biases can help address these issues, as well as improve company diversity. 

Benefits of tackling unconscious bias

Increased company diversity can bring additional benefits such as:

Increasing company profitability: Teams that have solid problem-solving and decision-making skills can bring a competitive advantage to a company. For example, a McKinsey study found that gender-diverse companies were 21% more likely to achieve above-average profitability.

Attracting diverse talent through inclusive hiring practices: By implementing inclusive recruitment strategies, companies are able to reach out to a wider talent pool. Job seekers would also be more likely to apply to companies that prioritize diversity. 

Increasing innovation: Diverse teams can bring a variety of fresh ideas to the table, allowing teams to come up with creative solutions that can drive sales. For example, a study by the Boston Consulting Group found that companies with diverse management teams bring 19% higher innovation revenue. 

Boosting company productivity: University research found that tech firms with diverse management teams have 1.32 times higher levels of productivity. Increased productivity can lead to more efficient project management and implementation.

Encouraging higher employee engagement: Deloitte research showed that company diversity is directly related to employee engagement. Higher employee engagement can lead to higher job satisfaction, which, in turn, can lower the turnover rate.

Making fair and more efficient business decisions: Inclusive teams can make better business decisions up to 87% of the time. These business decisions can help improve a company’s performance and revenue. 

Be conscious of your unconscious biases

The good news: Once you’re aware of your unconscious biases, you can take steps to mitigate their effects. By taking micro-steps such as revamping your interview questions template and encouraging cross-team collaboration, you’re working towards a more diverse and inclusive workplace environment for you and your team.



13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?


The Confirmation Bias


Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but these shortcuts can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.


The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validates their existing point of view. This is often indicative that the confirmation bias is working to "bias" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, specifically looking for and researching opposing views, reading full articles (not just headlines), questioning the source, and doing the research yourself to see if it is a reliable source.

The Hindsight Bias

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The Anchoring Bias

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, often do share very similar opinions and beliefs. Because of this, we come to assume that their way of thinking reflects the majority opinion, even among people outside our circle of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect influences us, and is used by us to influence others, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance, which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but to lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events, such as illness, job loss, or an accident, won't happen to us.

The optimism bias has roots in the availability heuristic. Because the examples of bad things you can readily call to mind tend to involve other people, it seems more likely that negative events will happen to others rather than to yourself.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Here is just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thinking and, ultimately, our decision making.

Many of these biases are inevitable; we simply don't have time to evaluate every thought in every decision for the presence of bias. But understanding them can help you recognize when they may be leading you toward a poor decision.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK. An indirect debiasing method: priming a target attribute reduces judgmental biases in likelihood estimations. PLoS ONE. 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF. Leading questions and the eyewitness report. Cognitive Psychology. 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect. J Exp Anal Behav. 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y. Gender differences of brain activity in the conflicts based on implicit self-esteem. PLoS ONE. 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM. Resistance of personal risk perceptions to debiasing interventions. Health Psychol. 1995;14(2):132-140. doi:10.1037/0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future. Psychophysiology. 2018;55(3). doi:10.1111/psyp.13016

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of "The Everything Psychology Book."


Neuroleadership Lessons: Recognizing and Mitigating Unconscious Bias in the Workplace

Unconscious bias, also referred to as implicit bias, impacts the workplace at all levels because it is a universal issue. Everyone has biases due to subtle cognitive processes within the brain that occur below one’s conscious awareness. Unconscious bias directly affects not only who gets hired, developed and promoted but also the ability of a team to be high performing, the effectiveness of leadership decision making, the health or lack thereof of an organization’s culture, and ultimately, the success of an organization as a whole. Because of its far-reaching consequences, it is imperative to assess to what extent an organization’s culture and business results are being impacted by unconscious bias and then take appropriate measures to mitigate the associated risk.

Brain Science – It’s all in the Wiring. Unconscious bias is innate to all human beings. As a result of the way the brain is naturally wired, people instinctively prefer those who look and sound like them and who share similar interests. Neurologically, these preferences are unconscious and bypass rational thinking. Each day the brain processes billions of stimuli. Much of this processing takes place in the amygdala, the region of the brain associated with threat and fear. Information processed in the amygdala is used to survive, make assumptions, and feel emotions that cause one to be attracted to certain people (those in the in-group) but not to others (those in the out-group). Because this part of the brain works so quickly and efficiently, bias often results without the person being aware of it.

Information received by the brain also travels through the hippocampus. This part of the brain forms links between memories and quickly deciphers the meaning of incoming data. When incoming data matches a person’s stored memories and personal stories, the brain treats those stored memories as the “correct” ones. Outside of one’s conscious awareness, the brain seeks to reinforce just how right we are and, as a result, may cause us to make decisions based upon individual biases.

Other parts of the brain also play a part in unconscious bias. The left temporal lobe of the brain stores information about people and objects and is the place for social stereotyping. The brain’s frontal cortex is the area associated with empathy, reasoning, and forming impressions of others. The brain quickly processes and categorizes the vast amounts of information it receives and then tags that information with general descriptions that it can rapidly sort. Bias occurs when those categories are labeled as “good” or “bad” and those labels are applied to entire groups. While such categorizing helps the brain to make quick decisions about what is safe or not safe, this type of default wiring in the brain creates unconscious bias that is universal to everyone.

What Does this Mean for Me and My Organization? Most human decisions are made emotionally. The brain has an ingrained pattern of making decisions about others based on what feels safe, likeable, competent, and valuable. Bias toward what is similar drives decisions more than actual merit does. To compound this, unconscious processing in the brain governs the majority of the important decisions we make. The brain cannot make a decision and, in the same moment, notice whether that decision is biased. What this means is that, because we have brains, we are essentially all biased.

In addition to individual bias, unconscious bias also occurs at the organizational level. Collective unconscious patterns of behavior have great and often long-lasting influence over organizational decisions and cultural thinking and interaction. These types of patterns perpetuate old, negative norms and keep unhealthy behavior firmly rooted at the expense of the good of the organization and its employees.

While the impact of unconscious bias can be significant, there is some good news. By effectively educating leaders about unconscious bias, and by challenging their thought processes around the crafting of policies and their decisions and practices pertaining to recruitment, compensation, staff development, and the equitable promotion of all types of qualified individuals, an organization can make a more expedient and meaningful impact than with generic “check the box” activities, which often do very little to mitigate unconscious bias in the workplace.

Recognizing Unconscious Bias. There are more than 150 types of unconscious bias that are common to the workplace. Some of the types of unconscious bias that can impact an organization include:

  • Affinity Bias – Having the tendency to prefer or like those similar to oneself.
  • In-Group Bias – Perceiving those who are similar in a more positive way.
  • Halo Effect – Having the tendency to believe only good about someone because they are liked or letting someone’s positive qualities in one area influence the overall perception of that person.
  • Out-Group Bias – Perceiving those who are different in a more negative way.
  • Perception Bias – Having the tendency to form assumptions or stereotypes about certain groups thus making it impossible to make objective decisions about members of those groups.
  • Blind Spot – Identifying biases in others but not oneself.
  • Confirmation Bias – Having the tendency to seek information that confirms pre-existing beliefs or assumptions, or conversely to discount information that is incongruent with one’s assumptions.
  • Group Think – Having the tendency to try and fit into a particular group by either mimicking their behavior or holding back on sharing thoughts and opinions out of fear of potential exclusion.
  • Belief Bias – Having the tendency to decide whether an argument for something is strong or weak based upon whether one agrees with the conclusion of that argument.
  • Anchoring Bias – Having the tendency to rely heavily upon the first piece of information available rather than seeking out and fully evaluating multiple sources of information when making a decision.

Ways to Mitigate the Issue. The first step toward mitigating unconscious bias in the workplace is to increase awareness that the brain is programmed toward this tendency. Neuroscientist David Rock advises organizations to identify the various types of bias likely to be present in their workplace and then make a collective effort to overcome the negative impact of those biases. Along these same lines, it can be beneficial to conduct confidential employee surveys to determine specific issues involving hidden bias and unfairness that might exist within the organization. Implicit Association Tests (IATs), such as the one offered by Harvard, may also be used to unveil individual bias among leaders and increase their self-awareness.

Other ways to help mitigate unconscious bias include reviewing all aspects of the employment process such as applicant screening, interviewing, onboarding, performance evaluation, identifying high performers, mentoring, promotions, and terminations. By developing more robust processes for evaluating talent that include multi-trait and multi-method approaches, and then connecting assessment techniques with the decision making process, an organization is more likely to minimize bias while improving its talent management function.

It is also important to train leaders to become comfortable in coaching, providing feedback, and interacting more frequently with all staff under their area of responsibility, including those seen as less similar. To create high performing teams, leaders should strive to encourage collective input along with respectful debate, undertake rigorous evaluation of data, and seek holistic solutions. Leaders should also be more cognizant of their own biases, utilize disciplined thinking, and be open to multiple sources of information when making decisions.

Bringing it all Together. By gaining an understanding of how the human brain works, one can become more aware of the unconscious processes taking place in the brain when formulating opinions and making decisions. While bias exists in everyone, through concerted effort, the impact of unconscious bias can be diminished by increasing awareness and facilitating changes to thinking, behavior, and organizational practices. In doing so, leaders can increase productivity, create greater innovation, foster true inclusion, improve talent selection and management processes, and build healthier and more diverse workplace cultures which ultimately benefits everyone within the organization.

About the Author: Andrea Choate is a former top HR executive, professional writer, and specialized mindset coach to CEOs and executives. She is an advocate for people strategies that facilitate business growth and profitability, as well as for the value of an engaged, inclusive, and high-performing workplace.
