Harvard psychologist Mahzarin Banaji coined the term "thumbprint of my culture." She and many others have studied implicit bias: biases we all hold and act on without conscious awareness. Our brains make decisions constantly and at speed, developing mental shortcuts along the way. These shortcuts become our biases.
Cognitive scientists have catalogued over a hundred biases. One way of grouping them, the SEEDS Model, stands for Similarity, Expedience, Experience, Distance, and Safety.
Similarity bias is often described as ingroup vs. outgroup: we perceive people who are similar to us more positively and those who are different more negatively.
Expedience bias is the sense that if something feels easy and familiar, it must be right. Confirmation bias is one example: we look for evidence that supports our beliefs and ignore evidence that contradicts them. The halo effect is another: we let someone's positive qualities in one area color our overall perception of them.
Experience bias is believing that things are the way they seem to us. The illusion of transparency is overestimating how accessible our mental state is to others; for instance, "I shouldn't have to say I was —, anyone could see it." The fundamental attribution error is believing that my mistakes are caused by external circumstances while your mistakes are your own fault: I tripped because the sidewalk was uneven, but when I saw you trip, I labelled you clumsy.
Distance bias is the sense that closer is better: the here and now wins over long-term investment, and instant gratification beats waiting for the prize.
Safety bias appears whenever there is a probability of risk. Because threat registers more strongly than reward, we work hardest to avoid negative outcomes. Sunk cost bias is the difficulty of giving something up after investing in it, even knowing the investment can't be recovered.
Because biases are unconscious, we can't simply stop using them, and confronting someone about their bias won't work. Matthew Lieberman, a social psychologist and neuroscientist at UCLA, explains that being right feels good and is associated with certainty and contentment, while being wrong activates regions of the brain associated with pain and produces negative emotion even when there are no material consequences.
Learning about biases and raising our awareness can help, but we do not have the ability to fix our biases outright. Teams, however, can lessen their impact. The work must be done before decisions are made: look at how critical decisions get made, identify which biases may be in play, and focus on changing the process to counteract their effects.
Something is working. Harvard has collected implicit bias data since 1998 and has seen sexuality (anti-gay) bias drop by 33% over the last twelve years. Race and skin-color bias is also dropping, though not as rapidly. Age, disability, and body-weight biases remain unchanged.
The first step is admitting you are biased. Whether or not you are ready to admit it, take Harvard's test at https://implicit.harvard.edu/implicit/takeatest.html. The site offers many tests; they are free and confidential, and by taking them you contribute to the research while becoming more self-aware.