Part 4:  Strategies for mitigating and overriding implicit bias (spoiler: It's never-ending work!)

To mitigate or override our implicit biases, we have to develop a “chronic awareness of bias” (Devine et al., 2012) in our everyday lives. In this final article in this series, I offer a set of research-based strategies for a two-pronged approach:

1)    Everyday strategies to weaken our reliance on implicit biases based on stereotypes and other distorted information; and

2)    Decision-point strategies for mitigating biases when making important decisions.   

Everyday Strategies

You may remember this old riddle:

A father and his son are in a car accident. The father dies at the scene and the son is rushed to the hospital. At the hospital, the surgeon looks at the boy and says, "I can't operate on this boy, he is my son."  How can this be?

Back when I was growing up in the ’70s and ’80s, this riddle was a real headscratcher. Female surgeons were such a rarity that it was beyond the imagination of many Americans that the surgeon in this scenario could have been the boy’s mother.

Luckily, times have changed. So, I was certain that when I presented this same riddle to a group of millennials, the question would seem ridiculous. Indeed, they very quickly came up with the answer “The boy has two dads,” but it was only after some prodding that they landed on the “surgeon is the mother” response.

Turns out, I’m not the only one to retest this riddle. A professor at Boston University asked 197 psychology majors, as well as 107 youth aged 7-17, the same question (Barlow, 2014). While they also came up with the “two dads” response, along with some creative ideas involving robots, ghosts, and mix-ups, only 14% of the college students and 15% of the youth came up with the “surgeon is the mother” response.

One takeaway from this story is that even though we may be exposed to new ideas and images of people, some long-standing, broadly held stereotypes, images, and ideas are still deeply ingrained and can prevail. And so, the work of mitigating and overriding the impact of implicit biases is something we need to engage in every day, forevermore.

Strategy #1 – Increase awareness of your blind spots 

You need to learn to see what you don’t see. This is not as nonsensical as it may sound. Just like when we are driving a car, our blind spots are not invisible; they simply require that we make some adjustments before we can see them. Here are two tools to reveal some of your blind spots:

Take a few of the Harvard Implicit Association Tests. These online assessments are designed to measure attitudes and beliefs that may be lying below the surface of your awareness. Be forewarned, the results can be uncomfortable. Rather than let your defenses rise, sit with the discomfort and contemplate what it might mean.

Do a diversity audit. Take a closer look at who you know, who you like, who you trust. First, try the Trusted 10 activity, which asks you to assess who is in your inner circle in terms of identities that are similar to and/or different from yours. Next, do the same type of assessment with your social media circles. Who do you follow? Whose points of view show up on your feed? Are you exposed to perspectives and ideas from people of different races, genders, ages, industries, and geographies?

By taking a closer look at who you know and whose voices/perspectives you engage with, you can reveal who is NOT represented in your personal and social media circles. These are groups of people who may be in your blind spots because you don’t have specific, individualized, accurate information about them stored in your brain. And that means that when the TSA Agent in your mind is doing those instant assessments, they may draw upon stereotypes and other distorted information instead. But don’t worry, the next strategy will help you overcome this issue.  

Strategy #2 – Consciously input anti-bias information (aka brain training)

In the previous articles, I discussed how our TSA Agent (from the second article) and our Co-Pilot (from the third article) rely upon the most easily available information stored in the brain to make quick judgments and assessments. Although those two metaphorical characters serve different functions, they rely on a similar set of stored information for taking cognitive shortcuts (aka implicit biases). This highly accessible information is often based on stereotypes and other distorted bits of information, especially when there is a lack of specific, individualizing, humanizing information. Therefore, the logical solution is to increase the latter.

The tools I found in the research involve consciously inputting anti-bias information into your brain to counterbalance suboptimal data points (stereotypes, archetypes, distorted memories, etc.). Again, you won’t be able to erase biases and stereotypes, but you can insert more accurate information for conscious use. Think of these strategies as a daily regimen of brain-training. Pick one to do each day!  

Brain-training Activity #1: Increase intergroup contact. This strategy involves seeking opportunities to encounter and engage in positive interactions with people who are different from you. These positive interactions get stored in the brain and can be used to override negative implicit biases that may be triggered in various contexts (Pettigrew & Tropp, 2006). Of course, while the pandemic is still around, increasing your contact with other people may not be advisable. In the meantime, you can increase your exposure to different people via social media. And, when it is safe to do so, you can join a club, organization, committee, or volunteer group that will give you an opportunity to engage with different types of people. I joined Toastmasters for a while and really learned about and felt connected to a group of people entirely different from myself (in terms of age, race, professional background, and education), with the exception of our shared interest in enhancing our public speaking skills.

Brain-training Activity #2: Practice individuation. This strategy involves focusing on specific individual traits rather than group-based attributes to decrease your reliance on stereotypical ideas and images (Lebrecht et al., 2009). Of course, this is easier to do when we have the opportunity to meet and talk to new people, but there are individuation practices that you can do without actually talking to people. For example, when I go out in the neighborhood and encounter people of different races, or people with disabilities, or homeless people, I ask myself questions about them such as “I wonder if they like carrots?” or “I wonder if they have siblings?” Truth is, I’m not particularly interested in the answers, but I am pretty sure the practice of focusing on their individual characteristics distracts my brain from the stereotypical images and ideas it might normally be drawn to.

Brain-training Activity #3: Counter-stereotype exposure. This strategy helps create stored information that can override stereotypes. Simply keep your eye out for (or do internet searches for) counter-stereotype examples (e.g., female firefighters, male nurses) and make a mental space to store that information. Research also shows that even imagining counter-stereotype examples can create stored information that can be used in the next strategy (Blair et al., 2001).

Brain-training Activity #4: Stereotype replacement. This strategy involves recognizing when your TSA Agent or Co-Pilot seizes upon a stereotype. First, you need to name the stereotype, question where it came from, and think about the harm that comes from it. Then, you consciously replace the stereotype with more accurate information or a counter-stereotype (Blair et al., 2001).

Brain-training Activity #5: Perspective-taking. Once again, the best way to learn to see things from another person’s perspective is to really get to know them: their motivations, their histories, their experiences, their hopes and dreams. However, the last thing we want to do is put the burden of educating ourselves on people from marginalized groups. If you are actually friends with someone, you may learn that stuff naturally. Luckily, we all have access to the lived experiences of others through the curated media collections (on streaming services and at local libraries) by and about people of different races, ages, sexual orientations, gender identities, physical abilities, and so on. When you consume such media, take in the complexity of the characters and try to feel what they feel. This practice increases psychological closeness and helps mitigate reliance on stereotypes (Galinsky & Moskowitz, 2000). One poignant example that I keep coming back to is the book Just Mercy by Bryan Stevenson. I found myself sobbing and outraged at the injustices as I read about the lived experiences of people on death row. Now, when I hear about cases in the news, my fast-thinking Co-Pilot might go directly to “criminal,” but my slow-thinking mind wonders about their stories and the circumstances that led them to where they are now.

The above strategies provide you with some simple practices that you can work on each day, with the goal of increasing your awareness of implicit biases as well as increasing the anti-bias information stored in your brain for easier access by your TSA Agent and Co-Pilot.

Decision-Point Strategies (for mitigating implicit bias)  

We all make a myriad of decisions each day, and we rely on our fast-thinking Co-Pilot to help us navigate through them. However, because our Co-Pilot is prone to systematic errors, as discussed in the previous article, we need some strategies for avoiding those errors when it counts.

I will admit that I was a little disappointed when I got to the end of Kahneman’s book because, in the conclusion, he basically says that even after decades of research, his fast thinking (Co-Pilot) is just as prone to systematic errors as when he started. However, he has gotten better at overriding those biases when it counts. He explains,

The way to block errors that originate in System 1 [fast-thinking, Co-Pilot] is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2 [slow-thinking, Pilot in high gear] (Kahneman, 2011, p. 417). 

The strategies I provide in this section are based on Kahneman’s advice. 

Identify your daily decision-points.  

The first step is to increase your awareness of your daily decisions that could be susceptible to implicit biases. Do an inventory of your work-related tasks that involve big or small decisions or judgments about people, such as hiring, mentoring, recommending, providing information or services, evaluating, disciplining, and networking.

Learn to recognize danger zones! 

Not all of your decision points are going to be susceptible to implicit biases, so it’s important to identify work conditions that are ripe for biases to come into play.

Your Co-Pilot is more likely to make errors related to unconscious biases when you are pressed for time and need to complete tasks quickly. Sound familiar?

We are also at risk under conditions of stress such as conflict at work, issues going on at home, or environmental stress (noise, people’s emotions, or other distractions). 

Multi-tasking is another risk factor. When our Pilot is required to do several things at a time, she is highly reliant on the Co-Pilot, who is in overdrive in these instances and definitely prone to errors. 

Another time that our Co-Pilot may heavily rely on cognitive shortcuts is when we are doing some of our standard daily routines. By nature, routines are easy because we don’t have to think about them, we just do them. This is a time when our Pilot kicks back and our Co-Pilot takes over, maybe even shifts into auto-pilot. This can be problematic because our Pilot cannot detect errors when in low-effort mode. 

Many of these danger-zone conditions are part of our daily work existence. So, it is important to be aware of when we are making important decisions or judgments under such conditions.

Since awareness is always the first step, take a minute to review your decision-point inventory and identify which of those you might end up doing in an implicit bias danger zone.  

Slow down to engage slow thinking.  

The goal is to get our Pilot into high gear, which ironically means that we need to slow down our thinking (and doing). A helpful way to become skillful at slowing down is to regularly practice mindfulness. There are various definitions of mindfulness. At the most basic level, mindfulness is the practice of being attuned to both your internal and external experiences in the present moment. It entails quieting down and tuning in to your own thoughts, feelings, and bodily sensations. 

To help understand mindfulness, I find it helpful to look at related words. Some synonyms for mindful are alive, aware, cognizant, conscious, thoughtful, and knowledgeable. The importance of these concepts is really driven home when we think about the opposites of mindfulness: insensitive, oblivious, unaware, unconscious, unmindful, unwitting. These antonyms make it clear that a lack of mindfulness can really get us into trouble. Not only does mindfulness lead to a variety of positive mental and physical outcomes, being mindful is also a way to avoid the pitfalls of our unconscious biases. Even the simple act of taking a deep breath and exhaling slowly is a mindfulness practice. It brings oxygen to the brain, which is like caffeine for our Pilot, so it is very helpful in getting her ready for high gear.

Get your Pilot into high gear. 

Our Pilot operates in the neocortex area of the brain, where higher-level thinking takes place. But again, our Pilot is only active and vigilant when called upon to do so. Here I offer three critical thinking questions that require your Pilot’s attention and can help identify errors that your Co-Pilot may be making.

1)    What are you thinking? Try to identify the stereotypes, assumptions, implicit associations, preferences, and logic that your Co-Pilot is using. Once they are named, your Pilot can spot potential flaws.

2)    What are you missing? To help avoid confirmation bias, seek information that disconfirms your initial thoughts, feelings, and judgments. It can also be helpful to get another person’s perspective.

3)    What other relevant information do you have stored in your brain? Remember that your Co-Pilot will go to the most readily available information stored in your brain, but if you’ve done your brain-training, chances are you have other information available; you just need to get your Pilot to access it.

These questions can be helpful in hiring processes, evaluating work, deciding who to invite to speak at your event, judging someone’s behavior, and so on. 

Conclusion

Getting better at overriding biases takes ongoing effort. It means being mindful as often as you can. It means grappling with feelings of discomfort. It means intentionally exposing yourself to people and ideas that are different from what you are used to. But I promise you that the work and effort are worth it. Not only will your own life be enriched, but you will make better decisions, and ultimately you will contribute to a fairer, more equitable society.

References

Barlow, R. (2014, January 16). A riddle reveals depth of gender bias. BU Today. https://www.bu.edu/articles/2014/bu-research-riddle-reveals-the-depth-of-gender-bias/

Blair, I. V., Ma, J. E., & Lenton, A. P. (2001). Imagining stereotypes away: The moderation of implicit stereotypes through mental imagery. Journal of Personality and Social Psychology, 81(5), 828–841. https://doi.org/10.1037/0022-3514.81.5.828

Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of Experimental Social Psychology, 48(6), 1267–1278.

Galinsky, A. D., & Moskowitz, G. B. (2000). Perspective-taking: Decreasing stereotype expression, stereotype accessibility, and in-group favoritism. Journal of Personality and Social Psychology, 78(4), 708.

Lebrecht, S., Pierce, L. J., Tarr, M. J., & Tanaka, J. W. (2009). Perceptual other-race training reduces implicit racial bias. PLoS ONE, 4(1), e4215.

Pettigrew, T. F., & Tropp, L. R. (2006). A meta-analytic test of intergroup contact theory. Journal of Personality and Social Psychology, 90(5), 751.

More articles by Marlo Goldstein Hode, LL.M., PhD
