Our biases and assumptions about others can be so automatic that they produce unintended thoughts that contradict our own beliefs. Even with the best of intentions, we all hold some form of bias as a result of socialization and cultural stereotypes. Our implicit biases are essentially bad habits that stem from cultural learning; they are a byproduct of our socialization, not a moral failing. If we are not aware of our biases, those habits can be activated and applied by default, even when they are undesirable and counteract our intentions.
The good news is that, like other bad habits, the habit of implicit bias can be broken, though doing so takes time and conscious attention. What differentiates those with lower prejudice is their unwillingness to apply stereotypes to a whole group. During the change process, an individual must “not only inhibit automatically activated information but also intentionally replace such activation with nonprejudiced ideas and responses” (Devine, 1989).
It can be difficult to correct our implicit biases because our assumptions often go unnoticed in everyday life. We don’t often receive feedback that confirms or dispels the assumptions we make about others. Our biases continue to live in our minds unless we unearth and intentionally confront them, asking how we know our assumptions are true.
Why is it important to be aware of implicit bias?
Regardless of how well-intentioned we are as instructors, implicit biases arise from automatic thoughts, and they can end up negatively impacting students and depriving them of opportunities and learning experiences.
In a 2012 study by Moss-Racusin et al., faculty members received identical resumes and application materials for a laboratory manager position, randomly assigned either a male or a female student name. The faculty members judged the female student to be less competent and less hireable than the identical male student and offered her a lower starting salary and less career mentoring. This bias was independent of the faculty member’s gender, scientific discipline, age, and tenure status, suggesting that the bias against the female student was “likely unintentional, generated from widespread cultural stereotypes rather than a conscious intention to harm women.”
Interestingly, the faculty members actually reported liking the female student more than the male student. However, this did not translate into similarly positive perceptions of her competence: faculty of both genders appeared to be influenced by cultural stereotypes about women’s competence in science, even though they bore the candidate no conscious ill will. Despite good intentions, the persistence of such biases toward any group can have detrimental effects.
A similar 2019 study by Eaton et al. experimentally manipulated the gender and racial identities on CVs for postdoctoral researcher positions, which STEM professors in biology and physics then reviewed. Consistent with the 2012 Moss-Racusin study, physics professors showed a gender bias, favoring male candidates, and a racial bias, perceiving Asian and White candidates as more competent than Black and Latinx candidates. Biology faculty did not exhibit a gender bias, which Eaton et al. theorized might be because biology is a more gender-balanced field than physics; they did, however, exhibit a racial bias, perceiving Asian candidates as more competent than Black candidates. The study also found compounded racial and gender biases: Black and Latina female candidates, as well as Latino male candidates, were rated as less hireable than other candidates.
Eaton et al. noted less bias when applicants had exceptionally strong records or when there were clear differences in quality between them; implicit bias is more likely to play a role when deciding between moderately and equally qualified candidates. The results of both studies exemplify the impact that implicit biases can have if left unchecked, as the faculty members did not consciously intend to be biased against these groups.
How to mitigate bias in the classroom
We can hypothesize how issues similar to those above might arise in the classroom when evaluating student performance on more subjective tasks (e.g., awarding points for class discussion, open-ended writing assignments, or projects) or when assigning final grades to a student who is close to a letter-grade threshold (A/B, B/C, etc.). Other areas where implicit bias can show up include group work and the allocation of resources, such as opportunities (e.g., undergraduate research) and your time. To help mitigate the impact of implicit bias in the classroom:
- Pay attention to who you mentor and who participates in class. Our biases can lead us to (unintentionally) respond differently to student comments or to call on certain students more than others. Bias can also distort our sense of participation: unless there is an objective way to measure who is participating, our memory may give us false accounts of which students participated the most or least.
- Set criteria in advance. Create rubrics to help reduce bias during grading, and share them with students when assignments are given. Grading against predetermined criteria is likely to be more objective, and a rubric gives you and your students a clear, shared understanding of what they did well and what they need to improve on an assignment.
- Structure time for making important decisions. It is difficult to stay vigilant about bias when you are stressed or tired. Make sure you are well rested before grading exams or giving feedback to students, and take breaks between grading multiple assignments. This helps you resist the temptation to make quick decisions, which is especially important for making objective decisions that affect others.
Ineffective strategies for reducing bias
- Stereotype suppression. This strategy involves trying to suppress a stereotype whenever it comes to mind. It is less effective than it seems because the more you try to suppress a thought, the more it intrudes. As a result, you may actually become more likely to view others through stereotypes precisely because you are constantly trying to suppress such thoughts.
- Colorblindness. This strategy involves ignoring aspects of another person, such as race or gender. Though the intention of treating someone “normally” may be good, the strategy is not very effective. It is not actually possible to ignore these characteristics, and many people take pride in these aspects of their background. Moreover, as with stereotype suppression, the more you think about treating someone “normally,” the more differently you end up acting toward them.
Research-based strategies to reduce bias
Breaking your bias habits takes practice over time. Effective, research-based bias-reduction strategies include:
- Stereotype replacement. While refraining from judgment, be attentive to patterns that manifest in your thinking. When you notice an assumption, pause and ask yourself: “How do I know that about this person? Is it from a stereotype I have internalized, or do I have evidence from their actual behavior?” By consciously surfacing and questioning your assumptions about others, you intentionally replace stereotypes with the individuating information unique to each person.
- Perspective-taking. Consider situational explanations: we tend to assume that an individual’s personal qualities or abilities cause their behavior, while paying less attention to aspects of the situation that may actually be responsible. For instance, if a student does poorly on a test, it is not necessarily because they are not smart or capable; they may have performed poorly for many reasons, such as lack of sleep, illness, personal problems, or too many exams in one week. If you catch yourself attributing an outcome to someone’s internal characteristics, check your assumptions: ask how you actually know this, and consider the possibility of a situational explanation.
- Commit to criteria. As mentioned in the previous section, it is helpful to evaluate all students or applicants against the same predetermined criteria and credentials. Research has found that bias is substantially less prominent when evaluators commit to criteria before conducting the evaluation. Committing in advance allows evaluators to hold each other accountable and leaves less room for the in-the-moment decisions that create opportunities for implicit bias and unintended automatic thoughts.
- Modify the environment. Evaluate what messages the environment sends about who belongs or succeeds, and seek to increase the representation of underrepresented groups. Increasing opportunities for genuine interaction with members of other groups, whether in the classroom or outside of it, can broaden your perspective and help you recognize individuating information about people.
- Speak up against bias. “Authority” figures hold considerable power and can act as effective allies, but non-authority figures can also speak up against bias and encourage the community to act supportively. Suppose, for example, that a female colleague proposes an idea at a meeting and the idea is later attributed to a male colleague who repeats it. To gently help the attendees recognize the misattribution, you could affirm, “Right, as Mary proposed earlier, I think that’s a great idea,” or otherwise point out the similarities between the ideas. Tone of voice also matters: convey an intent to understand or clarify rather than to ridicule or accuse, which could prompt defensiveness. Finally, citing concrete instances rather than making abstract accusations will make others more amenable to the explanations and solutions you offer.
To learn more about research-based strategies to reduce bias, check out the Breaking the Bias Habit learning bundle on Atlas (MIT Touchstone authentication required).
References
Devine, P. G. (1989). Stereotypes and prejudice: Their automatic and controlled components. Journal of Personality and Social Psychology, 56(1), 5-18.
Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of Experimental Social Psychology, 48(6), 1267-1278.
Eaton, A. A., Saunders, J. F., Jacobson, R. K., & West, K. (2019). How gender and race stereotypes impact the advancement of scholars in STEM: Professors’ biased evaluations of physics and biology post-doctoral candidates. Sex Roles, 82(3-4), 127-141.
Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences, 109(41), 16474-16479.
Shaw, Y., & Natisse, K. M. (Hosts). (2017, June 15). The Culture Inside [Audio podcast episode]. In Invisibilia. NPR. https://www.npr.org/programs/invisibilia/532950995/the-culture-inside.
MIT Teaching and Learning Lab. (2019, Oct 9). Drs. Devine & Cox: Empowering People to Break the Bias Habit [Video]. Available to the MIT community in the Atlas Learning Center: atlas.mit.edu.