Fighting implicit bias in STEM with increased cognitive control
The brain carries out many processes automatically and without our conscious recognition. This means that when we encounter certain information — like the name on a resume suggesting a specific gender or race — we make an immediate and unintentional judgment. At the Building 68 Department of Biology retreat on June 14, keynote speaker Lydia Villa-Komaroff PhD ’75 explained the physiological roots of this implicit bias and offered potential solutions.
Villa-Komaroff is a biologist and businesswoman advocating for diversity in STEM. When she received her PhD from the Department of Biology in 1975, she was one of the first Mexican American women to receive a doctorate in the sciences. She served as the chief operating officer and vice president of research for MIT’s Whitehead Institute for Biomedical Research for two years, and later founded her own one-woman consulting firm, Intersections, SBD. She is a board member, former CEO, and former chief science officer of the biotech company Cytonome/ST, LLC, and a member of the Biology Department Visiting Committee. She is also a co-founding member of the Society for the Advancement of Chicanos/Hispanics and Native Americans in Science (SACNAS).
According to Villa-Komaroff, it’s not that STEM fields are completely without diversity. Rather, there are fewer members of underrepresented groups in positions of academic power relative to their peer populations. Women and underrepresented minorities tend to hold instructor roles or assistant professorships, and are less likely to become full professors, deans, and presidents.
“There has been some progress,” she said, “since the proportion of women and underrepresented groups has climbed. Women have climbed at a faster rate than have individuals from underrepresented ethnic groups, but the rate of increase in both of those groups is still slow relative to the changing population. Clearly something is going on in our society, and it has been going on for a very long time, longer than any of us have been around. So what might that be?”
Data are amassing not only from sociologists and psychologists, but from neuroscientists as well, Villa-Komaroff pointed out. Research has shown that humans are wired to make quick decisions that serve us well most of the time, but these inclinations can also cause us to misjudge the abilities of the person before us.
Since the brain is constantly confronted with a deluge of information, over the course of time it developed two systems to sift through all the input. System 1 is automatic: It’s running all the time, requires very little energy, and is crucial to our survival — permitting us to recognize danger and possible threats in a split second. It also allows us to complete habitual tasks, like playing the violin or holding a pipette, with very little conscious effort.
System 2 begets what we generally consider to be “thinking.” It is deliberate and requires a lot of energy to run. Often without our conscious awareness, System 1 overtakes System 2 and our decisions are driven by our instincts. Villa-Komaroff said we need to fight this tendency to “trust our instincts” when it comes time to select colleagues or students. It’s not simply about activating your thinking, it’s about challenging it.
“I’m sorry to say that we — that is, those of us in the hard sciences — have been the most resistant to thinking that this might be the case,” she said. “I can’t tell you how many times my colleagues have said to me, ‘This is not a problem for us because we care only about merit, and that is what we are basing our decisions upon.’ It’s true we care about merit, but that is not the factor on which we often base our initial decisions.”
In fact, it has been shown that science faculty presented with two applications for a lab manager position, identical except for the names “Jennifer” and “John,” will evaluate John as more competent, give him more money, and offer him more career mentorship. The kicker is that these implicit biases aren’t just limited to a particular segment of the population. Women often have biases against other women, and the same is true for members of underrepresented groups.
But hope is not lost, Villa-Komaroff said. We can do something to counter this tendency if we just teach ourselves to recognize our own biases and deliberately work to override them.
In one study, researchers noticed that the panels at the American Society for Microbiology General Meeting consisted primarily of males. The committees that selected them also happened to be predominantly male. The researchers presented these data to the selection committees, and gave them an explicit call to action: Do something about it. The next year, the number of female speakers increased, and the number of all-male session planning committees decreased.
In another study, researchers took 92 medicine, science, and engineering departments from the University of Wisconsin at Madison and divided them into matched control and test groups, where the test group was invited to enroll in a short, two-and-a-half-hour workshop on implicit bias. Despite the fact that, on average, just 25 percent of the faculty from the test group departments attended the session, afterwards they reported more self-initiated efforts to promote gender equity and better conflict resolution. Most notably, over the next several years the percentage of hires from underrepresented groups rose from 8 percent to 11 percent, while the controls saw a decrease from 10 percent to 5 percent.
As part of the Strategies and Tactics to Increase Diversity and Excellence (STRIDE) program at the University of Michigan, full professors must now attend workshops on implicit bias in the fall during peak faculty recruitment season. Between 2001 and 2007, the percentage of faculty searches resulting in a female hire in STEM disciplines rose from 15 percent to 32 percent. If nothing else, these kinds of interventions may kick in the second, deliberate decision-making system, and allow us to see past the name on the resume.