In my role as NIH Chief Officer for Scientific Workforce Diversity, every week I encounter hundreds of individuals and groups in the massive enterprise that is biomedical research. These include students, postdocs, and faculty; top leadership at NIH and at universities; NIH staff at all levels, policymakers, and reporters – and that’s just a partial list. Truly, my job is to engage people with my vision that enhancing workforce diversity through scientific methods can and will drive innovation.
As you might imagine, there is little time for doing homework about all the individuals I meet – and this is true not just in my work life but in my personal experiences as well. In fact, it is true for all of us, is it not?
With each new encounter, within milliseconds we “size up” faces, clothes, language, and behaviors. Without even knowing it, we make snap judgments about the people we meet and draw conclusions about them. Where is she from? What does he think about me and about my research? What is her view on being the only female department chair at her institution? Did his parents go to college, and where? And so on.
To get answers to these questions quickly, we use visual cues, and our brains automatically combine these hints with prior knowledge to arrive at a reasonable answer. Importantly, we do this outside the realm of conscious awareness, as described in Timothy Wilson’s Strangers to Ourselves: Discovering the Adaptive Unconscious. These reactions come into play when we make important decisions, such as in job searches and interviews. Let me give you a few examples of research that informs this issue.
The first is a 2012 study showing how bias can influence hiring behavior. In this experiment, researchers asked science faculty at leading research universities (127 biology, chemistry, and physics professors – half women and half men) to vet application materials from students applying for a lab-manager position. Each faculty member received a student application bearing either a female or a male name (Jennifer or John). Except for the names, the applications were identical. The results were clear and should worry us all: both women and men faculty rated the male applicant as significantly more competent and hireable than the female applicant. Moreover, both women and men faculty suggested a higher starting salary and expressed more willingness to mentor the male applicant than the female applicant with identical credentials!
This study tells me that implicit bias, or unintentional use of cultural stereotypes to inform our thinking, feeling, and behavior, is one among many culprits of social imbalances in scientific fields. Most science faculty and scientists today, including myself, would not consciously believe that men and women have different abilities to do science (although they may use different approaches). Yet, many of us are exposed to cultural images or narratives that say science is something more strongly associated with men or masculinity (see my previous blog for more on this).
Another study demonstrates this phenomenon.
Published in 2015, this study involved two groups of students who were shown a collage of 80 photos of real-life tenured/tenure-track STEM faculty (half women and half men). One group was asked to rate how feminine or masculine the people in the photos looked; the other was asked to rate how likely each person was to be a scientist or an early-childhood educator. The two groups’ first impressions diverged in a telling way: the more feminine a woman’s photo appeared, the less likely both female and male observers were to consider her a scientist – and the more likely they were to assume she was a teacher. No such relationship held for the male-faculty photos: regardless of a man’s perceived masculinity, both women and men were just as likely to see him as a scientist or as an early-childhood educator.
Associations like these become so habitual that we don’t even think much about them. Yet they reside in our minds, ready to be deployed when we need to make quick sense of people and social situations – and they have real-life consequences. We know that from an early age most of us picture scientists as men, and that the effects last well into adulthood. We also know that implicit bias is a deterrent for women in science.
Is there anything we can do to change biases and stereotypes? Yes – studies demonstrate the value of implicit-bias education, paired with suggested behaviors to mitigate its effect.
My own past research, conducted at Stanford University before I joined NIH, is one of these studies. We asked this question: Will an educational bias intervention change faculty perceptions of gender and leadership in medicine?
The study involved 281 faculty from 13 different clinical departments, such as radiology, surgery, pathology, and oncology. We developed a 20-minute educational presentation that highlighted research on implicit bias and provided strategies for overcoming such biases. Both before and after the educational intervention, we measured the faculty’s implicit associations between women and leadership using the Implicit Association Test, or IAT. Our findings suggested that education paired with bias-mitigating strategies can reduce implicit bias among medical faculty.
Since moving to NIH, my team and I have been exploring how to help scientists recognize potential bias and keep it from affecting hiring and other decision-making, so that we can recruit, hire, and retain the diverse talent we need for biomedical research to flourish. Like the rest of the nation, the NIH intramural scientific workforce lacks diversity, and our efforts go toward trying to change that.
NIH social scientists Anna Han and Janetta Lun have developed customized educational modules for search committees hiring research scientists into top NIH labs. We began this work in 2015 and have already provided bias education to about 300 investigators. We are still analyzing results, but it appears that the module reduces perceptions of gender bias. Our next goal is to see effects on hiring behavior – do more women and people of color end up on “short lists”? Do they get call-backs? Do they come to work at NIH?
Education by itself is not enough to change the culture that is the ultimate source of implicit bias. To do so, we need to give people tools to identify bias and ways to respond to it. One exciting body of work with practical implications is “Bias Interrupters,” developed by Joan Williams at the Center for WorkLife Law at UC Hastings Law School in San Francisco (check out her blog). This approach proposes concrete steps that individuals and organizations can use to stop bias in its tracks, focusing on recognizing problem patterns and offering effective behavioral responses.
Yes, bias exists in all of us. Yes, we can recognize it and reduce its impact. It’s also important that we be scientific in our approach, collecting data on what is working and how we might tweak our tools to adapt to different situations. Staying data-driven will help to maximize the effectiveness of what we do – and it will also help translate successes beyond NIH.
I will close with a call to action: diversity – gender and otherwise – is not just a nice thing to have. Diversity of various types promotes innovation. It is not only a matter of representation or fairness; it fosters higher performance, business productivity, and scientific quality. Diversity’s necessary companion is inclusion – a feeling of belonging in a group, personal or professional – and I will explore this important topic in a future blog.
Until then, take a good look at your first impressions, and allow yourself to think again!