Psychology has long highlighted the disconnect between what people say and what they do. “I became really interested in a particular kind of stereotyping and prejudice,” says Professor Trish Devine, recalling her experiences 35 years ago at Ohio State University. “Many people at that time were professing egalitarian values and non-prejudiced attitudes. Yet when you looked at their behaviors, they often did things that were inconsistent with their values – like lack of eye contact, interpersonal distance and other indicators. These are non-consciously mediated responses; they are automatic. We don’t think about them. We just do them.” So began her journey to understand why, and to help people change.
There was then, as there is now, a persistent disparity between humans’ perceptions of themselves as fair dealers and the enduring reality of unconscious bias. Yet Devine wasn’t prepared to simply dismiss human beings as an instinctive species controlled by subconscious prejudice and leave it at that. “My field discounted what people said about their values because these spontaneous behaviors showed bias,” she recalls. “This was basically the idea: ‘Don’t trust what people say – look at what they do.’ But that explanation didn’t square with what I saw people struggling with: that humans often have non-prejudiced values and yet remain vulnerable to treating others in biased ways through these spontaneous, automatic, unconscious responses.”
Devine brands this phenomenon unintentional bias. “For me, the important component of these automatic processes is that they are unintended,” she says. “They are often at odds with, and in conflict with, what people intend to do.” There is no need, she says, to “treat their having bias as a moral indictment of their character.” When people become aware of their biases and the conflict with their intentions – and are given guidance on how to address those biases – they act, she adds.
It is absurd, suggests Devine, to blame individuals for evolutionary processes. “It is not something nefarious,” she says. “Rather, it is quite ordinary in that the way each of us is socialized leads us to make these associations. They become very well-rehearsed, practiced, and spontaneously activated. The problem becomes the conflict between our intentions and the unintentional responses.”
Stereotyping is what the human brain does to fill gaps in its knowledge. The brain applies stereotypes to groups about which there are well-defined social associations. We develop stereotypes regarding race, age, gender, religion – or any group for whom we have strong associations learned from parents, peers and prevailing culture.
Devine frames unintentional bias as a “habit of the mind.” Force of habit causes us to do many undesirable things. We wish to rid ourselves of a bad habit, yet subconscious impulse maintains it. Recognition that those impulses exist – and are powerful – is the first step to overcoming them. “It is automatic, just like biting your nails or cracking your knuckles or whatever habit you may have,” Devine explains. “But once we start to think about it that way, then we can start to think about what’s involved in breaking habits.”
She draws on a childhood example. “Growing up, I bit my nails. My mom provided all the warnings: ‘You’ll get germs, it’s not good for your teeth.’” As Devine grew older, her mother’s warnings became starker. “I remember her saying: ‘If you carry on biting them, you won’t look professional.’” Devine recognized that her mother was right, and that her maternal concerns were well-founded. Yet regardless of how often Devine Sr. reminded her daughter of her bad habit, nothing much changed. “Then, at some point, I decided it was important to me – that I no longer wanted to bite my nails,” recalls Devine. “That’s the first motivational principle: you have to want to break the habit and the motivation cannot come from outside.”
Having found the inner drive to change, Devine began to monitor her behavior. “The second thing I had to do was become aware of when I bit my nails,” she recalls. “I became a detective of sorts. I scoured my experience and I figured out when it was that I was most vulnerable to biting my nails.”
As it transpired, Devine discovered that the impulse was at its strongest when she sat down to write, and her fingers were poised above the typewriter. “I needed to detect when I did it – otherwise you can’t disrupt the habit,” she says. The next step was to find an alternative action. “Once I disrupted the habit, that would enable me to not bite my nails, and do what I intended to do, which was write my papers,” says Devine. “So I started placing my fingers on the typewriter keys as a tactile reminder of what I was supposed to be doing. And, as anyone who has broken a habit knows, it takes effort. I had to work at it.”
Fingernail-biting is an everyday example that speaks to a serious dynamic in the human brain. “We do the same for habits of the mind,” says Devine. “Detect, reflect, and reject. But mental habits are harder to break because they are more challenging to detect.”
Strategies for change
Detecting and breaking those mental habits may be challenging, but it is possible with the right strategies and techniques. “I’ll give you an example,” she says. “There was a girl in one of our experiments, a young student, who caught herself making an assumption at a party that a black man who was at the party was on the football team. And, as she did it, she said to herself, ‘that’s such a stereotypical idea.’ One way we disrupt this type of habit is to ask ourselves, ‘how do I know?’ If you don’t know anything unique about a person, stereotypes fill in the gaps.”
The young woman used two tools to break those habits. “The right thing for the girl to do was go meet this guy. So she went and had a conversation with him, and it was a great conversation. She learned a lot about him, including that he likes football – because he likes football, not because he’s black.” She was using two simple tools: contact and, through conversation, individuation.
In the end, says Devine, the young woman was also using a third tool: perspective-taking. It means standing in the other person’s shoes, and it helps in three ways. First comes the recognition that it is not fair to make assumptions. Second comes the thought: ‘I wouldn’t want that to happen to me.’ Together, these lead people to care more about the experience of others, generating empathy – which produces the third effect: they slow down, detect more, and regulate the expression of bias.
Yet while the science to defeat bias exists, many professional countermeasures ignore it, warns Devine. Diversity training has been around for decades. It has, by some estimations, become a billion-dollar industry, but is it working? “I would say we haven’t made great progress,” says Devine. “It typically goes something like this: someone in a company reads some literature and it seems like a good idea; they implement the program but they don’t really evaluate it by systematically examining the impact of the training.
“In organizational settings where you’ve seen diversity training become really big business, when the impact has been evaluated, very often what the evidence suggests is that the training had no effect or it made things worse. The application of these kinds of training programs has gotten way ahead of the science.”
Diversity of people, diversity of thought and diversity of ideas are crucial to a company’s performance, yet diversity training is seemingly built on an evidence base made of sand. Devine is changing that, bringing the science back into a sector that is in critical need of it. “I didn’t set out to become a bias trainer,” she says. “I really wanted to understand what it was that people could do to learn to regulate the expression of their own biases. I developed the program (see box), and then, as a social scientist, I tested it in randomized controlled experiments where I could randomly assign people to receive the training or not.
“I then studied the outcomes that I thought should change if the training was effective and continued to make the program stronger. We measured if people were becoming more effective at both detecting and regulating the expression of bias. Were they more concerned about discrimination as a problem? Did they make an effort to address it?”
The results are clear. “In our training we find the answer is yes,” says Devine. “People who go through the training are more aware of their biases. They’re more concerned about discrimination. They are willing to put effort into addressing these issues.”
Devine recalls some work she did with a STEM (science, technology, engineering and math) faculty. The project focused on gender bias and concerns about women working in STEM disciplines: not feeling respected, not feeling like they belonged, and leaving the faculty at a much higher rate than their male counterparts.
“The under-representation of women and the fact that they don’t feel comfortable in those departments was viewed as a really important problem that implicit or unintentional biases may have contributed to,” says Devine. The US National Institutes of Health sponsored the work, in the hope that Devine’s bias-breaking workshops would be effective at reducing gender bias in STEM disciplines.
The workshops were delivered over several months at the University of Wisconsin. Devine made assessments three days later and three months later, with a final assessment two years later. “At three months – even in three days – we found things you might expect,” recalls Devine. “Those in intervention departments were more aware of gender bias, were more concerned about it, reported that they were taking steps to address it, and said that they had a sense of efficacy to address it.
“And so that’s all good. That’s all encouraging. But it was self-reporting. So what we wanted to do was to look at behavioral indicators that would be more convincing or provide at least additional evidence.” Devine’s team partnered with a group on campus that undertakes so-called ‘climate’ surveys at the university – assessments of behaviors and culture across the campus. “We were looking for the extent to which behaviors in the intervention group would be different compared to the control group – those departments that did not receive the intervention,” she says. “And what we discovered, in fact, was that intervention department members reported a better climate, overall, than control department members. And what I mean by that is that women and men in intervention departments reported that their work was better respected compared to control departments. They felt like they belonged better in those departments than their counterparts in control departments.
“And they could raise issues that – though historically gendered and stigmatizing – were felt to be less stigmatizing. Things like having to leave early to pick up children from school or to care for an elderly parent. You know, you think historically that was, quote, ‘women’s work’ – that women would take care of those things because they’re caring, sensitive and attentive to the needs of others, as the stereotype holds.”
The risk and reward of taking on family responsibility had been shared inequitably. Women had been stigmatized for taking ‘time off’ for childcare or eldercare, and had received little credit from their peers for demonstrating care for their families. By contrast, when men requested leave to meet their family responsibilities, they were both seen as committed to their careers and championed for being family oriented. That changed in Devine’s intervention departments. “The asymmetry seemed to dissipate,” she recalls. “Family was seen as an important activity for everyone. So that was encouraging.”
Yet a deeper dive unearthed more treasures. “Here’s where it gets really interesting!” says Devine. She partnered with her HR department to look at gender representation in intervention and control departments in the two years before the study and in the two years after it. The project examined outcomes for women in so-called ‘tenure-track’ positions – long-term professorial hires that chart a clear path to the most senior positions.
The findings were breathtaking. Devine discovered that both the intervention and control departments hired women at equal rates in the two years before intervention – at a rate of 32-33%. “When you look two years after, what you find is that there is absolutely no change for the control departments,” says Devine. And for the intervention departments? “It jumped up to 47%! Which is a pretty dramatic increase and getting close to gender parity there.”
Devine’s science-based bias approach is rare: it actually changes things. Devine smiles: “It can work,” she says, “if it’s done right.” For her next act, Devine is working with children – a new frontier, with an opportunity to shape better mental habits at a young age.
Michael Canning is global managing director of innovation and new commercial models at Duke Corporate Education and editorial chair of Dialogue.