
How Startups Are Using Tech To Mitigate Workplace Bias


We all harbor biases — subconsciously, at least. We may automatically associate men with law enforcement work, for example, or women with children and family. In the workplace, these biases can affect managers' hiring and promotion decisions.

So when Pete Sinclair, who's chief of operations at the cybersecurity firm RedSeal, realized that — like many other Silicon Valley companies — his company had very few female engineers and few employees who weren't white, Chinese or Indian, he wanted to do something about it.

"I was trying to figure out, 'How do I expand my employment base to include those under-represented groups?' Because if we do appeal to those, we'll have more candidates to hire from," he says.

Sinclair figured the company was either turning off or turning down these candidates. So he turned to another software startup, called Unitive, which helps companies write job postings that attract a broader range of applicants, and helps structure job interviews to focus on specific qualifications and mitigate the effect of interviewers' biases.

Companies often err by using phrases like "fast-paced" and "work hard, play hard," which telegraph a mainstream male culture, says Unitive CEO Laura Mather. Instead, she encourages firms to use terms like "support" and "teamwork" in job descriptions, which tend to attract a more diverse set of candidates.

Such adjustments seem to have worked for RedSeal: Sinclair says job applications shot up 30 percent, and the company's percentage of women among three dozen engineers has doubled.

"Our last hire was a Middle Eastern woman who would've frankly, in the past, never applied for the job, much less gotten hired, just because she didn't fit the mold of people we hired," he says. "And she's turned out to be one of our top team members."

Sinclair says the motivation to diversify wasn't altruism. His company competes with Facebook and Google for talent, so it had to look off the beaten path and draw from a more diverse pool.

The idea that everyone makes automatic, subconscious associations about people is not new. But recently companies — especially tech firms — have been trying to reduce the impact of such biases in the workplace.

Unitive's Mather says companies realize groupthink is harmful to the bottom line.

"And research shows that getting different perspectives into your company makes your company more innovative, more profitable, more productive," Mather says. "All kinds of really great things happen when you stop making decisions based on how much you like the person's personality."

Unitive's software is based on social science research, including work by Anthony Greenwald, a psychologist at the University of Washington who developed the seminal Implicit Association Test in the 1990s. It measures how easy — or difficult — it is for the test-takers to associate words like "good" and "bad" with images of Caucasians or African Americans.

Greenwald has tested various word and race associations on himself. "I produced a result that could only be described as my having relatively strong association of white with pleasant and black with unpleasant," he says. "That was something I didn't know I had in my head, and that just grabbed me."

No matter how many times Greenwald took the test, or how he tried to game it, he couldn't get rid of that result. He was disturbed, and also fascinated. Research indicates that unconscious biases tend to stay constant, he says, making them very hard to address within organizations.

"People who are claiming that they can train away implicit biases," he adds, "are making those claims, I think, without evidence."

So rather than trying to get rid of biases, Greenwald and other experts advocate mitigating their effects. Companies could remove identifying information from resumes, for example, or conduct highly structured job interviews in which candidates are asked the same questions and scored on the same criteria.

And some organizations are trying such methods.

Gap Jumpers, for example, is a startup that helps companies vet tech talent through blind auditions, which test for skills relevant to the job. That allows companies to avoid asking for a resume, which might include clues to a person's race or gender, says Heidi Walker, a spokeswoman.

Plus, Walker says, "That allows the company to actually see how a candidate will develop and approach solutions on the job." And, she adds, half their applicants are women.

Still, unconscious biases can affect all sorts of workplace behavior and decision-making, so addressing them can be a challenge.

A year and a half ago, cloud-computing company VMware started training managers to identify their own unconscious biases, then began tracking the hiring, retention and promotion of women, who make up a fifth of its workforce. The company also analyzed whether biases had seeped into employee evaluations.

It's been an eye-opening process, says Betsy Sutter, VMware's chief people officer. "We have more work to do. A lot more work to do."

Copyright 2015 NPR.
