How Startups Are Using Tech To Try And Fight Workplace Bias

Published at 2015-09-01 23:29:00

We all harbor biases — subconsciously, at least. We may automatically associate men with law enforcement work, for example, and women with children and family. In the workplace, these biases can affect managers' hiring and promotion decisions.
So when Pete Sinclair, chief of operations at the cybersecurity firm RedSeal, realized that — like many other Silicon Valley companies — his company had very few female engineers and few employees who weren't white, Chinese or Indian, he wanted to do something about it. "I was trying to figure out, 'How do I expand my employment base to include those under-represented groups?' Because if we do appeal to those, we'll have more candidates to hire from," he says.
Sinclair figured the company was either turning off or turning down these minorities, so he turned to another software startup called Unitive, which helps companies develop job postings that attract a range of candidates, and helps structure job interviews to focus on specific qualifications and mitigate the effect of interviewers' biases.
Companies often err by using phrases like "fast-paced" and "work hard, play hard," which telegraph "mainstream male," says Unitive CEO Laura Mather. Instead, she encourages firms to use terms like "support" and "teamwork," which tend to attract minorities, in job descriptions.
Such adjustments seem to have worked for RedSeal: Sinclair says job applications shot up 30 percent, and the percentage of women among the company's three dozen engineers has doubled.

"Our
last hire was a Middle Eastern woman who would've frankly,in the past, never applied for the job much less gotten hired, or just because she didn't fit the mold of people we hired," he says. "And she's turned out to be one of our top team members."Sinclair says the motivation to diversify wasn't altruism. His company competes with Facebook and Google for talent, so it had to look off the beaten path and draw from a more diverse pool.
The idea that everyone makes automatic, subconscious associations about people is not new. But recently companies — particularly tech firms — have been trying to reduce the impact of such biases in the workplace. Unitive's Mather says companies realize group-think is harmful to the bottom line.

And research shows that "getting different perspectives into your company makes your company more innovative, more profitable, more productive," Mather says. "All kinds of really great things happen when you stop making decisions based on how much you like the person's personality."

Unitive's software is based on social science research, including work by Anthony Greenwald, a psychologist at the University of Washington who developed the seminal implicit-association test in the 1990s. It measures how easy — or difficult — it is for test-takers to associate words like "good" and "bad" with images of Caucasians or African-Americans.
Greenwald has tested various word and race associations on himself. "I produced a result that could only be described as my having a relatively strong association of white with pleasant and black with unpleasant," he says. "That was something I didn't know I had in my head, and that just grabbed me."

No matter how many times Greenwald took the test, or how he tried to game it, he couldn't get rid of that result. He was disturbed, and also fascinated. Research indicates that unconscious biases tend to stay fixed, he says, making them very hard to address within organizations. "People who are claiming that they can train away implicit biases," he adds, are "making those claims, I think, without evidence."

So rather than trying to get rid of bias, Greenwald and other experts advocate instead mitigating its effect. Companies could remove identifying information from resumes, for example, and conduct very structured job interviews where candidates are asked the same questions and scored on the same criteria.
Some organizations are trying such methods.
Gap Jumpers, for example, is a startup that helps companies vet tech talent through blind auditions, which test for skills relevant to the job. That allows companies to avoid asking for a resume, which might include clues to a person's race or gender, says Heidi Walker, a spokeswoman.
Plus, Walker says, "That allows the company to actually see how a candidate will approach and develop solutions on the job." And, she adds, half their applicants are women.
Still, unconscious biases can affect all sorts of workplace behavior and decision-making, so addressing them can be a challenge.
A year and a half ago, cloud-computing company VMware started training managers to identify their own unconscious biases, then started tracking their hiring, retention and promotion of women, who make up a fifth of the workforce. They also analyzed whether biases had seeped into employee evaluations.
It's been an eye-opening process, says Betsy Sutter, VMware's chief people officer. "We have more work to do. A lot more work to do."

Copyright 2015 NPR. To see more, visit http://www.npr.org/.

Source: wnyc.org
