Using technology to combat bias in hiring
Stephanie Lampkin MBA ’13 created an app that takes unconscious bias out of the online hiring process. Credit: Blendoor

Top tech companies have a diversity problem. Reports show that Facebook and YouTube—and many companies like them—struggle to build a workforce that is reflective of the U.S. population. Why? Some say that the lack of diversity comes from a lack of qualified candidates. Stephanie Lampkin MBA '13 argues that's just not true. That's why she launched Blendoor, a blind job app combatting what she says is one root of the diversity problem—unconscious bias.

Unconscious bias refers to the stereotypes, personal experiences, and cultural exposure that people unknowingly rely on when making decisions. Lampkin says this is why hiring managers may be more impressed by candidates who resemble themselves, without even knowing it.

"The bias is something innate, but there are ways we can use tech to eliminate it," says Lampkin. "We're not relying on the same traditional ways that tend to bring in homogenous teams."

With Blendoor, hiring managers use the app to sort through a diverse candidate pool without identifiers that can trigger unconscious bias, such as names, photos, and job dates. Hiring managers see candidate profiles based on how well they match their needs and nothing more.
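
Blendoor has not published how it implements this, but the basic idea of blind screening is easy to sketch. The short Python example below is a hypothetical illustration, with made-up field names and a made-up redact_profile helper, of stripping identifying fields from a candidate profile before a reviewer sees it.

```python
# Hypothetical illustration of blind candidate screening; the field names and
# the redact_profile helper are assumptions, not Blendoor's actual code.

# Fields that can trigger unconscious bias and are hidden from reviewers.
IDENTIFYING_FIELDS = {"name", "photo_url", "graduation_year", "employment_dates"}

def redact_profile(profile: dict) -> dict:
    """Return a copy of a candidate profile with identifying fields removed."""
    return {k: v for k, v in profile.items() if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "photo_url": "https://example.com/jane.jpg",
    "graduation_year": 2013,
    "employment_dates": ["2013-2016", "2016-2021"],
    "skills": ["Python", "SQL", "product management"],
    "years_experience": 8,
}

print(redact_profile(candidate))
# {'skills': ['Python', 'SQL', 'product management'], 'years_experience': 8}
```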

Lampkin says the app matches companies with the most skilled candidates regardless of gender, age, or ethnicity. "It's a much bigger value proposition when you say diversity is a great byproduct of the app," says Lampkin.

Because identities of candidates can only be hidden for so long, Blendoor also tracks how candidates move through the interview process—noting when a candidate is eliminated or gets hired. The app then uses this information to better match candidates in the future and identify at what stage bias may have come into play.
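
As a hypothetical illustration rather than Blendoor's actual method, this kind of funnel tracking can be sketched as recording the last stage each candidate reached and comparing pass-through rates across groups; the stage names and sample data below are assumptions.

```python
# Hypothetical sketch of hiring-funnel tracking: record the last stage each
# candidate reached, then compare pass-through rates across groups to flag
# the stage where drop-off diverges. Stage names and data are illustrative.
from collections import defaultdict

STAGES = ["resume_review", "phone_screen", "onsite", "offer"]

def pass_rates_by_group(outcomes):
    """outcomes: list of (group, last_stage_reached) tuples."""
    reached = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for group, last_stage in outcomes:
        totals[group] += 1
        for stage in STAGES[: STAGES.index(last_stage) + 1]:
            reached[group][stage] += 1
    return {
        group: {stage: reached[group][stage] / totals[group] for stage in STAGES}
        for group in totals
    }

sample = [("A", "offer"), ("A", "onsite"), ("B", "phone_screen"), ("B", "resume_review")]
print(pass_rates_by_group(sample))
```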

In addition to the app, Blendoor offers BlendScore, a metric that ranks top companies based on diversity data, pay equity, and benefits like maternity and paternity leave. The metric serves as a tool for job seekers looking for diverse companies, but it also tells companies when they need to make changes. "Shortly after we released a BlendScore for Facebook, they reached out looking to improve it," says Lampkin. "They're a customer now."

The BlendScore relies on data shared by companies, packaging it in a way that is accessible. "We're hoping to be like the U.S. News and World Report for ethnicity and equality," she says. "The BlendScore shines the mirror back on the companies—it's all about transparency."
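
The article does not describe how BlendScore is actually computed. One common way to build such a metric is a weighted composite of normalized inputs; the sketch below is purely illustrative, and the weights, scales, and field names are assumptions, not the BlendScore methodology.

```python
# Hypothetical sketch of a composite company score built from the kinds of
# inputs the article mentions (workforce diversity data, pay equity, and
# leave benefits). Weights and field names are assumptions.

WEIGHTS = {"diversity": 0.4, "pay_equity": 0.4, "leave_benefits": 0.2}

def composite_score(metrics: dict) -> float:
    """Weighted average of normalized (0-1) metrics, scaled to 0-100."""
    return round(100 * sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS), 1)

company = {"diversity": 0.55, "pay_equity": 0.80, "leave_benefits": 0.70}
print(composite_score(company))  # 68.0
```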

Lampkin has personal experience with bias in her own startup journey. She notes that only a handful of black women have raised $1 million from investors—something she wants to change. "I want to pave the way so it's not so rare for a venture capitalist to be pitched by a black woman as it is now," says Lampkin. "That legacy is really important to me."

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

