A startup claims to have finally figured out how to get rid of bias in hiring with brain games and artificial intelligence
- An AI startup called Pymetrics creates neuroscience-based games that replace the first step of the corporate hiring process: scanning resumes.
- The games surface each applicant's inherent traits, like having a good memory or aversion to risk.
- Though Pymetrics says its software can help rid hiring of bias, the technology is still new and experimental.
Think about your place of employment right now. Your family's background and your identity likely helped you get there.
You might have been lucky enough to have grown up in a good school district and gone to a university with a robust alumni network that led to job connections. You might've also had parents who could pay for a semester abroad or housing during an unpaid internship - things that look great on a resume.

These advantages give people a leg up in their careers, regardless of individual work ethic or talent. That may be why a large body of research shows the hiring process is biased.
A tech startup called Pymetrics uses brain games and artificial intelligence in an attempt to rid the hiring process of unconscious biases, including classism, racism, sexism, and ageism. CEO Frida Polli told Business Insider that Pymetrics' algorithms do not account for the name of a candidate's school, employee referrals, gender, or ethnicity. Instead, they measure 70 inherent cognitive and emotional traits, including attention to detail, ability to focus, risk-taking, and memory.
In 2013, Pymetrics launched software that automates the first step of the recruiting process: scanning resumes. On September 20, the company announced it had raised $8 million, bringing its total funding to $17 million.
In the fall, with a grant from The Rockefeller Foundation, Pymetrics will launch a program to match disadvantaged young adults, ages 18 to 24, with companies nationwide.
How corporations use Pymetrics
Right now, Pymetrics works with 40 to 50 companies, including big names like Unilever and Accenture. Most of the companies are large, because the software needs a lot of employee data to generate an accurate algorithm.
To create an algorithm, between 100 and 150 of a company's top performers play a series of neuroscience-based games. The game that measures risk aversion, for instance, gives users three minutes to collect as much "money" as possible using this system: clicking "pump" inflates a balloon by 5 cents; at any point, the user can click "collect money." If the balloon pops, the user receives no money. The user is presented with new balloons until the timer runs out.

Here's a screenshot:
A cautious user who takes a small amount of money from each balloon is neither better nor worse than an adventurous user who takes each balloon to its limit. They just receive different types of scores.
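Pymetrics hasn't published the game's internals, but the mechanics described above can be sketched in a few lines of Python. The pop-threshold range, the number of balloons, and the function names are assumptions for illustration only:

```python
import random

def play_balloon(pumps, pop_threshold, cents_per_pump=5):
    """Cents collected from one balloon: each pump adds 5 cents, but a
    balloon pumped past its pop point pays nothing."""
    return pumps * cents_per_pump if pumps < pop_threshold else 0

def simulate_session(strategy_pumps, n_balloons=30, seed=0):
    """Total 'money' for a player who always pumps the same number of times;
    each balloon's pop point is random (the 1-20 range is an assumption)."""
    rng = random.Random(seed)
    return sum(
        play_balloon(strategy_pumps, rng.randint(1, 20))
        for _ in range(n_balloons)
    )
```

A cautious strategy (few pumps) earns a little per balloon but rarely loses one, while an aggressive strategy earns more per balloon but pops more of them - exactly the trade-off the game is built to measure.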
After top performers finish all 12 games, the company then creates a custom algorithm that reveals a trait profile for the ideal candidate.
When a candidate applies for a job, they are asked to play the same series of games. Recruiters can then see a candidate's results compared with benchmarks from the company's top-performing employees.
Those who receive scores closest to the ideal trait profile move on to the next round, which is usually an interview.
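Pymetrics hasn't disclosed how it scores candidates against the ideal trait profile. One simple way to picture the comparison is as a distance between trait vectors; the trait names, the 0-to-1 scale, and the use of Euclidean distance here are all illustrative assumptions:

```python
import math

def trait_distance(candidate, benchmark):
    """Euclidean distance between a candidate's trait scores and the ideal
    profile built from a company's top performers (lower = closer match)."""
    return math.sqrt(sum((candidate[t] - benchmark[t]) ** 2 for t in benchmark))

# Hypothetical trait scores on a 0-to-1 scale (real profiles span ~70 traits).
benchmark = {"risk_taking": 0.7, "memory": 0.8, "attention": 0.6}
candidates = {
    "alice": {"risk_taking": 0.65, "memory": 0.75, "attention": 0.60},
    "bob":   {"risk_taking": 0.20, "memory": 0.40, "attention": 0.90},
}

# Candidates closest to the benchmark move on to the interview round.
ranked = sorted(candidates, key=lambda name: trait_distance(candidates[name], benchmark))
```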
"What does the resume tell a company that's really that relevant?"
Polli said the goal for Pymetrics is to replace the act of looking at resumes, not human recruiters.
"In an entry level role, as a freshly graduated college kid, what does the resume tell a company that's really that relevant? I was an English major, and I became a neuroscientist. There's no direct line there," she said.

She added that the software reduces the chances of ethnic and gender discrimination, at least in the first round. Research has shown that white men have an advantage in the hiring process, especially for jobs in male-dominated fields.
These kinds of industries, including tech, law, and finance, also have a diversity problem. A 2014 analysis from USA Today, for example, found that black and Hispanic college students are graduating with computer engineering and science degrees at twice the rate tech companies are hiring them.
Polli admits that computers are just as likely to have gender and ethnic biases as humans, since humans program them.
"Let's take Fortune 500 CEOs. Less than 5% are women, and it's the same for ethnic representation. There are more guys named John than female [names] in this group. If you were to use that sample to predict who makes a good CEO, the name John would be really predictive," she said. "That's how bias gets introduced. Variables associated with a particular demographic group get picked up by the algorithms. And if you're not actively checking for that, you're going to perpetuate it."
To limit that kind of bias, Pymetrics adjusts its algorithm for each company. The startup creates a reference group of 10,000 people that have used Pymetrics. Unlike the new applicants, the company knows the genders and ethnicities of the reference group. If the team notices, for example, that men are receiving higher scores than women on a given trait, it will de-weight that trait in the software's model.
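Pymetrics hasn't published its de-weighting procedure. But the idea described above (checking a known-demographics reference group for trait-score gaps, then shrinking the weight of any trait that shows one) could look roughly like this sketch, where the tolerance and shrink factor are made-up parameters:

```python
def mean(xs):
    return sum(xs) / len(xs)

def deweight_biased_traits(weights, reference_scores, tolerance=0.1, factor=0.5):
    """For each trait, compare mean scores across demographic groups in the
    reference sample; if the gap exceeds the tolerance, shrink that trait's
    weight so it counts for less in the company's model."""
    adjusted = dict(weights)
    for trait, groups in reference_scores.items():
        group_means = [mean(scores) for scores in groups.values()]
        if max(group_means) - min(group_means) > tolerance:
            adjusted[trait] *= factor
    return adjusted

weights = {"risk_taking": 1.0, "memory": 1.0}
reference_scores = {
    "risk_taking": {"men": [0.8, 0.9], "women": [0.4, 0.5]},   # large gap
    "memory":      {"men": [0.7, 0.8], "women": [0.72, 0.78]}, # no real gap
}
adjusted = deweight_biased_traits(weights, reference_scores)
# risk_taking is de-weighted; memory keeps its full weight
```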
When Unilever began a hiring overhaul last year, it used Pymetrics and HireVue (which uses facial recognition to analyze candidates' recorded interview answers) for 250,000 applicants. Unilever told BI it hired its "most diverse class to date" in North America from July 2016 to June 2017. The company said there was a "significant" increase in non-white hires, though it wouldn't disclose specific statistics. Unilever hired a nearly equal number of men and women as well.
Does it work?
As others have noted, there are dangers in relying too much on data analytics in hiring. Cathy O'Neil, a mathematician, wrote an entire book on the subject, called "Weapons of Math Destruction." If a company's top-performing employees are mostly white, male, and young, basing an algorithm on their profile will likely make that algorithm biased toward candidates who look like the top employees, she wrote in the book.
Polli said that's why it's important to continually correct algorithms - which are designed by humans with biases - to keep that from happening.

As Mic notes, this kind of technology is still new and experimental. It's also only used at the first stage of the recruiting process. Even if a candidate makes it to an interview, a recruiter's unconscious bias could still affect their chances of getting the job.
Polli is optimistic that this technology could give less-privileged job candidates a more equal shot.
"Economics [are] a huge barrier to getting a good job, because you don't have the right school or the right internship. That shouldn't get in the way," she said. "We're trying to bring back the American Dream, in that everyone should have the opportunity to good jobs. It doesn't matter what your race or gender or socioeconomic background. We think that all those factors should become irrelevant."