Wednesday, February 27, 2019

The (bias) problem with your AI matching tool

AI matching tools might be hurting you…

Are you using an AI matching tool as part of your recruitment process? That’s fine, so long as you think résumés predict performance.

The proliferation of such tools is the natural consequence of the evolution of the digital recruiting landscape. Recruiters want to find the right people faster, so talent acquisition technology vendors have taken the manual process of sifting through résumés and automated it. Instead of reading each application, clever software can read them for you and identify key phrases that may indicate a candidate is suitable for the job. The software then suggests the candidates who are likely to be the best fit.
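To make that concrete, here is a minimal sketch of what keyword-driven résumé matching can boil down to. The keyword list, weights, and résumé snippets are invented for illustration; real vendor tools are more sophisticated, but many reduce to some version of this pattern.

```python
# Minimal sketch of keyword-based résumé matching, for illustration only.
# The keyword list, weights, and résumé text below are hypothetical.

JOB_KEYWORDS = {
    "python": 3,
    "machine learning": 3,
    "project management": 2,
    "stakeholder": 1,
}

def score_resume(resume_text: str) -> int:
    """Count weighted keyword hits in the raw résumé text."""
    text = resume_text.lower()
    return sum(weight for phrase, weight in JOB_KEYWORDS.items() if phrase in text)

def rank_candidates(resumes: dict) -> list:
    """Return candidates ordered by keyword score, highest first."""
    scored = [(name, score_resume(text)) for name, text in resumes.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    candidates = {
        "A": "Led machine learning projects in Python for five years.",
        "B": "Built and shipped the same systems, but described them differently.",
    }
    print(rank_candidates(candidates))  # A outranks B purely on phrasing
```

Notice that candidate B does the same work as candidate A but scores zero, simply because the résumé uses different words. That is exactly the "game the system" problem described below: the ranking rewards the phrasing, not the person.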

Let’s stop and think about that for a minute.

No matter how efficient this process is, no matter how fast, it’s still an automated version of résumé screening. It’s taking statements candidates make about themselves – about how qualified they claim to be – and using them to decide whether they can do the job. Moreover, it’s focused on a candidate’s pedigree and experience, not their skills or attitude. It’s basically résumé screening on steroids. It’s taking the traditional hiring process, which we already know is flawed, and making it more efficient. But not more effective.

Candidates who look good on paper, or know how to game the system, are prioritized, while candidates who may not necessarily stand out on paper but are in fact qualified miss out.

Don’t make the same mistake as Amazon

But it gets worse. In addition to missing out on the best people, AI matching tools can do some serious damage, including to your brand, because they can introduce nasty bias into your process. It’s now widely known that Amazon scrapped its AI recruiting tool because it discriminated against women. It downgraded candidates from all-women’s colleges, to name one example. Really?

I’m not criticizing Amazon. At least they made the right decision once they found out.

The problem with all these methods is not artificial intelligence. Rather, it’s the data that is used to feed the algorithms. Matching candidates to jobs based on their background, or claims they make, is a flawed endeavour. It doesn’t matter how good the algorithms are.
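A minimal sketch of why the data matters: if the historical hiring decisions fed to a model already disfavour one group, even a trivially simple learner will reproduce that preference. The dataset and the feature below (a résumé mentioning a women's college) are entirely fabricated for demonstration; they are not Amazon's data or anyone else's.

```python
# Illustration of "the problem is the data": a trivially simple learner
# trained on biased historical hiring decisions reproduces the bias.
# The records below are fabricated for demonstration purposes.

# (resume_mentions_womens_college, was_hired) — a made-up history in which
# past recruiters hired candidates from women's colleges far less often.
history = [(True, False)] * 45 + [(True, True)] * 5 + \
          [(False, True)] * 30 + [(False, False)] * 20

def hire_rate(records, flag):
    """Fraction of past candidates with the given flag who were hired."""
    matching = [hired for mentions, hired in records if mentions == flag]
    return sum(matching) / len(matching)

# A "model" that simply predicts each group's historical hire rate.
print(f"Predicted score, mentions women's college: {hire_rate(history, True):.0%}")
print(f"Predicted score, no mention:               {hire_rate(history, False):.0%}")
# Output: 10% vs 60% — the pattern in the data becomes the pattern in the tool.
```

No amount of algorithmic cleverness fixes this on its own: a better optimizer just fits the same skewed history more faithfully.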

Bas van de Haterd agrees, saying that “[t]here is no problem with selecting people using the help of AI. You just need the right data to select people on”.

According to van de Haterd, “[a] CV tells me what you have done, for whom, and how long. It leaves out the two best predictors of future success: how well did you do, and under what circumstances did you do well?”

So ask yourself this: is your recruitment software genuinely offering you a better way to identify talent, or is it merely digitizing the traditional hiring process? If you believe that résumés aren’t a good proxy for performance, then you shouldn’t use tools that rely on those very same résumés to suggest who to hire. You should demand more of technology.

You deserve better

The problem with the traditional hiring process obviously doesn’t end with résumés. Job interviews aren’t a whole lot better than chance either. Just ask the BBC.

We’ve written extensively about the alternatives to traditional interviews.

Your choice of technology should be driven by what you believe the recruitment process should be like. It should reflect your values. Don’t settle for something that just offers an efficiency upgrade on the existing process.

To read more about how we use artificial intelligence, and why, read this.
