AI tools fail to reduce recruitment bias - study
BBC News recently published an article on a Cambridge study about the use of AI in job recruitment. Find the article (and study) here.
Similar to AI’s application in other fields such as criminal justice, its deployment in job recruiting aimed to remove the bias that comes with human evaluation of candidates, which can be swayed by a candidate’s gender or race. Using AI to analyze a photo or video of a candidate was thought to have the potential to remove that bias. However, as was seen when Amazon’s job-recruitment tool ended up disadvantaging female candidates (its model, trained on historical hiring data from a male-dominated field, learned to treat markers of gender as negative signals), AI can, in fact, propagate bias in these scenarios instead of evading it.
The Cambridge paper offers four main claims about AI-powered recruitment tools: first, that removing gender and race entirely from the consideration criteria may not have the beneficial outcome intended; second, that trying to make hiring more equitable with AI may be a misguided effort that upholds the inequality already in place; third, that the supposed neutrality of using AI doesn’t remove the power dynamic between the candidate and the hirer; and finally, that these tools instead shift their analysis to the words and bodies of the candidate when making decisions.
The last claim in particular raises some moral quandaries: AI analysis of speech and bodily movements through video and image processing can mirror pseudosciences like phrenology, drawing unfounded conclusions about a candidate from minor variations in their behavior. The dubiousness of decisions made by this technology has been demonstrated by making small adjustments to the contrast, brightness, or saturation of a given photo or video, and noting how the AI’s score changes as a result.
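The perturbation test described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the study's actual pipeline: the "image" is a toy list of grayscale pixel intensities, and `toy_candidate_score` is an invented stand-in for a real scoring model. The point is only the testing pattern itself: re-score the same candidate after a lighting-only change and check whether the score moves.

```python
# Hypothetical sketch of a perturbation test: adjust only the brightness of a
# "photo" and compare the model's score before and after. The image and the
# scoring function are both toy stand-ins, not the study's real model.

def adjust_brightness(pixels, factor):
    """Scale every pixel by `factor`, clamping to the 0-255 range."""
    return [min(255, max(0, round(p * factor))) for p in pixels]

def toy_candidate_score(pixels):
    """Invented stand-in for an AI scoring model: just the mean pixel
    intensity, mapped onto a 0-100 scale and rounded."""
    return round(sum(pixels) / len(pixels) / 255 * 100, 1)

photo = [120, 130, 125, 140, 110, 135]  # same candidate, same photo
baseline = toy_candidate_score(photo)
brighter = toy_candidate_score(adjust_brightness(photo, 1.15))

# A robust assessment of a person should be invariant to lighting; any gap
# between these two scores is the kind of red flag the researchers describe.
print(baseline, brighter)
```

In this toy example the score shifts with brightness alone, even though nothing about the candidate has changed, which is exactly the failure mode the adjustment experiment is designed to expose.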
Deploying this relatively new technology in a field with consequences as important as hiring carries real risks, including inconsistent rules, criteria, and training data. Moreover, simply stripping gender and race from hiring decisions has proven insufficient, and even futile, as a way of removing bias from recruitment.