Prejudice in the hiring process: is AI the help we need?
Many companies are making a concerted effort to tackle prejudice in the hiring process, including unconscious biases such as reacting to a person’s age, name or university. For example, we’ve seen anonymised CVs or ‘blind hiring’ processes where details such as the candidate’s name and university are kept secret during the CV screening stage.
Of course, these details will have to be revealed eventually, and by the time an offer is made it is likely the decision maker will be aware of the candidate’s personal details that were originally redacted. So, although this technique may help in the initial stages, the same potential problems will have to be overcome later in the hiring process.
A modern alternative recently thrown into the mix is an AI that can learn what characteristics professionals currently successful in the role display: how they speak, their facial expressions and the language they use. This has been employed by corporate giants such as Unilever in the initial stages of the application process and can be done remotely via phone or laptop. The claim from the developers of this software, HireVue, is that it allows more applicants to be screened without relying on CVs and can provide a more objective judgement free from human bias.
However, experts and campaigners have warned that an AI of this nature will inevitably have built-in biases from the data fed to it from current successful employees. This would not be surprising: if you’re trying to improve diversity in the workplace, using the current workforce as the AI’s model of success seems a flawed approach.
A key pillar of the AI’s appeal is its timesaving capability, with HireVue claiming one million interviews delivered worldwide every 90 days. This raises the question of whether the main selling point of the AI is to save time rather than to battle prejudice.
One way to combat this could be to ensure the AI has access to a genuinely wide and diverse range of data, and to audit its outcomes, to prevent a biased result.
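One concrete way to check for the biased outcomes described above is to audit a screening tool’s results by demographic group. A minimal sketch of such an audit follows, using the “four-fifths rule” from US employment guidance (a group’s selection rate should be at least 80% of the highest group’s rate); the data here is entirely hypothetical and for illustration only:

```python
from collections import Counter

def selection_rates(candidates):
    """Compute per-group selection rates from (group, selected) pairs."""
    totals, selected = Counter(), Counter()
    for group, was_selected in candidates:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def passes_four_fifths_rule(rates, threshold=0.8):
    """Flag potential disparate impact: every group's selection rate
    should be at least `threshold` times the highest group's rate."""
    best = max(rates.values())
    return all(rate >= threshold * best for rate in rates.values())

# Hypothetical screening outcomes: (demographic group, passed screening?)
outcomes = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]
rates = selection_rates(outcomes)
# Group A: 3/4 = 0.75; Group B: 1/4 = 0.25, which is below 0.8 * 0.75,
# so this hypothetical screening would be flagged for review.
```

An audit like this does not fix a biased model, but running it regularly on screening outcomes gives a simple early warning that the tool is filtering one group out at a disproportionate rate.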
The Human Factor
In truth, this is where people should step in with a human approach to tackling prejudice in the hiring process. People may have their own inherent biases, but we have the capacity for change and can use emotional intelligence to assess candidates from varied backgrounds rather than relying on established candidate archetypes.
A candidate’s education and background have historically been used to judge calibre and are an easy fall-back option for recruiters with little else to go on.
The key is to make sure your recruiter understands the rationale so they can change their thought process accordingly and pay close attention to how and why they opt candidates in or out of the process.
For example, if a recruiter is acutely aware of what is needed to be successful in the job, they can focus on that as opposed to more general elements such as a candidate’s name, where they went to school or their age. Additionally, if not specified, your recruiter will use examples of people already in your team to determine cultural fit and which backgrounds result in success. Much like the AI, this will likely lead to a reliance on existing archetypes that could negatively affect your company’s plans for increased diversity.
So, what do we do now?
AI will doubtless have its role to play in screening candidates, particularly with large corporates receiving tens of thousands of applicants. However, it is important we don’t lose the human element in the process. Unlike machines, we understand the importance of diversity in the workplace; we know why it matters and can be motivated to change it.
Diversity gives us strength and encourages new ways of thinking in teams, helping us innovate and improve. What we need to do now is make sure we keep that goal in mind when screening candidates and understand what truly matters for someone to be successful: not their name, university, age or situation, but their knowledge, ideas, tenacity and passion.