Candidate matching: will AI transform how we hire?


Shortlisting qualified applicants for roles is always time-consuming. AI promises to match candidates to roles with greater speed and efficacy than humans do.

Unless you have the luxury of recruiting for a niche role that has attracted just a few, highly qualified candidates, coming up with a shortlist can be a time-consuming task. Recruiters in organisations undertaking high-volume hiring campaigns – for example, cohorts of graduates or staffing a new contact centre facility – face a mountain of potentially thousands of applicants, and unsurprisingly have turned to technology to help them sort the good candidates from the unsuitable ones.

Candidate matching has become increasingly sophisticated in recent years and is one of the most common applications of AI in HR and recruitment. Dozens of software companies have emerged to sell tools for every step in the hiring process – from tech that matches job postings with likely candidates to tools that scan applications. There are also tools that can pick up keywords in CVs or applications, and those that ‘score’ candidates against a set of criteria for the role or deselect those who don’t have the required qualifications. All claim to do in minutes what would take days of working through applications manually.
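To make the mechanics concrete, the simplest of these tools boil down to weighted keyword matching. The sketch below is purely illustrative – the keywords, weights and threshold are invented, and commercial systems are considerably more elaborate – but it shows how a CV can be scored and shortlisted without a human ever reading it.

```python
# Minimal sketch of keyword-based CV screening, for illustration only.
# Real applicant tracking systems use far richer models; the keywords,
# weights and threshold below are invented examples.

ROLE_KEYWORDS = {
    "python": 3,          # weight reflects how important the skill is to the role
    "sql": 2,
    "stakeholder": 1,
    "agile": 1,
}
SHORTLIST_THRESHOLD = 4   # arbitrary cut-off for this example


def score_cv(cv_text: str) -> int:
    """Score a CV by summing the weights of role keywords it mentions."""
    text = cv_text.lower()
    return sum(weight for keyword, weight in ROLE_KEYWORDS.items() if keyword in text)


def shortlist(cvs: dict[str, str]) -> list[str]:
    """Return candidate names whose CVs score at or above the threshold."""
    return [name for name, cv in cvs.items() if score_cv(cv) >= SHORTLIST_THRESHOLD]


applicants = {
    "Asha": "Python and SQL developer, worked in agile teams",
    "Ben": "Retail manager with strong stakeholder skills",
}
print(shortlist(applicants))  # ['Asha'] – Ben is filtered out on keywords alone
```

It also shows why the buzzword ‘gaming’ described below works: mentioning the right terms is enough to clear the bar, whether or not the underlying skill is real.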

Providers of such systems claim they can significantly reduce recruiter workload, cut time to hire and save organisations money, because technology can automate much of the sifting process. “While human judgement will likely remain more reliable, there are parts of the hiring process that can be automated. For hiring managers, the biggest bottleneck is having to sift through endless unsuitable CVs,” says Marja Verbon, co-founder of careers site Jump. Its preselection process checks candidates against a list of criteria before they apply. Only an estimated 12% meet those criteria, meaning screening time is potentially reduced by as much as 88%.
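The arithmetic behind that claim is straightforward. As a rough sketch – the knock-out criteria and pool size below are invented, not Jump’s actual checks – hard preselection filters the pool before any scoring happens:

```python
# Rough illustration of criteria-based preselection and the workload saving
# it implies. The criteria, pool size and pass rate are invented examples.

def meets_criteria(candidate: dict) -> bool:
    """Hard 'knock-out' checks applied before a recruiter sees the application."""
    return (
        candidate["years_experience"] >= 2
        and candidate["right_to_work"]
        and candidate["has_required_qualification"]
    )

applicants = 1000           # size of the raw applicant pool
pass_rate = 0.12            # the ~12% figure quoted above
to_review = int(applicants * pass_rate)

print(f"CVs a recruiter actually reviews: {to_review} of {applicants}")
print(f"Screening workload reduced by roughly {(1 - pass_rate):.0%}")
# -> 120 of 1000, a reduction of roughly 88%
```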

Sound too good to be true? AI’s growing role in matching candidates to jobs does have its limitations. At a basic level, most algorithms will search based on keywords in CVs – so hiring managers could be missing out on the bigger picture of what a candidate has to offer. “Candidate matching has got a lot better,” explains Neil Armstrong, commercial director at onboarding and ATS recruitment software company Tribepad. “But a CV is not necessarily a good way of understanding someone’s capabilities. You can get a view of someone’s experience but no insight into their personality or potential. And people have learnt to game the system, by putting in certain buzzwords that an AI search will pick up. They get found out later in the process because they don’t have the skills, and it’s a waste of time for them and the recruiter.”

One of the key concerns around the use of AI in candidate screening is its potential to introduce or reinforce bias. While on the one hand it can reduce humans’ unconscious biases by automating the sifting process, the algorithms on which this automation is based can themselves be problematic. Often, employers score applicants against historical datasets, meaning they could be stacking the odds against underrepresented groups, including women and black, Asian and minority ethnic candidates. Furthermore, certain keywords may work against certain groups: if you ask the algorithm to identify applicants who play a certain sport or have a particular behavioural attribute, this could inadvertently exclude a sizeable group of suitable candidates. Outside of recruitment, a biased risk-scoring algorithm used by US courts was shown to falsely flag black defendants as likely reoffenders at nearly twice the rate of their white counterparts.

Kim Nilsson, co-founder of Pivigo, a data science recruitment specialist, explains: “You would hope that as [algorithms] get more sophisticated that they will remove bias, rather than add to it, but the slightly scarier question is: as algorithms get more embedded or prevalent, how big a risk is bias to the process?” This is because algorithms learn from ‘training data’ based on past success, she adds. “In HR, a training dataset would be, for example, a set of CVs from previous [candidate] applications with labels of which ones had successful job offers. The problem here is that if you feed this sort of data to the algorithm, it may say that someone with a different profile from your current workforce (such as gender, nationality, or educational background) should be rejected because your previous ‘success’ cases do not have this diversity in it. You will perpetuate the biases that already exist in your workforce and data.”
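Nilsson’s point can be made concrete with a toy example. In the sketch below (all data invented), a naive model trained on past hiring outcomes simply replays the historical imbalance: applicants who resemble previous hires are favoured, regardless of ability.

```python
# Toy illustration of how an algorithm trained on past hiring decisions
# inherits the bias in those decisions. All data is invented.

# Historical applications: (group, was_hired). In this fictional history,
# hires overwhelmingly came from University X, whether or not that
# reflects actual ability.
history = [
    ("university_x", True), ("university_x", True), ("university_x", True),
    ("university_x", False),
    ("other", False), ("other", False), ("other", False), ("other", True),
]

def hire_rate(group: str) -> float:
    """Proportion of past applicants from `group` who were hired."""
    outcomes = [hired for g, hired in history if g == group]
    return sum(outcomes) / len(outcomes)

# A naive 'model' that scores new applicants by the historical hire rate
# of their group simply projects the old pattern onto the future.
for group in ("university_x", "other"):
    print(f"{group}: predicted chance of being shortlisted = {hire_rate(group):.0%}")
# university_x: 75%   other: 25% – the past imbalance becomes the future rule
```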

“Greater diversity in algorithm development teams could help prevent unintentional bias being built into system design,” says Nimmi Patel, policy manager for skills, talent and diversity at UK industry body techUK. “Having diverse teams – in terms of gender, ethnicity, experience, and background – will increase the likelihood of unconscious biases being recognised and addressed, rather than encoded into future algorithmic decision-making systems. This will, in turn, improve the quality of the decisions being made.”

Gareth Jones, CEO of Headstart, an AI system for graduate and early career recruitment, agrees that the data going into the system needs to be improved if AI is to deliver on its promises. “AI offers the opportunity to make hiring fairer, but we’re still a long way from hiring being data-driven,” he says. “Recruiters still end up looking at a CV or a profile. They may supplement that with assessment, but it’s not driven by what the success criteria are in that company. That should be the starting point – not a job description.”

That said, there are ways to make the most of AI by combining it with other tools to predict good hiring outcomes more accurately. Introducing assessment into the process can be a means of building a more rounded and objective view of the candidate. “These can be integrated into a company’s ATS at the start of the hiring process, after the initial candidate application,” explains Chris Platts, CEO and co-founder of ThriveMap, a pre-hire assessment company. ThriveMap’s assessments take candidates through a digital ‘day in the life’ experience of a job. “Candidates are automatically invited to complete their ‘virtual shift’ via email or text message and the results and interview reports are sent directly into the ATS,” he adds. “This real-world approach to talent assessment means that candidates can get a feel for the role and culture ahead of joining the company; they can even opt out of the hiring process if they feel it’s not right for them.”
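The article doesn’t describe ThriveMap’s integration in technical detail, but the general pattern – pushing a completed assessment result back into the ATS so it sits alongside the application – might look something like the generic sketch below. The endpoint, payload fields and workflow are hypothetical and do not describe any specific vendor’s API.

```python
# Generic sketch of an assessment-to-ATS hand-off. The endpoint, payload
# fields and workflow are hypothetical – they do not describe ThriveMap's
# or any other vendor's actual API.

import requests

ATS_WEBHOOK_URL = "https://ats.example.com/api/candidates/assessment-results"  # hypothetical


def push_assessment_result(candidate_id: str, score: int, report_url: str) -> None:
    """Send a completed 'virtual shift' result to the ATS so it appears
    alongside the candidate's application."""
    payload = {
        "candidate_id": candidate_id,
        "assessment_score": score,    # e.g. a 0-100 realistic-job-preview score
        "report_url": report_url,     # link to the full assessment report
        "status": "assessment_completed",
    }
    response = requests.post(ATS_WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()


# Example (commented out because the URL above is a placeholder):
# push_assessment_result("12345", 82, "https://assessments.example.com/reports/12345")
```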

Candidates deselecting themselves from the process is actually a positive outcome for hiring managers. It ensures those moving to the next stage identify with the organisation’s culture and are more likely to be productive and engaged if they’re successful in getting the job. Data and feedback provided by assessments during the hiring process, furthermore, can help demonstrate to eager but unsuccessful candidates why they did not meet all the criteria and how they could boost their skills when applying for another role – or even suggest a more suitable one at the same organisation. “Context-specific tests can show the role as it really is. So if you’re looking at a PA role, here’s an enquiry we’d like you to deal with, rather than asking for generic skills such as Excel,” adds Adrian McDonagh, co-founder of hireful, a company that supports recruiters to improve their hiring practices. “Then you can point to the assessment and show why you made the decision rather than just saying ‘we found a better candidate’.”

Richer assessments can also show recruiters applicants with a diversity of values and thinking styles. A 2013 report by Deloitte recommended organisations hire with ‘diversity of thought’ in mind in order to protect against groupthink and promote new ways of solving problems. One way of doing this is through game-based recruitment tasks or challenges, which can show not only whether candidates have job-related skills but also how they reach decisions – an important consideration when looking to build a diverse team. British intelligence agency GCHQ has run multiple campaigns – aimed at hiring people with cyber skills – which have challenged candidates to crack codes or decipher messages, in line with its requirement for applicants to ‘think like a hacker’.

Employers such as Lloyds Banking Group and Police Now, meanwhile, have used virtual reality challenges for graduate recruitment where candidates complete simulations and recruiters can see how they navigate them. Some game-based hiring challenges go as far as to offer a prize: Google’s Code Jam, where programmers compete to show off their coding skills, offers a reward of $15,000 for the winner (and a solid shortlist of candidates with proven coding skills for Google).

Knowing more about how the candidate operates in a realistic environment and whether they might make a good fit with the team makes the onboarding and settling-in process a lot smoother, adds Platts. “Identifying candidates in your applicant pool who align with your company’s desired behaviours, who can demonstrate the required role capabilities, and have a genuine commitment to want to do the job is difficult,” he says. “Making better hiring decisions means that candidates become productive more quickly. If assessments take candidates through a digital experience of the job, they’re less likely to be surprised by what the job involves when they start and leave prematurely.” Effective use of algorithms can also help organisations target their recruitment marketing more precisely.

Verbon adds: “On the professional side, job recommendations up until now have been extremely hit or miss. We’ve all had our fair share of utterly irrelevant job alert emails. What algorithms can help with is identifying which jobs would be a good match based on the professional’s CV, rather than just random keywords. These recommendations are much more accurate and can make job hunting a far less daunting task.”
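The text doesn’t say how such matching works under the hood, but one common way to move beyond single keywords is to compare the full text of a CV and a job advert as weighted term vectors (or, in newer systems, embeddings). The sketch below uses TF-IDF and cosine similarity as an assumed stand-in for whatever a given product actually does; the texts are invented.

```python
# Minimal sketch of matching CVs to a job advert by overall textual
# similarity rather than single keywords. TF-IDF vectors stand in for the
# richer representations real products use; all texts are invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job_advert = "Customer support agent handling billing enquiries by phone and email"
cvs = {
    "Asha": "Three years in a contact centre resolving billing and account queries",
    "Ben": "Warehouse operative experienced in stock control and forklift driving",
}

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([job_advert, *cvs.values()])

# Similarity of each CV (rows 1..n) to the job advert (row 0)
scores = cosine_similarity(matrix[0], matrix[1:]).flatten()
for name, score in sorted(zip(cvs, scores), key=lambda item: -item[1]):
    print(f"{name}: {score:.2f}")
# Asha scores higher because her CV shares vocabulary with the advert
# ('billing'), even though she never uses the exact job title.
```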

Ultimately though, AI and other automation tools should be used to support and augment what the human recruitment team does, rather than to replace it. “Used correctly, these systems can do wonders for improving diversity in an organisation, by promoting the applications of individuals who otherwise would not come through [the shortlisting process],” says Nilsson. “HR individuals will often scan hundreds of applications and will for obvious reasons not have time to study each application in detail.” They can also help to overcome the shortcomings of mechanisms such as referrals. “Recommendations may be a good indication that that individual could be a good hire, but it will miss out on stand-out talent who are not as well networked. So, algorithms can support the decision by highlighting applicants that would otherwise be hidden in the noise, without actually making the final decision,” she adds.

A start-up called BrightHire, backed by renowned organisational psychologist Adam Grant, offers to bridge this gap. Grant, a professor at the Wharton School of Business, wrote Originals: How Non-Conformists Move the World, which looks at why – in the workplace – it’s not always conventional qualities or actions that make people good at their job. BrightHire’s tool, rather than simply automating decisions about who makes the cut, provides context to ‘flesh-and-blood’ interviewers during and after video calls with candidates. The software offers an ‘interview assistant’ that keeps interviewers on track, displaying the predetermined questions they’ll need to ask each candidate. By asking a standard set of questions, the interviewer is less likely to make subjective evaluations based on their own biases, while BrightHire’s prompts make the process more efficient. Pre-hire assessments are becoming more sophisticated, too, including neuroscience-based tools that claim to predict whether a candidate is likely to be a good cultural and skills fit.
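The underlying idea – every candidate faces the same predetermined questions and is rated against the same rubric – can be sketched simply. This is a generic illustration of structured interviewing, not a description of BrightHire’s product.

```python
# Generic sketch of a structured interview: identical questions and a shared
# scoring rubric for every candidate. Illustrative only.

from dataclasses import dataclass, field


@dataclass
class InterviewPlan:
    role: str
    questions: list[str]
    # ratings keyed by candidate, one score (1-5) per question, in order
    scores: dict[str, list[int]] = field(default_factory=dict)

    def record(self, candidate: str, ratings: list[int]) -> None:
        if len(ratings) != len(self.questions):
            raise ValueError("Every candidate must be rated on every question")
        self.scores[candidate] = ratings

    def ranking(self) -> list[tuple[str, float]]:
        """Rank candidates by average rating across the fixed question set."""
        return sorted(
            ((name, sum(r) / len(r)) for name, r in self.scores.items()),
            key=lambda item: -item[1],
        )


plan = InterviewPlan(
    role="Customer support agent",
    questions=[
        "Describe a time you turned around an unhappy customer.",
        "How do you prioritise when several queries arrive at once?",
    ],
)
plan.record("Asha", [4, 5])
plan.record("Ben", [3, 4])
print(plan.ranking())  # [('Asha', 4.5), ('Ben', 3.5)]
```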

Faced with an overwhelming volume of applicants, the promise of speeding up the matching process is an attractive one – whether at the start of the candidate search or when you’re inviting people to interview. A dizzying array of technology is available to support HR and recruitment teams in finding, matching and shortlisting candidates – and while this can’t replace the human touch, it can make the process more evidence-based, efficient, and cost-effective.

Candidate matching: five key takeaways

  1. Recruiters can now choose from a huge selection of tools that can support candidate matching and selection
  2. AI and algorithms can save time and money for hiring managers, but should augment human decisions rather than replace them
  3. Be aware of the potential to build bias into recruitment algorithms
  4. Technology can build a more reliable picture of a candidate’s suitability, increasing the likelihood of a worthwhile hire
  5. Establishing someone’s suitability for the day-to-day aspects of a role in the hiring stage can boost retention and productivity

 

This is an extract from Good Work, Great Technology: Enabling strategic success through digital tools, published by leading UK HR software provider Ciphr.