Think there’s no bias in your hiring process? AI says think again

When Jahanzaib Ansari was looking for work in 2016, his resume was not the problem. Despite a CV boasting programming experience and a University of Toronto education, Ansari’s job search soon hit a dead end. At the suggestion of a friend, he changed his first name on his resume and saw almost immediate results.


“I wouldn’t hear back from employers until my [colleague] said, ‘Why don’t you just Anglicize it?’ I went with variations of Jason, Jordan, Jacob, and literally in four to six weeks, I got a job,” says the CEO of Knockri, a technology firm that created an artificial intelligence tool that aims to reduce bias in the hiring process. Ansari’s experience led him and some friends to start the Toronto-based AI company to help companies recognize their inherent blind spots when hiring new talent. “It’s an unconscious bias that has been built [into society] for several decades,” he says.

And it’s a type of discrimination in the hiring process that members of the African-American community, people with Jewish names and those, like him, with Muslim names have all experienced, Ansari says.

Related: How AI and HR Tech are taking on DE&I and inherent biases

As organizations increasingly address diversity, equity and inclusion in the aftermath of the widespread racial unrest of 2020, they have found that deficiencies in all three areas converge at a central point: the recruitment and hiring process when human beings are reviewing the qualifications of job applicants. To address this, recruitment and hiring leaders are beginning to adopt AI-powered solutions in hopes of reducing inherent biases when creating job postings, gathering and reviewing resumes, interviewing candidates and choosing the best candidate to hire.

These programs often scrub applications of information that may hint at an applicant’s gender, religion or ethnic background. Some also remove dates from resumes, such as college graduation years, so as not to reveal the job seeker’s age. Instead of reading resumes, the organization’s hiring team receives AI-generated reports summarizing applicants’ skill sets, previous job experience and educational background.
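The scrubbing step these tools perform can be pictured as a redaction pass over the resume text. The sketch below is purely illustrative and assumes a simple regex approach; commercial screening tools use far more sophisticated entity recognition, and the function name and patterns here are hypothetical, not any vendor’s actual implementation.

```python
import re

def redact_resume(text: str, applicant_name: str) -> str:
    """Illustrative sketch: strip fields that could hint at identity or age."""
    # Mask the applicant's name wherever it appears
    text = re.sub(re.escape(applicant_name), "[CANDIDATE]", text,
                  flags=re.IGNORECASE)
    # Mask four-digit years (e.g. graduation dates) that could reveal age
    text = re.sub(r"\b(19|20)\d{2}\b", "[YEAR]", text)
    # Drop gendered courtesy titles
    text = re.sub(r"\b(Mr|Mrs|Ms|Miss)\.?\s", "", text)
    return text

print(redact_resume("Jahanzaib Ansari, B.Sc. 2015, University of Toronto",
                    "Jahanzaib Ansari"))
# -> [CANDIDATE], B.Sc. [YEAR], University of Toronto
```

The hiring team would then see only the redacted text, or a report generated from it, rather than the original document.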


According to Doug Leonard, CEO of Clovers, an AI-based bias reduction tool, this type of technology can boost fairness for applicants and provide internal insight into biases.

“AI in the hiring process can uncover bias both during interviews and afterward during candidate evaluation,” he says.

The Clovers solution, for example, features a tool that creates a highlight reel of each Zoom, Microsoft Teams, WebEx or Google Meet interview. These reels serve a dual purpose: a quick, easy way to share key candidate moments with hiring managers and others, and a simple method for internally reviewing interviews for possible patterns of bias at the individual, team and organizational levels, Leonard says.

Future versions of Clovers will include AI-driven, real-time alerts during interviews that can flag when an illegal or biased question is asked and send a notification to the interviewer that there may be a compliance risk.

This ability of AI to highlight recruitment managers’ blind spots can be a powerful way to boost DEI, says Somen Mondal, GM of talent intelligence at Ceridian. “Let’s say 50% women and 50% men applied to a position but only 5% of women made it to the interview stage,” he says. “If all things are equal, and with the right numbers, there is an inequity in the process. [AI] helps us uncover potential areas of weakness.”
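Mondal’s funnel comparison can be made concrete with the EEOC’s “four-fifths” heuristic, a standard adverse-impact check (not necessarily what Ceridian’s product computes). The sketch below assumes equal applicant pools and uses the article’s 5% pass-through rate for women; the 40% rate for men is an invented figure for illustration.

```python
def selection_rates(applicants: dict, advanced: dict) -> dict:
    """Selection rate per group: candidates advanced / candidates applied."""
    return {g: advanced[g] / applicants[g] for g in applicants}

def four_fifths_check(rates: dict) -> bool:
    """EEOC four-fifths heuristic: pass only if every group's rate is
    at least 80% of the highest group's rate."""
    top = max(rates.values())
    return all(r >= 0.8 * top for r in rates.values())

# Mondal's hypothetical: equal applicant pools, unequal pass-through
applicants = {"women": 500, "men": 500}
advanced = {"women": 25, "men": 200}    # 5% vs. an assumed 40%
rates = selection_rates(applicants, advanced)
print(rates)                     # {'women': 0.05, 'men': 0.4}
print(four_fifths_check(rates))  # False -- disparity flagged
```

A failing check like this does not prove discrimination, but it points analysts at the stage of the funnel where the disparity arises.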

Several AI anti-bias tools skip over candidates’ names to concentrate on their skill sets and work experience. Knockri, for example, provides candidates with a behavioral skills assessment, audits an organization’s skills framework and automates the questions the hiring director wants to ask. Knockri relies on proprietary data for different roles, with correlations to skills and success predictors.

Its video-based interview questions are automated and do not rely on a live recruitment or HR manager. The interview consists of a video assessment, a written assessment and an audio evaluation to measure the candidate’s skills. The applicant’s answers are then converted to text to analyze the context, behaviors and relevancy to the skills that they’re being assessed for, says Knockri’s Ansari.

The tool has helped Canada’s Department of National Defence boost its number of female candidates and candidates with disabilities by 20%, Ansari says.

Bias is not the only thing these AI tools aim to eliminate. Eventually, some AI solution experts predict, these tools may kill the paper and electronic resume once and for all. Instead of relying on historical documents such as resumes and CVs that are “riddled with bias,” according to Plum CEO Caitlin MacGregor, organizations can turn to solutions like Plum, which measures “human potential data”—a candidate’s ability to fulfill a specific job role.

“This allows you to look beyond credentials, degrees and past job titles to discover what candidates and employees are truly capable of achieving if given the opportunity,” she says.

Related: How Whirlpool turned the tide to find the right people 

Plum client Scotiabank, for example, has adopted a “resume-less” recruiting process, which starts with candidates completing a “Plum Profile” before a talent recruiter eventually assesses the candidate’s talents (or what Plum calls “human potential”). Doing this at the beginning of the recruiting phase can help remove potential bias in the selection process such as race, age, gender or which school candidates attended, says MacGregor.

The source of bias in hiring systems

Many organizations that fall short of hiring a diverse workforce assume that their problem is at the “top of the funnel,” because they aren’t finding the candidates to fill open roles, explains Emerson. “Often, when they see their data and our expert insights, they learn they might have an even bigger gap in their hiring practices, which may be favoring candidates from majority groups,” she says.

This revelation that companies are riddled with inherent bias was highlighted years ago when Amazon engineers created an AI tool to hire new programmers and technologists, and the tool quickly began penalizing female candidates because it had been trained on resumes submitted predominantly by men.

Amazon used a so-called “black box AI approach,” where the AI engine was fed datasets of predominantly white men and, based on that, the engine discarded all female candidates, setting a dangerous precedent for different AI approaches to hiring, according to Sunny Saurabh, co-founder and CEO at Interviewer.AI.

“The goal is to increase efficiency and productivity for recruiters and hiring managers by making data–and not gut feel and instincts–the basis of any hiring decisions while ensuring reduced inherent bias, and constantly curating and training our datasets for machine bias,” Saurabh says.

AI recruitment and beyond

Today’s anti-bias AI developers are also exploring ways to extend their products beyond the recruitment process. The teams that designed these tools claim they can provide employee promotion and retention analysis as well. Some AI tools, for instance, can assess whether an organization is promoting diverse employees fairly and equitably, says Matt Gotchy, vice president of marketing for Trusaic, an AI bias tool provider. “It can also be used to see if employees are leaving the organization at an inequitable rate and in a non-diverse way,” he says.

Likewise, Plum also can be used throughout the employee lifecycle for the individual’s career development by identifying other roles and opportunities where the employee might thrive. This can help companies mobilize talent and build a diverse leadership pipeline quickly, MacGregor says.

And ultimately, while DEI efforts are noble in their own right, experts say fighting bias with artificial intelligence also can make an important impact on an organization’s bottom line.

“We know that a gender and racially diverse organization outperforms its peers. We are seeing [anti-bias efforts] mandated from the C-suite, from chief diversity officers being installed and having this budgeted at the C level,” says Knockri’s Ansari. “This has a huge effect in terms of marketing campaigns, talent spend and their entire strategy.”

Phil Albinus
Phil Albinus is the former HR Tech Editor for HRE. He has been covering personal and business technology for 25 years and has served as editor and executive editor for a number of financial services, trading technology and employee benefits titles. He is a graduate of SUNY New Paltz and lives in the Hudson Valley with his audiologist wife and three adult children.