Engineering leaders who are struggling to meet hiring goals or strategic diversity, equity, and inclusion (DEI) initiatives should take a closer look at their screening processes. Chances are, they will find seemingly innocuous practices that are systematically excluding qualified candidates from entering their pipeline.
Software companies often remove the human element from the early stages of recruiting to streamline processes and, in theory, eliminate human biases. Some screening systems cause more harm than good, however, and do so with disastrous efficiency: screening tools powered by artificial intelligence can systematically reject vast swaths of potential hires by amplifying inequitable criteria baked into their algorithms.
For example, an AI model that codifies a preference for candidates from a top-10 computer science school may stem from a belief that graduates of those schools are typically competent engineers. In practice, however, a preference for those candidates is a mark against graduates of other schools who may be equally or more qualified.
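To make that mechanism concrete, consider a simplified sketch of a screening score. The feature names, weights, and cutoff below are hypothetical, invented purely to illustrate how a single pedigree feature can sink an otherwise identical candidate; they are not taken from any real screening product.

```python
# Hypothetical screening score: feature names, weights, and candidates
# are invented for illustration, not drawn from any real product.

def screening_score(years_experience: float, skills_match: float,
                    top10_school: bool) -> float:
    score = 2.0 * skills_match + 0.5 * years_experience
    if top10_school:
        score += 1.5  # the pedigree bonus baked into the model
    return score

# Two candidates with identical skills and experience:
from_top10 = screening_score(4, 0.9, top10_school=True)       # 5.3
from_elsewhere = screening_score(4, 0.9, top10_school=False)  # 3.8

# With an interview cutoff of 5.0, the equally qualified candidate
# from the other school never reaches an interview.
print(from_top10 >= 5.0, from_elsewhere >= 5.0)  # True False
```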
Pedigree bias can also undermine DEI efforts. Recruiting practices that rank schools as either Tier 1 or Tier 2 may be relegating historically Black colleges and universities (HBCUs) to the less-favored group. But because HBCUs award more than 35 percent of the bachelor's degrees in computer science earned by Black students in the United States, this recruiting practice can exclude highly qualified candidates without a single interview.
Four steps to root out bias
Here are four steps to root out biases lurking in your organization’s recruiting practices and expand your talent pipeline to a wider and more diverse pool.
1. Re-evaluate candidate evaluations.
As I described above, unintentional bias can begin excluding qualified individuals at the earliest stage of the recruiting process. If a company uses AI in resume screening, that process can reinforce the organization's commitments to diversity, equity, and inclusion, but only if diverse voices contribute to building and improving it.
Analyze the language used in your job descriptions and testing criteria. Nontraditional applicants may lack industry experience or training at Tier 1 computer science schools. Recast ambiguous language and jargon that could make potential applicants feel they would not fit in with the organization's culture. Tools like Textio can help you strengthen job descriptions with inclusive language that attracts diverse applicants.
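Even before adopting a dedicated tool, a simple script can surface the most obvious jargon. The wordlist below is a hypothetical sample for illustration, not a validated lexicon, and is no substitute for a purpose-built product:

```python
import re

# Hypothetical, hand-picked wordlist for illustration only; a real
# audit would use a validated lexicon or a tool such as Textio.
FLAGGED_PHRASES = ["rockstar", "ninja", "guru",
                   "work hard play hard", "top-tier school"]

def flag_exclusionary_language(job_description: str) -> list:
    """Return the flagged phrases found in a job description."""
    text = job_description.lower()
    return [p for p in FLAGGED_PHRASES if re.search(re.escape(p), text)]

posting = "We want a coding ninja from a top-tier school."
print(flag_exclusionary_language(posting))  # ['ninja', 'top-tier school']
```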
Tech and software companies that rely heavily on coding tests should refrain from setting an arbitrary "acceptable score," which may encode unrealistic expectations. Instead, measure candidates' performance against the levels that have produced successful hires in the past. At Karat, our data show that requiring absolute correctness and completeness in testing eliminates many qualified and diverse candidates; in fact, nearly half of the job offers made by companies that conduct live technical interviews go to candidates who submitted incomplete solutions.
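One way to put this into practice is to derive the passing bar from the scores of past successful hires rather than from an idealized perfect solution. Here is a minimal sketch, assuming you have historical screening scores for engineers who went on to succeed in the role; the numbers are invented:

```python
import statistics

# Hypothetical historical data: screening scores (0-100) of candidates
# who were hired and later succeeded in the role.
successful_hire_scores = [62, 71, 55, 88, 67, 74, 59, 81, 65, 70]

# Calibrate the bar to what actually predicted success, here the
# 10th percentile of successful hires, instead of demanding ~100.
passing_bar = statistics.quantiles(successful_hire_scores, n=10)[0]

print(f"Calibrated passing bar: {passing_bar:.1f}")  # well below 100
```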
2. Widen the talent pipeline.
I mentioned that HBCUs award more than a third of the computer science bachelor's degrees earned by Black graduates in the US. These institutions make up just 3 percent of the nation's colleges and universities, yet they represent a high concentration of engineering talent that employers could include in recruiting efforts. Cultivating relationships with organizations such as the National Society of Black Engineers (NSBE) can help you reach potential applicants from schools throughout the nation.
This is an excellent time to expand your geographic reach because many schools and organizations that normally host conventions and job fairs (the NSBE included) have made those events virtual due to the pandemic. Look for virtual events that serve underrepresented groups to increase talent pipeline diversity; QuarantineCon, for example, is a culture-driven career fair for diverse communities.
3. Offer process transparency.
Once you begin recruiting more diverse candidates, it's critical to set them up for success in the hiring process. A candidate with no connection to the big-tech world isn't going to have the same knowledge of the hiring process or interview questions as a candidate who has a family member working at your company.
Programs like Brilliant Black Minds are working to close the access gap at the industry level by ensuring that underrepresented candidates have the same understanding of the interview and hiring process as your director of engineering’s nephew.
Your organization can help level the playing field by providing applicants with sample interview questions or by posting a recording of a mock interview to the company’s website alongside posted job openings. This allows underrepresented candidates to prepare and deliver a clearer aptitude signal during the interview.
4. Expand your direct-applicant pipeline.
Employers tend to grant interviews more often to candidates they have sought out through recruiting than to those who applied directly for an open position. Much more often, in fact: Karat's internal data, which cover more than 80,000 technical interviews, show that fewer than 10 percent of direct applicants receive an interview.
Ironically, direct applicants in that same data set often outperformed recruited candidates in job interviews. We have also found that the closing, or hiring, rate for direct applicants is 10 percent higher than for recruited candidates. It is easy to conclude that organizations are screening out too many direct applicants without allowing them to demonstrate their skills in an interview.
Give more interviews to direct applicants by adjusting the pass-through rates in technical screening. A good place to start would be to let through 20 percent more direct applicants. And because the direct applicant pool is less likely to match pedigree requirements, hires from this group can help to break the cycle of pedigree hiring and develop a more inclusive and diverse workforce.
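The arithmetic behind that adjustment is straightforward. As a hypothetical sketch, if a given share of direct applicants passes your screen today, lower the effective score cutoff until roughly 1.2 times that share advances; all numbers below are invented:

```python
# Hypothetical sketch: adjust a screening cutoff so that roughly 20%
# more direct applicants pass through. All numbers are invented.

def cutoff_for_target_rate(scores, target_rate):
    """Return the score cutoff that passes ~target_rate of applicants."""
    ranked = sorted(scores, reverse=True)
    k = max(1, round(target_rate * len(ranked)))
    return ranked[k - 1]

direct_scores = [42, 55, 61, 48, 73, 67, 50, 58, 80, 45]
current_rate = 0.30               # 30% of direct applicants pass today
target_rate = current_rate * 1.2  # let through 20% more: 36%

print(cutoff_for_target_rate(direct_scores, target_rate))  # 61
```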
Unpacking bias is complicated and multilayered, mainly because it is rooted in long-standing human behaviors that have been accepted as the norm. If companies want to use AI to remove bias and thereby foster diversity, they must rethink the process itself, and they must also train the people who participate in it to serve as human safeguards alongside the technology.