Deep-rooted: recruiters’ ingrained prejudices can hold back women and ethnic minority applicants © Getty

Even computer programs designed to eliminate human subjectivity sometimes fail. Bias in the hiring process is prevalent and hard to eradicate.

Scientists have found evidence that data-driven recruitment algorithms have the potential to learn our prejudices. Unless carefully monitored, their filters detect patterns of underrepresentation and reproduce them, perpetuating biases against already disadvantaged groups such as older and disabled workers, women and ethnic minorities.
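How that happens is easy to illustrate. The sketch below is a toy example in Python, with entirely invented data, and does not represent any vendor's actual system: a screening filter "trained" on historical hiring outcomes in which candidates with career breaks were rarely hired simply imitates past decisions, and so carries the same skew forward even though group membership is never an explicit input.

```python
# Illustrative sketch only (not any real recruitment product): a toy screening
# filter trained on historical hiring outcomes. Because past hires skew away
# from one group, a filter that imitates past decisions reproduces that skew,
# even though the protected characteristic itself is never used as an input.
from collections import Counter

# Hypothetical historical data: (years_experience, took_career_break, hired).
# Career breaks stand in for a feature that correlates with a protected group
# (for example, gender) in the historical record.
history = [
    (5, False, True), (6, False, True), (7, False, True), (4, False, True),
    (5, True, False), (6, True, False), (7, True, False), (8, True, False),
]

def hire_rate(break_flag: bool) -> float:
    """'Training': estimate the past hire rate for each feature value."""
    outcomes = [hired for _, brk, hired in history if brk == break_flag]
    return sum(outcomes) / len(outcomes)

rates = {flag: hire_rate(flag) for flag in (True, False)}

# 'Screening': shortlist new applicants whose profiles resemble past hires.
applicants = [("A", 6, False), ("B", 9, True), ("C", 5, True), ("D", 4, False)]
shortlist = [name for name, _, brk in applicants if rates[brk] > 0.5]

print(rates)      # {True: 0.0, False: 1.0} -- the learned pattern
print(shortlist)  # ['A', 'D'] -- applicants with career breaks never surface
```

The point of the toy example is that no one told the filter to discriminate; it learned the pattern from the data it was given.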

The deep-rooted nature of human bias has led some academics and consultants to question the value of diversity training that pins its hopes on educating people to rise above their prejudices. In her book What Works, Professor Iris Bohnet, a behavioural economist at Harvard University, argues that rather than trying to alter people, employers should change their processes to limit the opportunity for bias.

One company experimenting with different ways of making its processes and practices bias-proof is Vodafone, the telecoms group. Catalina Schveninger, global head of resourcing, highlights a pilot the company is running, initially in India, to test the effect of removing gender from the CVs of job applicants.

Historically, local managers had assumed they were failing to appoint women into tech roles because there were not enough qualified women to recruit. But the data told another story. Plenty of highly qualified women were applying but were not getting interviews. “If the [Indian] pilot shows that by [gender] blinding CVs we can move the needle, then we will share the results and encourage other markets to adopt the same practice,” says Ms Schveninger.
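In practice, blinding a CV means stripping out the details that signal gender before a reviewer sees it. The fragment below is a minimal sketch of that idea, not Vodafone's process: the word list is illustrative and far from complete, and a real pilot would also have to handle names, photographs and other indirect signals.

```python
# A minimal sketch of CV "blinding": replace obvious gender markers with a
# neutral placeholder before the document reaches a reviewer. The marker list
# is illustrative only; a real pilot would also need to handle names, photos
# and other indirect signals of gender.
import re

GENDER_MARKERS = re.compile(
    r"\b(mr|mrs|ms|miss|he|she|him|her|his|hers|male|female)\b",
    re.IGNORECASE,
)

def blind_cv(text: str) -> str:
    """Return the CV text with obvious gender markers redacted."""
    return GENDER_MARKERS.sub("[redacted]", text)

cv = "Ms Priya Sharma. She has led her team of 12 engineers for three years."
print(blind_cv(cv))
# [redacted] Priya Sharma. [redacted] has led [redacted] team of 12 engineers...
```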

At professional services firm KPMG, crunching the numbers on internal promotions revealed that proportionately more men than women were being promoted to senior roles. This produced a gender imbalance that worsened with each step up.

However, this was not simply a matter of male bosses appointing men in their own image. There was also a gender dynamic. “Where the men would apply for a role if they had 80 per cent of the [required] skills, women would think they were missing 20 per cent and not bother,” says Martin Blackburn, people director at KPMG UK. Now when a promotion is advertised, line managers are encouraged to check whether their high-potential female colleagues have applied, and if not ask why.

The analysis highlighted that men holding job offers from competitors were more likely than women to ask for and receive a financial bonus to prevent them leaving, and this was contributing to a gender pay gap. Rather than offering money, line managers are now expected to offer career development — and if that does not work, to let people go.

Trying to limit bias with hard-hitting training can have the opposite effect. Studies suggest that if people feel coerced into changing their opinions, their biases may become more entrenched and a backlash can follow.

At advertising agency Dentsu Aegis Network — whose Japanese parent company is being investigated by the government following an overworked employee’s suicide in 2015 — managers have been taking part in a form of bias training that tries to engage people rather than blame them. Watched by an audience, actors recreate stories of workplace bias gathered from the business. As the scene unfolds, the onlookers are invited to discuss what is going on and then rewrite the script to produce a better outcome.

“The point isn’t to get people to accept that they have biases, but to get them to see [for themselves] that those biases have negative consequences for others,” says Theresa McHenry, HR director at Microsoft UK, which also uses the technique. The idea is that by teaching people decision-making disciplines — called “bias interrupters” — they will be better equipped to counter the brain’s tendency to fall back on the known and familiar when making choices.

Developments in supercomputing are enabling other methods of countering human blind spots and social disadvantage. At Vodafone, recruiters run job postings through Textio, a tool that sniffs out corporate jargon and words — such as “competitive” and “drive” — that research suggests can put off female applicants.
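The underlying idea is straightforward, even if commercial tools are far more sophisticated. The sketch below shows the kind of check such a tool might perform; it is not Textio's actual method or word list, and the flagged terms are drawn only from the examples mentioned above plus a few invented ones.

```python
# A rough sketch of the kind of check a wording-analysis tool performs (this
# is not Textio's actual method or word list): scan a job posting for terms
# that research on gendered language suggests can deter female applicants,
# and flag them for the recruiter to reconsider.
FLAGGED_TERMS = {"competitive", "drive", "dominant", "rockstar", "ninja"}

def flag_wording(posting: str) -> list[str]:
    """Return flagged words found in the posting, in order of appearance."""
    words = [w.strip(".,;:!?()").lower() for w in posting.split()]
    return [w for w in words if w in FLAGGED_TERMS]

ad = "We want a competitive self-starter with the drive to win new markets."
print(flag_wording(ad))  # ['competitive', 'drive']
```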

The business is also trialling Headstart, an algorithm-driven recruitment platform, which matches graduates to job opportunities based on psychometrics and an analysis of mutual needs that does not hinge on which university the candidate attended. The technology contains some optional features that allow employers to prioritise applications from under-represented groups.

Though the trial is still at an early stage, Ms Schveninger says she is starting to see promising candidates coming forward for interview from universities that were not previously on Vodafone’s radar.

Video interviews: Broadening the candidate pool

BDO UK, the professional services firm, recently decided to replace face-to-face interviews for graduates and school leavers with recorded video in an attempt to attract a more diverse pool of candidates.

Shan Nelson, head of early careers at BDO, says technology has allowed the company to reach poorer students who work part-time and would struggle to find the money and time to travel for an interview.

“A student can [record] an interview whenever they choose, without leaving their room,” she says. Only candidates who are invited to an assessment day in the final stages of selection need to travel.

A report by the CIPD, the UK HR body, found that interviews are poor predictors of performance because they vary so much. Having all candidates answer the same set of pre-recorded questions goes some way to addressing this problem. “However much the [interviewer] is reading off a script, their energy, the way they respond to the candidate is going to affect the [performance of] the candidate,” says Will Hamilton, founder of LaunchPad Recruits, which provides video-recruitment technology.

When viewing a candidate on screen, physical attractiveness, age, personality traits and social class can all colour a reviewer’s judgment — as they can in face-to-face interviews.

LaunchPad is working on a technology to help employers spot promising applicants using predictive markers derived from analysing the videoed responses of candidates who went on to become top employees. However, there are hurdles to overcome. “If you’ve only hired white men previously all that any predictive model [of high potential] will do is pull out white men,” Mr Hamilton says.

Copyright The Financial Times Limited 2024. All rights reserved.