As firms increasingly rely on artificial intelligence-driven hiring platforms, many highly qualified candidates are finding themselves on the cutting room floor.
Body-language analysis. Vocal assessments. Gamified tests. CV scanners. These are some of the tools companies use to screen candidates with artificial intelligence recruiting software. Job applicants face these machine prompts – and AI decides whether they are a good match or fall short.
Businesses are increasingly relying on them. A late-2023 IBM survey of more than 8,500 global IT professionals showed 42% of companies were using AI screening "to improve recruiting and human resources". Another 40% of respondents were considering integrating the technology.
Many leaders across the corporate world hoped AI recruiting tech would end biases in the hiring process. Yet in some cases, the opposite is happening. Some experts say these tools are inaccurately screening some of the most qualified job applicants – and concerns are growing that the software may be screening out the best candidates.
"We haven't seen a whole lot of evidence that there's no bias here… or that the tool picks out the most qualified candidates," says Hilke Schellmann, US-based author of The Algorithm: How AI Can Hijack Your Career and Steal Your Future, and an assistant professor of journalism at New York University. She believes the biggest risk such software poses to jobs is not machines taking workers' positions, as is often feared – but rather preventing them from getting a role at all.
Some qualified job candidates have already found themselves at odds with these hiring platforms.
In one high-profile case in 2020, UK-based make-up artist Anthea Mairoudhiou said her company told her to re-apply for her role after she was furloughed during the pandemic. She was evaluated both on her past performance and via an AI screening programme, HireVue. She says she ranked well in the skills evaluation – but after the AI tool scored her body language poorly, she was out of a job for good. (HireVue, the firm in question, removed its facial analysis function in 2021.) Other workers have filed complaints against similar platforms, says Schellmann.
She adds that job candidates rarely know whether these tools are the sole reason companies reject them – by and large, the software doesn't tell users how they've been evaluated. Yet she says there are many glaring examples of systemic flaws.
In one case, a candidate who'd been screened out submitted the same application but tweaked their birth date to appear younger. With this change, they landed an interview. At another company, an AI resume screener had been trained on the CVs of employees already at the firm, giving applicants extra marks if they listed "baseball" or "basketball" – hobbies linked to more successful staff, often men. Those who mentioned "softball" – typically women – were downgraded.
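The mechanism behind the baseball/softball problem can be sketched in a few lines: train a keyword scorer on incumbents' CVs, and any word correlated with who already works at the firm picks up weight, whether or not it has anything to do with the job. The CVs, words and scoring rule below are invented for illustration – this is not the actual vendor's system.

```python
# Toy illustration of proxy bias: a scorer "learns" from the CVs of
# current (mostly male) staff, so their hobbies get positive weight.
from collections import Counter

incumbent_cvs = [
    "python sql baseball", "java sql basketball",
    "python baseball", "sql basketball java",
]
rejected_cvs = ["python sql softball", "java softball"]

def learn_weights(positives, negatives):
    """Naive per-word weight: frequency among incumbents minus
    frequency among rejected applicants."""
    pos = Counter(w for cv in positives for w in cv.split())
    neg = Counter(w for cv in negatives for w in cv.split())
    vocab = set(pos) | set(neg)
    return {w: pos[w] / len(positives) - neg[w] / len(negatives)
            for w in vocab}

def score(cv, weights):
    """Sum the learned weights of the words in a CV."""
    return sum(weights.get(w, 0.0) for w in cv.split())

weights = learn_weights(incumbent_cvs, rejected_cvs)
# "baseball" ends up with a positive weight and "softball" with a
# negative one - purely because of who already works at the firm,
# not because either hobby predicts job performance.
```

Nothing in the training data says the hobby matters; the correlation with the incumbent workforce is enough to encode it.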
Marginalised groups often "fall through the cracks, because they have different hobbies, they went to different schools", says Schellmann.
In some cases, the biased selection criteria are clear – like ageism or sexism – but in others, they are opaque. In her research, Schellmann applied for a call-centre job, to be screened by AI. Then she logged in from the employer's side. She had received a high rating in the interview, despite speaking nonsense German when she was supposed to be speaking English, yet a poor rating for the genuinely relevant credentials on her LinkedIn profile.
She worries the negative effects will spread as the technology does. "One biased human hiring manager can harm a lot of people in a year, and that's not great," she says. "But an algorithm that is maybe used in all incoming applications at a large company… that could harm hundreds of thousands of applicants."
"The problem [is] no-one knows exactly where the harm is," she explains. And, given that companies have saved money by replacing human HR staff with AI – which can process piles of resumes in a fraction of the time – she believes firms may have little motivation to interrogate kinks in the machine.
From her research, Schellmann is also concerned screening-software companies are "rushing" underdeveloped, even flawed products to market to cash in on demand. "Vendors are not going to come out publicly and say our tool didn't work, or it was harmful to people", and companies who have used them remain "afraid that there's going to be a gigantic class action lawsuit against them".
It's important to get this tech right, says Sandra Wachter, professor of technology and regulation at the Oxford Internet Institute, University of Oxford.
"Having AI that is unbiased and fair is not only the ethical and legally necessary thing to do, it is also something that makes a company more profitable," she says. "There is a very clear opportunity to allow AI to be applied in a way so that it makes fairer and more equitable decisions that are based on merit and that also increase the bottom line of a company."
Wachter is working to help companies identify bias through the co-creation of the Conditional Demographic Disparity test, a publicly available tool which "acts as an alarm system that notifies you if your algorithm is biased. It then gives you the opportunity to figure out which [hiring] decision criteria cause this inequality and allows you to make adjustments to make your system fairer and more accurate", she says. Since its development in 2020, businesses including Amazon and IBM have implemented it.
Schellmann, meanwhile, is calling for industry-wide "guardrails and regulation" from governments or non-profits to ensure current problems do not persist. If there is no intervention now, she fears AI could make the workplace of the future more unequal than before.
AI hiring tools may be filtering out the best job applicants – BBC
