Job seekers must now impress not only future bosses but artificial intelligence algorithms too, as employers screen candidates by asking them to answer interview questions on video that is then evaluated by a machine.
HireVue, a leading provider of software for algorithmically screening job candidates, said Tuesday that it is killing a controversial feature of its product: analyzing a person’s facial expressions in a video to discern certain characteristics.
Job seekers screened by HireVue sit in front of a webcam and answer questions. Their behavior, intonation and speech are fed into an algorithm that scores them on certain traits and qualities.
HireVue claims that an “algorithmic audit” of its software carried out last year shows that it is free from bias. But the nonprofit Electronic Privacy Information Center (EPIC) filed a complaint against the company with the Federal Trade Commission in 2019.
HireVue CEO Kevin Parker acknowledges that public outcry over the use of software to analyze facial expressions in video factored into the decision. “It added value for customers, but it wasn’t worth it,” he says.
The algorithmic audit was carried out by an external firm, O’Neil Risk Consulting and Algorithmic Auditing. The firm did not respond to requests for comment.
Alex Engler, a fellow at the Brookings Institution who has studied AI hiring tools, says the idea of using AI to assess someone’s ability, whether from video, audio, or text, is far-fetched. He says it is also problematic that the public cannot verify such claims.
“There are things that machine learning can probably help with, but fully automated interviews, where you make inferences about job performance – that’s terrible,” he says. “Modern artificial intelligence cannot make these inferences.”
HireVue says about 100 companies, including GE, Unilever, Delta and Hilton, use its technology. The software requires job applicants to answer a series of questions in a recorded video. The company’s software then analyzes various characteristics, including the language they use, their speech and, until now, their facial expressions. It then provides an assessment of the candidate’s suitability for a job, as well as measures of traits such as “reliability”, “emotional intelligence” and “cognitive ability”.
Parker says the company helped screen more than 6 million videos last year, although sometimes that simply involved transcribing responses for an interviewer rather than performing an automated candidate assessment. He adds that some clients let applicants opt out of automated screening. And he says HireVue has developed ways to avoid penalizing applicants with spotty internet connections by automatically referring them to a human.
AI experts warn that algorithms trained on data from previous job applicants can perpetuate biases that already exist in hiring. Lindsey Zuloaga, chief data scientist at HireVue, says the company tests for gender, race and age bias by collecting that information in its training data and looking for signs of bias in the results.
But she acknowledges that it can be harder to know whether the system is biased with respect to factors such as income or education level, or whether it could be thrown off by something like a stutter.
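One common way to quantify the kind of disparity Zuloaga describes is the adverse-impact ratio: compare each group’s pass rate on the automated screen to that of the highest-passing group. The sketch below is a minimal illustration of that heuristic, not HireVue’s actual methodology; the function name and sample data are invented for the example.

```python
from collections import defaultdict

def adverse_impact_ratios(outcomes):
    """Compute each group's selection rate relative to the
    highest-rate group (the 'four-fifths rule' heuristic).

    outcomes: list of (group_label, passed_screen) tuples.
    """
    passed = defaultdict(int)
    total = defaultdict(int)
    for group, ok in outcomes:
        total[group] += 1
        if ok:
            passed[group] += 1
    rates = {g: passed[g] / total[g] for g in total}
    best = max(rates.values())
    # A ratio below 0.8 is a conventional red flag for disparate impact.
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical screening outcomes for two applicant groups.
sample = [("A", True), ("A", True), ("A", False), ("A", True),
          ("B", True), ("B", False), ("B", False), ("B", False)]
print(adverse_impact_ratios(sample))  # group B passes at a third of group A's rate
```

The catch, as Zuloaga notes, is that this check only works for attributes you have labels for; a factor like a stutter never appears as a column in the data.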
“I’m surprised they are abandoning this, because it was a key feature of the product they were marketing,” says John Davisson, senior counsel at EPIC. “It’s the source of a lot of concerns about collecting biometric data, as well as these bold claims about the ability to measure psychological traits, emotional intelligence, social attitudes and things like that.”
The use of facial analysis to infer emotional or personality traits is controversial, and some experts warn that the underlying science is shaky.
Lisa Feldman Barrett, a Northeastern University professor who studies emotion, says a person’s face alone does not reveal emotion or character. “Just by watching someone smile, you can’t really say anything about them, except maybe that they have great teeth,” she said. “It is a bad idea to make psychological inferences, and therefore to determine people’s outcomes, based solely on facial data.”