AI Is Being Used to Screen Job Applicants (bbc.com) 147

The BBC reports on "the computers rejecting your job application," noting that applicants are now being screened with AI-scored tests that involve counting dots in boxes and matching emotions to facial expressions: The questions, and your answers to them, are designed to evaluate several aspects of a jobseeker's personality and intelligence, such as your risk tolerance and how quickly you respond to situations. Or as Pymetrics puts it, "to fairly and accurately measure cognitive and emotional attributes in only 25 minutes".

Its AI software is now used in the initial recruitment processes of a number of multinational companies, such as McDonald's, bank JP Morgan, accountancy firm PWC, and food group Kraft Heinz. An interview with a human recruiter then follows if you pass. "It's about helping firms process a much wider pool [of applicants], and getting signals that someone will be successful in a job," says Pymetrics founder Frida Polli...

Another provider of AI recruitment software is Utah-based HireVue. Its AI system records videos of job applicants answering interview questions via their laptop's webcam and microphone. The audio of this is then converted into text, and an AI algorithm analyses it for key words, such as the use of "I" instead of "we" in response to questions about teamwork. The recruiting company can then choose to let HireVue's system reject candidates without having a human double-check, or have the candidate moved on for a video interview with an actual recruiter.
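(For the curious: the kind of keyword analysis described above can be mechanically very simple. Below is a minimal, purely hypothetical Python sketch; the word list and the ratio are invented for illustration and are not HireVue's actual scoring rules.)

    import re

    def pronoun_teamwork_score(transcript: str) -> float:
        """Toy illustration of keyword analysis on an interview transcript.

        NOT any vendor's real scoring rule -- the word list and the ratio
        are made up purely to show the general idea.
        """
        words = re.findall(r"[a-z']+", transcript.lower())
        i_count = words.count("i")
        we_count = sum(words.count(w) for w in ("we", "our", "us"))
        total = i_count + we_count
        return we_count / total if total else 0.0

    answer = "I designed the module, but we tested and shipped it as a team."
    print(pronoun_teamwork_score(answer))  # 0.5 -- half the self-references are collective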

HireVue says that by September 2019 it had conducted a total of 12 million interviews, of which 20% were via the AI software. The remaining 80% were with a human interviewer on the other end of a video screen. The overall figure has now risen to 19 million, with the same percentage split. HireVue first started offering the AI interviews in 2016. Its users include travel services firm Sabre.

Meanwhile, a report from 2019 said that such is the growth in the use of AI that it will replace 16% of recruitment sector jobs before 2029.

  • I always knew this was only temporary: https://tech.slashdot.org/stor... [slashdot.org]

    • by Canberra1 ( 3475749 ) on Saturday February 13, 2021 @07:14PM (#61060836)
      Mediocre teams are made of mediocre people, normally without much cutting-edge knowledge or deep experience, and deprived of brilliant leaders who address deficiencies. Knowing you were 'MacDonaled' through a hamburger-style interview process will put off top-notch applicants, but it also allows the snake-oil streamliners to declare they picked the best and most attractive applicants. I hear of so many people getting coached, cheating, playing buzzword bingo, or over-generalizing to sound good. My advice is to regularly audit code/work examples, even years after employment. Boeing and Intel are two companies who played the 'dumb down' cards, and are on course to reap the rewards. Hopefully diversity lawsuits will follow.
      • > Boeing and Intel are two companies who played the 'dumb down' cards, and are on course to reap the rewards. Hopefully diversity lawsuits will follow.

        What does that mean?

  • We all know how an army of thoughtless drones acts. Diversity, risk taking and innovation are the enemy. Are you within the margin of error of the tallest bar on the chart? You're in.
    • We all know how an army of thoughtless drones acts. Diversity, risk taking and innovation are the enemy. Are you within the margin of error of the tallest bar on the chart? You're in.

      Thank God

      • The audio of this is then converted into text, and an AI algorithm analyses it for key words, such as the use of "I" instead of "we" in response to questions about teamwork

        You might get the job, but they'll have the sort of meetings where people say "There's no 'I' in 'teamwork'".

        (To which one must reply "yes, and this team doesn't have a 'u' in it either.")

    • "Diversity, risk taking and innovation are the enemy"... for big companies that aren't interested in growth.

      These systems are great for small business and entrepreneurs in that regard.

      If people are looking for companies with a growth mindset (creative, willing to disrupt the market with cool new ideas), interview systems like this serve as an easy way to avoid the bad ones, the ones that value stability over innovation.

      • To be clear, it serves as an easy red flag for workers. If a company wants you to submit a video interview in advance, especially if they want you to fit the norm, they are a company likely in decline that should be avoided.

        • It doesn't take a video interview.

          I've run across companies "operated" by equity funds/VCs that insist on "aptitude" tests that are really IQ tests timed at 15 seconds per question.

          They justify it as "all of our companies are required to do this"

        • To be clear, it serves as an easy red flag for workers. If a company wants you to submit a video interview in advance, especially if they want you to fit the norm, they are a company likely in decline that should be avoided.

          Almost every company wants you to fit their norm. The only question is what their norm happens to be.

          Their norm might well be "artsy 'rebel' who is just like all the other artsy 'rebels'".

    • Yeah, "The AI convicted you...." leading to you not getting your packages safely.

    • On one hand, turning a task like this over to computers is an incredibly bad, bad idea, on so many levels.

      On the other, as one of my favourite papers puts it, "The best and brightest don't go into HR."

      I've had interviews with 'Hiring specialists' where the job description included multiple languages. And the 'specialist' couldn't even tell me the language the solution *they* had was written in.

      "Diversity, risk taking and innovation are the enemy"
      Yeah, like 1984, Harrison Bergeron was meant as a warning, n

  • by Mal-2 ( 675116 ) on Saturday February 13, 2021 @06:15PM (#61060632) Homepage Journal

    So it's a Voight-Kampff test, only without the option of shooting the person performing the interview when they ask about your mother.

    • by AmiMoJo ( 196126 ) on Saturday February 13, 2021 @06:45PM (#61060728) Homepage Journal

      If you get asked to talk to an AI or suspect that's what is being used to evaluate you, decline. You don't want to work there if you have a choice.

      • Not everyone has that luxury, really. Also, I suspect that there's quite a few larger companies that have awesome jobs, but also an unimaginative HR dept. who would love to not have to ever look at another resumé again. It may be worth your while to game your way through that.
        They already were like that at the company I worked with 10 years ago; AI didn't exist in its current form so they just scanned keywords as a first filter. When I was involved in hiring new staff for a newly formed team, I ask
        • by tlhIngan ( 30335 )

          Not everyone has that luxury, really. Also, I suspect that there's quite a few larger companies that have awesome jobs, but also an unimaginative HR dept. who would love to not have to ever look at another resumé again. It may be worth your while to game your way through that.
          They already were like that at the company I worked with 10 years ago; AI didn't exist in its current form so they just scanned keywords as a first filter. When I was involved in hiring new staff for a newly formed team, I a

        • I knew a hiring manager who would take a stack of resumes and throw half of them in the trash without even looking at them.

          When asked why, he'd say "I don't want to work with unlucky people."

    • My mother... I'll tell you about my mother.
  • and it doesn't work (Score:5, Informative)

    by antus ( 6211764 ) on Saturday February 13, 2021 @06:17PM (#61060636)
    I have a friend who is a human recruiter at one of these companies, who personally looked into a case of a relative whose application was rejected by an AI. They found that the AI had missed vital information in the application that was clear to humans, and had rejected the applicant as uneducated even though the education was clearly sufficient and documented in the resume the AI had processed. The AI was just making low quality decisions and also rejecting good applicants. Without humans to check it, nobody knows or cares.
    • How do you know it didn't work? Did the company never find an employee to fill the position? Probably they did. These systems don't need to be perfect.
      • They generally find people, but often they only find lower tiers.

        We all know the stories of experts denied jobs because they didn't answer the script properly, often giving answers like "it depends" or giving a cutting edge solution rather than the outdated old answer.

        • And that is a pretty low bar for these entry level jobs.

          The classic test case is AIs marking English exams. Sure, they could be spoofed and give good marks to garbage and bad marks to deeply insightful work.

          But in one test, which compared it against actual exams marked by real, underpaid human markers, the AI did a lot better than the humans when the results were blind-compared against the same exams re-marked by "experts".

      • I know it didn't work because I have inside information: the rejection criteria were clearly wrong to human eyes. So the AI failed at the screening task it was built to do.
        • by MrL0G1C ( 867445 )

          RightSaidFred was talking Orwellian BS; he was basically saying that a failure was a success because someone got hired.

        • Unless you saw ALL the applications and understand the goal, you don't know. Most places use the "AI" to screen out unqualified or bogus submissions and forward a list of Potentials to human recruiters/hiring managers. They are fully aware that there are some eligible people who get tossed, but as long as they were able to obtain a list of suitable applicants and hire someone it is a "success" to them.

          There are also situations where the job requirements are written so as to rule out external applicants ent

    • Comment removed based on user account deletion
      • Google offered me a position years ago, I said no, they blacklisted me from ever being hired.

        Fast forward to present times, an actual Google recruiter comes along and swears that doesn't matter.

        I went through *four* onerous interviews being gang-banged by multiple managers or whatever.

        When they scheduled a 5th one (!!) I said no more. Enough is enough, go fuck yourselves.

        I never called them back or responded. After about a month of "Hello, are you there?" emails from them, they gave up.

        Now I'm probably real

    • by AmiMoJo ( 196126 ) on Saturday February 13, 2021 @06:44PM (#61060720) Homepage Journal

      It's probably discriminatory against all kinds of disabilities. Black people too since we know how well webcams deal with their skin tones.

      There was a story a while back about students keyword spamming to get better grades with AI marking. I expect that will become the norm with AI interviews.

      • by Kaenneth ( 82978 )

        "matching emotions to facial expressions"

        Yep, if I got rejected after such a test, I could sue based on protected disability. It's like asking a potential female employee whether she plans to get pregnant: illegal.

    • by hey! ( 33014 ) on Saturday February 13, 2021 @06:44PM (#61060724) Homepage Journal

      This is a problem with many "AI" applications. They're a cheap and fast way to get to answers, although not necessarily *correct* answers.

      You can train an algorithm to give the same answers an expert would on a training set, but unless your real data can be guaranteed to come from the same sample population as the training data you have no idea that what the AI is telling you has any validity.
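The distribution-shift point above can be made concrete in a few lines of code. A minimal sketch (synthetic data and scikit-learn, purely illustrative; no real screening vendor's features or model are implied): a model that leans on a proxy feature looks excellent on the population it was trained on and degrades once the applicant population changes.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(42)

    def make_pool(n, proxy_tracks_skill):
        """Synthetic 'applicant pool'. 'skill' is the trait that actually matters;
        'proxy' is an incidental signal (a buzzword, a mannerism) that happens to
        correlate with skill in the training population but not in a new one."""
        skill = rng.normal(size=n)
        if proxy_tracks_skill:
            proxy = skill + rng.normal(scale=0.1, size=n)   # near-perfect stand-in for skill
        else:
            proxy = rng.normal(size=n)                      # pure noise in the new population
        noisy_skill = skill + rng.normal(scale=2.0, size=n) # skill itself is hard to observe
        X = np.column_stack([noisy_skill, proxy])
        y = (skill > 0).astype(int)                         # "would actually do well in the job"
        return X, y

    X_train, y_train = make_pool(5000, proxy_tracks_skill=True)
    X_new, y_new = make_pool(5000, proxy_tracks_skill=False)

    model = LogisticRegression().fit(X_train, y_train)
    print("training-population accuracy:", accuracy_score(y_train, model.predict(X_train)))
    print("new-population accuracy:     ", accuracy_score(y_new, model.predict(X_new)))
    # The model leans on the proxy, so it looks great on the population it was
    # trained on and drops toward coin-flipping when the population changes.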

    • by thegarbz ( 1787294 ) on Saturday February 13, 2021 @07:17PM (#61060844)

      They found that the AI had missed vital information in the application that was clear to humans

      So performed as well as a traditional HR screening then?

      Jokes aside: we were hiring for an engineer a few years ago, and the initial screening from HR presented us with no really great applicants. We asked to see the entire pool, and after much arguing we eventually got all the resumes and found some excellent candidates who had been rejected for the most stupid reasons. The dumbest: the HR drone did a word search for the word "analyser" in the submission, instantly excluding all Americans and anyone who had learnt American English.

      Honestly I think AI would be a step up in many cases.

      • Consolidating all the varied HR idiots into one big AI idiot won't be a net positive for job seekers. If everyone ends up using the same AI, and it doesn't like you for some reason, you're fucked. Now you can't shop around for a different idiot.

        It'll be like being blacklisted, but worse, because there's no way of getting around "the computer did it".

        • The purpose of the "AI" systems isn't to benefit the job seekers, it's to benefit the employer. And if what they want is a person with the minimum qualifications who is asking for the least pay, they do a pretty "good" job of providing that. If you're looking for the best candidate they tend to be much less reliable, but most of the time what is wanted is "good enough," not "best of the best."
      • I used to work as an intern for HR.

        In our case, if we had a resume for a job posting, we never screened it out; if we thought someone wasn't qualified, we simply placed the resume at the bottom of the pile (and we did the same digitally, so the top candidates always appeared on the first screen of records when a hiring manager looked at them).

    • by rapjr ( 732628 )
      Emotion detection is not a solved problem. What kind of people were in the training data set? Were people from all cultures included? Did they include indigenous peoples, people with speech impediments, people with accents from across the world, people on all medications the US population is fed, people looking for a first job, and experienced people? Did they include negative examples like people from hate groups, racists, addicts, and psychopaths? Did they include ex-clergy, people who were rich and
      • What kind of people were in the training data set? Random losers who were paid 5 bucks.

        Were people from all cultures included? Of course not, that costs!

        Did they include indigenous peoples, people with speech impediments, people with accents from across the world, people on all medications the US population is fed, people looking for a first job, and experienced people? No, because that sounds complicated.

        Did they include negative examples like people from hate groups, racists, addicts, and psychopaths? Errr,

    • The AI was just making low quality decisions and also rejecting good applicants.

      That's the nonsense of "hiring the best person for the job." There are something like 4 billion people in the workforce. Even if you were the best company in the world and all 4 billion wanted to work for you, the odds that you could correctly identify the best candidate are minute. This is designed to remove large numbers of useless candidates, knowing there will be errors.

      I say that as someone the AI may in fact discriminate against.

    • Some years ago I submitted a resume to a large high-tech company for a position I was reasonably well suited to. It was bounced back by their website in a couple of seconds - presumably because I hadn't used the specific combination of words that they wanted. I suppose that was no worse than having front-office HR people do the same thing based on keywords. Still, given that there are many ways to describe the same skill set, it seems that using an AI may miss a lot of potential candidates.
  • AI ... matching emotions to facial expressions

    Great. Now one has to be a dual major, Computer Science and Theatre. You have to visually "project" confidence while puzzling out a problem. Looking confused, thinking, having an aha moment, and then solving the problem. That's so boomer. ;-)

  • ... use of "I" instead of "we" ...

    An interesting and, IMHO, worthless demand for group-think and mediocrity: I mean, "for teamwork and loyalty".

    The more these rules demand perfection and 'normalcy', the more people who become unemployed and unemployable.

  • give AI a real job (Score:4, Insightful)

    by OrangeTide ( 124937 ) on Saturday February 13, 2021 @06:36PM (#61060690) Homepage Journal

    AI can do the thing that almost no HR department has the guts to do: write thoughtful and professional rejection letters. Most of the time you're left hanging with zero response.
    If we had much choice in where we worked, we'd want to work for a place that has enough respect for an applicant to not just ghost them.

  • Comment removed based on user account deletion
  • A recruiter asked me a question about using the "Redhat GUI" in Redhat Linux. I told her I knew the CLI answer to her question. That became a sticking point. Her checklist asked specifically about the "Redhat GUI" as if it were a specific thing, not a general GUI question. I guess older versions of Redhat had a "Redhat GUI" package for installation. My resume wasn't submitted.
  • New Resume (Score:5, Insightful)

    by Thelasko ( 1196535 ) on Saturday February 13, 2021 @06:41PM (#61060712) Journal
    I think on my resume I will put that I don't hold a PhD or MS from MIT. I doubt the AI will parse that as a negation, and it will put me in with the extremely qualified candidates. (A toy sketch of why that works follows this thread.)
    • Haha, I wonder if that would work?

      B.S. from Rensselaer Polytechnic in 1992
      No M.S. from Stanford in 1994
      No Ph.D. from California Institute of Technology in 1997
      Not a Postdoctoral Research Associate at University of California - Berkeley in 1998

    • "While I do not have a formal PhD or MS from MIT or Harvard, I do have many years of highly qualified experience in XYZ"
  • Half of the interview is the applicant deciding if they want to work there, at least in a healthy job market.

    I guess employers dispensing with that part tells us all we need to know about the health of the job market and their suitability as employers.

    • by Anne Thwacks ( 531696 ) on Sunday February 14, 2021 @08:56AM (#61062350)
      at least in a healthy job market.

      Currently the jobs market has Covid, and is still recovering from Trump Derangement Syndrome (or Brexit, if in the UK).

      You might want to try the traditional British HR system: shuffle and cut the CVs. Discard the bottom half, as they were unlucky, and no one wants to employ unlucky people.

      Repeat until you are down to seven or fewer applicants, and employ the one who seems to have used an appropriate MS Word template.

      This strategy avoids the need for HR to understand any buzzwords AND the need to distract the team from Getting Stuff Done.

      Sure the company will crash, but not until after the next quarter's results are published, and it will probably crash then whoever you hire.

  • Those smart enough to game the system and willing to cheat it deserve to succeed in our modern dystopia. Conventional ethics are a liability and a personal weakness, of value only to exploitative others.
    AI might be used to analyze such systems and defeat them; arms races are fun.

  • Just one step closer to pre-crime...

  • Sorry, this seems to be too easily biased to be real. You're going to have to provide source code, therefore you're not going to make much money off of this. Do you want to continue?

  • by biggaijin ( 126513 ) on Saturday February 13, 2021 @07:22PM (#61060862)

    I was the employment and recruiting contact for my department at an internationally famous corporate research laboratory for a couple of years. The personnel people sent me about 300 resumes each week, and I did my first sorting of them sitting next to a wastebasket. Most of the applicants were so spectacularly unqualified for the work we were doing that it took only a second or two to reject them. Out of the weekly 300 resumes, I felt fortunate to find two or three that were worth pursuing. It would not be difficult to write a program to do this initial screening, and I am not surprised that someone has done it.

  • So-called 'AI' is not anywhere NEAR good enough at anything to be trusted with something like this.
    • by gweihir ( 88907 )

      So-called 'AI' is not anywhere NEAR good enough at anything to be trusted with something like this.

      I am not so sure. I have seen companies (including very large ones) where hiring decisions were universally exceptionally bad. Sure, "AI" will do a really bad job, but humans can do a _worse_ job.

  • Nobody would trust HR to find a doctor for them, so why does anyone trust people with zero knowledge of the jobs they are hiring for? Given that they also selected the AI and its questions, this only adds to the joke that is HR.
  • That the HR people who revel in these new AI systems, including the ability to silently and legally eliminate all the "non mainstream" candidates (including those who are older), will inevitably be fired based on the recommendations of the same systems brings a smile to my face.

  • I'd be insulted to be given an AI interview... but I also understand that HR/recruiters are dealing with a large number of unqualified applicants. For non-entry-level positions I'd be okay paying some type of small fee to apply for a job, like 20-50 dollars, just to know that my resume gets in front of someone who can understand my skill set.
  • by MpVpRb ( 1423381 ) on Saturday February 13, 2021 @09:04PM (#61061166)

    ..it's hard to imagine it would be worse than doing it by hand.
    I was a senior engineer at a prestigious company. I got handed a stack of 200 resumes and was asked to review them. By the 50th one or so, I had turned into a robot. A piss-poor, tired robot that made an increasing number of mistakes as time went on. It would be great if AI could be developed to do this better.

    That being said, combing through a pile of resumes is a poor way to find workers. Resumes are a poor description of people and demonstrate resume-writing skill more than job performance. Every job I ever got was because I was recommended by someone who had worked with me in the past.

    • by MrL0G1C ( 867445 )

      ..it's hard to imagine it would be worse than doing it by hand.

      Then you have simply admitted you have no imagination. Ironically you'd likely perform poorly in tests judged by the AI in question.

  • I had three fucking middle managers! They were all younger than me and, frankly, fucking morons. I was fired for being "condescending". No shit?! I singlehandedly could have easily replaced that triplet of nitwits, and replaced the entire team with a simple self-guided troubleshooter. It's one of the most disorganized organizations I've ever worked at. I only worked there a week, shadowing their tech support, whose resolution rates were fucking terrible. Less than half. I told them, how about we sug
  • I think Cathy O'Neil's book Weapons of Math Destruction had a chapter on this, some years ago. The somewhat buried lede of the book is that most of the problems with AI come from bad or negligent intent in the first place. In other words, the same old problems we've had of sexism, racism, classism, etc.

    On the plus side, seems like more people are aware of how those intersect than has maybe been true in recent times.

  • Sounds like pure pseudoscience, plus I'm sure such systems would discriminate against people who aren't completely "normal" mentally (e.g. ADHD or Asperger's).
  • by stikves ( 127823 ) on Sunday February 14, 2021 @12:50AM (#61061588) Homepage

    Let me say this upfront: most HR systems, including humans, will fail 50% of the qualified candidates. Actually, I am probably being too generous here.

    Basically, it is easier to hire someone than to fire them. So as long as they have 100x the people applying for a position and receive 5-10 viable candidates, they can drop most of them and only need to offer the job to one (or two, in case the first one turns it down).

    So the system can have lots of "false negatives" and we can all debate "why this otherwise great person did not even get an onsite", while the company can be confident they will not have "false positives" and not need to fire that person.

    • by gweihir ( 88907 )

      That is not the problem. The problem is that this stupid approach is likely to systematically filter out people that would be good at the job offered. People are not a random collection of skills, but a complex system where everything influences everything and quite a few combinations cannot even happen.

      • The problem is that this stupid approach is likely to systematically filter out people that would be good at the job offered. People are not a random collection of skills, but a complex system where everything influences everything and quite a few combinations cannot even happen.

        Management does not wish to know this. Go to your room. Do not pass Go. Do not collect $200.

        • by gweihir ( 88907 )

          Indeed. I am aware. I just left a job because of stupid management. (I have others and that job was only part-time, but still.)

      • They don't usually need "the best" person in terms of skills. They need "good enough" and can probably pay them less. It doesn't matter if a few of "the best" get culled as long as a number of "good enough" make it through and they hire someone.
  • by BenBoy ( 615230 )
    You keep using that word ... I do not think it means what you think it means.
  • The questions, and your answers to them, are designed to evaluate several aspects of a jobseeker's personality and intelligence, such as your risk tolerance and how quickly you respond to situations. Or as Pymetrics puts it, "to fairly and accurately measure cognitive and emotional attributes in only 25 minutes".

    There is nothing fair about discriminating against people based upon metrics that aren't necessarily relevant to a job.

    If every company used these tests then there would be a percentage of society t

    • Fair isn't usually what they care about. And most of the time even in large companies they aren't using these systems to screen for Critical positions, they're using them to fill the "rank and file." For the few Very Important spots, they're headhunting and using existing professional networking not just printing an Ad and hoping a resume comes across.

      I'm not defending it, just stating how it is. Smaller shops can't afford to pass over good applicants and will still hand review submissions. So all widesprea

  • So I worked for a large search engine for a while, and while I wasn't personally involved, I heard the following story:

    In order to trial the use of AI for resume screening, they tried to use AI to find a correlation between accepted candidates and their respective job performance.

    After a while, the experiment was abandoned.

    The only correlation the AI found was that the presence of the word 'certified' in the resume had a *negative* correlation with job performance.

  • Basically this will select candidates that best match some bad criteria and filter out most of those who could actually be good at the job. It will also make very good candidates think twice about even applying.

  • ... it only has to be better than human HR. Not a high bar.
  • "AI Is Being Used to Screen Job Applicants"

    TRANSLATION:

    "AI Is Being Used to Screen Out Older Job Applicants"

    Just bias it to shit-score anyone who appears older than some arbitrary number (like "40") and you're good to go. No more of these old fuddy-duddies clogging up your applicant stream!

    "Awww, so sorry you flunked the AI test so better luck next time."

    Next up,
    "AI Is Being Used to Screen Out Asian Job Applicants"
    "AI Is Being Used to Screen Out Black Job Applicants"
    "AI Is Being Used to Screen Out Hispanic
