15 votes

Amazon scraps secret AI recruiting tool that showed bias against women

13 comments

  1. [3]
    Comment deleted by author
    Link
    1. [2]
      Gaywallet
      Link Parent

      The code was still designed by people.

      So was the analysis of the employees.

      What makes a good employee? This is honestly a very subjective thing, especially when you're talking about a high skill job such as one that requires an advanced degree or years of experience.

      6 votes
      1. Archimedes
        Link Parent

        Right. I don't know how you could possibly give the AI an unbiased dataset to learn from. How can you train it other than by seeing how well it matches human selections, which have all sorts of biases?
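
        A toy sketch of that trap (synthetic data and a made-up "screener" rule, assuming numpy and scikit-learn; nothing here is from the article): if the training labels come from a biased selection process, a model fit to them learns the bias as if it were signal.

        ```python
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 10_000

        skill = rng.normal(size=n)          # what we actually want to measure
        group = rng.integers(0, 2, size=n)  # stand-in for a protected attribute

        # Hypothetical biased screener: same skill bar, plus a flat penalty
        # for group 1.
        hired = (skill - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

        X = np.column_stack([skill, group])
        model = LogisticRegression().fit(X, hired)

        # The fitted model reproduces the screener's penalty as a large
        # negative weight on the group column: the bias is now "in the model".
        print(model.coef_)
        ```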

        4 votes
  2. [7]
    eladnarra
    Link

    Other companies are forging ahead, underscoring the eagerness of employers to harness AI for hiring.

    Kevin Parker, chief executive of HireVue, a startup near Salt Lake City, said automation is helping firms look beyond the same recruiting networks upon which they have long relied. His firm analyzes candidates’ speech and facial expressions in video interviews to reduce reliance on resumes.

    “You weren’t going back to the same old places; you weren’t going back to just Ivy League schools,” Parker said. His company’s customers include Unilever PLC (ULVR.L) and Hilton.

    So... uh... this just perpetuates discrimination in new ways, how fun (/s). Disabled people, people with accents, potentially women (if they mainly train on video of men)-- all of these folks will not necessarily match whatever the system decides is "good" speech or facial expressions. People conducting interviews can, consciously or subconsciously, be affected by job seekers seeming "different" from them, but this just codifies it.

    It's also really creepy, holy crap. Imagine being in an interview and knowing that not only are you being judged by the people interviewing you, but also some opaque algorithm that might decide your smiles didn't look genuine (or whatever it is they're actually looking at).

    7 votes
    1. [6]
      clerical_terrors
      Link Parent

      Genuine question: do you think being judged by a machine is worse in that case than being judged by a human?

      3 votes
      1. [2]
        eladnarra
        Link Parent

        I don't think that being judged by a machine is "worse" in isolation, but when combined with already existing human bias these sorts of algorithms reinforce and perpetuate that bias, especially when we so easily fall into the trap of thinking that they are more objective. If an algorithm spits out "not a good fit," that justifies any feelings an interviewer might have had that were rooted in unexamined bias, leaving even less room for them to learn how to combat that bias.

        (And on a personal level, I have experience interacting with humans in a work environment and in interviews, so at least that's a known quantity when interviewing for a job.)

        9 votes
        1. clerical_terrors
          Link Parent

          That's a fair answer, I hadn't really considered it from that angle.

          5 votes
      2. [2]
        Catt
        Link Parent

        I personally don't care who/what evaluates me as long as it's fair.

        I think the issue is when companies pretend to be objective and free of biases when those biases clearly exist.

        Canada recently had something similar in the news about immigration. A system was proposed that would use existing and historical cases to decide future ones, but of course that data set is not a perfect, unbiased set.

        3 votes
        1. [2]
          Comment deleted by author
          Link Parent
          1. Catt
            Link Parent

            Extremely good point. I know I definitely added keywords to my resume when I learned that some companies were filtering resumes based on them. I would definitely try to game the system.
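
            A hypothetical filter of the kind being gamed here (the keyword list and scoring are made up for illustration):

            ```python
            # Naive resume screen: score by counting keyword matches.
            KEYWORDS = {"python", "kubernetes", "agile", "machine learning"}

            def keyword_score(resume_text: str) -> int:
                text = resume_text.lower()
                return sum(1 for kw in KEYWORDS if kw in text)

            honest = "Five years of Python development."
            gamed = honest + " Kubernetes, agile, machine learning."

            print(keyword_score(honest))  # 1
            print(keyword_score(gamed))   # 4
            ```

            Anything this shallow rewards exactly the keyword-stuffing described above.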

            1 vote
      3. witchbitch
        Link Parent

        The problem with being judged by a machine is that people assume that a machine is necessarily "better" at judging, and so they will often defer to it. It's a kind of misplaced "Appeal To Authority"; we assume the machine MUST be objective, because it's not a human and therefore surely it's without human biases, right? But that's failing to account for the fact that its goals are set by humans, and what it considers success is set by humans, and therefore it is necessarily at LEAST as fallible as humans. AIs that judge this kind of highly subjective and complex thing are always going to have a ceiling on how successful they are, and that success ceiling will be the same as the best human who designed its brain. But there's no realistic floor to how bad its failure can be.

        So, while a machine is not inherently worse at this kind of thing, it IS problematic because people will wrongly assume that it MUST be better by virtue of being a machine, and therefore will be too inclined to not question its judgement.

        3 votes
  3. [3]
    Soptik
    Link

    AI recruiting tool that showed bias against women

    There was a trial in Australia where they started blind recruiting (the recruiter didn't know the gender of the candidate) to improve the balance of women and men hired. It didn't help. It actually made things worse (quote below; a sketch of the redaction step follows it).

    Blind recruitment trial to boost gender equality making things worse, study reveals

    Blind recruitment means recruiters cannot tell the gender of candidates because those details are removed from applications.

    In a bid to eliminate sexism, thousands of public servants have been told to pick recruits who have had all mention of their gender and ethnic background stripped from their CVs.

    "We anticipated this would have a positive impact on diversity — making it more likely that female candidates and those from ethnic minorities are selected for the shortlist," he [Professor Michael Hiscox] said.

    The trial found assigning a male name to a candidate made them 3.2 per cent less likely to get a job interview.

    Adding a woman's name to a CV made the candidate 2.9 per cent more likely to get a foot in the door.

    "We should hit pause and be very cautious about introducing this as a way of improving diversity, as it can have the opposite effect," Professor Hiscox said.
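
    The redaction step itself can be sketched minimally like this (a regex-based approach assumed purely for illustration; the article doesn't describe the trial's actual tooling):

    ```python
    import re

    # Gendered tokens to redact; a real system would need a much longer list.
    GENDERED = r"\b(he|she|him|her|his|hers|mr|mrs|ms|miss)\b"

    def anonymize(cv_text: str, candidate_name: str) -> str:
        text = cv_text.replace(candidate_name, "[CANDIDATE]")
        return re.sub(GENDERED, "[REDACTED]", text, flags=re.IGNORECASE)

    cv = "Elise Smith led her team of six. Ms Smith previously worked at ..."
    print(anonymize(cv, "Elise Smith"))
    # [CANDIDATE] led [REDACTED] team of six. [REDACTED] Smith previously worked at ...
    ```

    Note the surname still leaks in the second sentence; the mechanics are easy, doing them well is not. And per the study, even perfect redaction can backfire, because it also removes information reviewers were using in women's favour.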

    5 votes
    1. [2]
      Askme_about_penguins
      Link Parent

      I don't understand that article.

      They strip information that reveals the gender but leave the name, which often points to a specific gender (Jonathan, Lisa...)?

      That doesn't make sense to me.

      Plus, it seems to contradict itself:

      A measure aimed at boosting female employment in the workforce may actually be making it worse, a major study has found.

      The trial found assigning a male name to a candidate made them 3.2 per cent less likely to get a job interview.

      Adding a woman's name to a CV made the candidate 2.9 per cent more likely to get a foot in the door.

      So, do women get fewer jobs, but more interviews?

      Is my brain not working right, or is the article full of inconsistencies?

      1. Soptik
        Link Parent

        As I understand it, CVs were stripped of names (instead of Elise Smith, they showed E. Smith, or no name at all) and of other gender information. "Get a foot in the door" probably means "got an interview".

        The point of the article is that women already get more interviews, so anonymizing gender info can actually decrease the number of women hired.

  4. clerical_terrors
    Link

    “Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

    Probably why other companies are going to continue researching this, even if Amazon threw in the towel. Whoever gets a good (enough) model going is going to be able to sell it.
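
    Mechanically, that holy-grail engine is trivial; everything contentious hides in the scoring function. A skeleton (with a random stand-in for the model, standard library only):

    ```python
    import heapq
    import random

    random.seed(0)
    resumes = [f"resume_{i:03}" for i in range(100)]

    def score(resume: str) -> float:
        # Stand-in for a model trained on historical hiring decisions --
        # which is exactly where the bias in Amazon's tool crept in.
        return random.random()

    # "Give you 100 resumes, it will spit out the top five."
    top_five = heapq.nlargest(5, resumes, key=score)
    print(top_five)
    ```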

    1 vote