14 votes

Gender, race, and intersectional bias in resume screening via language model

2 comments

  1. sparksbet
    Hardly surprising, as there's no reason we'd expect them not to have the same problems as earlier language models in this respect, which we know are incredibly good at being biased in this way even when we try to eliminate sources of information about race and gender. These are pattern-finding machines, after all -- and these patterns are deeply embedded in our world and our language use. The unfortunate thing is that, unlike previous high-profile examples of biased models used for resume screening, I doubt this will stop companies from using them for this purpose, which is just going to compound the issue and further disadvantage those who are already at a disadvantage.

    6 votes