15 votes

Amazon's face recognition falsely matched twenty-eight members of Congress with mugshots

4 comments

  1. chocolate

    Four things.

    1. I'd like to see the photos they were falsely matched with. If they are genuinely people who look like them, that's just the way faces and facial recognition work.

    From the article: "Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress."

    2. 'People of colour' being falsely matched at a higher rate could indicate one of three things. The software may be performing poorly (possible). Certain ethnicities may look more similar than others (true in some cases, but certainly not for African Americans, who tend to be racially mixed). Or there may simply be a disproportionately large pool to get a false match against (most probable, and easiest to verify, as police records generally include a race) - there's a rough sketch of this pool-size effect below.

    3. There also seems to be a massive gender disparity - only one of the 28 positives appears to be a woman. The article doesn't tell us the makeup of the sample group, but if it were 50/50, then men were 27x more likely to be flagged. Even if the flagged members were effectively a random sample of Congress (as the above quote suggests), women make up roughly 20 percent of members, so there should still be 5-6 women among the 28.

    4. Are we really sure they were false matches? Joking.

    This is a serious issue. The racial and gender disparities may be down to poor software, which can be fixed. If they are instead a consequence of the different relative pool sizes, we're in trouble - no matter how accurate the software gets, people have doppelgangers.
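
    To put rough numbers on that pool-size effect, here's a minimal sketch. The per-comparison false-match rate p is entirely assumed (it is not a measured Rekognition figure); the point is only how quickly the odds grow with the size of the mugshot pool:

    ```python
    # Minimal sketch: if each one-to-one comparison has an independent
    # false-match probability p, scanning one probe face against a pool of
    # N mugshots gives P(at least one false match) = 1 - (1 - p)**N.
    p = 1e-4  # assumed per-comparison false-match rate, illustration only

    for n in (1_000, 10_000, 100_000, 1_000_000):
        prob = 1 - (1 - p) ** n
        print(f"pool of {n:>9,} mugshots -> P(>=1 false match) = {prob:.3f}")
    ```

    With those assumed numbers, a million-record pool makes at least one false match a near-certainty for every probe photo.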

    Police aren't even my biggest concern. Imagine not being able to get a job because employers start using this en masse and you just happen to look identical to the police sketch of a serial arsonist. You would probably never know, and even if you did, are you going to include a disclaimer on your resume?

    4 votes
    1. Emerald_Knight

      From the parent comment: "Certain ethnicities look more similar than others"

      I recall reading once (I honestly can't remember where) that a particular trait or set of traits tends to vary more within certain ethnicities than in others. Thus, software that is trained primarily on Caucasians will likely be biased toward facial features that are more prominent and identifying among Caucasians. If those facial features don't vary as much among other ethnicities, where a different set of features is typically more prominent and identifying, then false matches are likely to occur as a result.
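
      That mechanism is easy to simulate with made-up numbers: if the measured features spread out widely for one group but cluster tightly within another, two distinct people from the tightly-clustered group are far more likely to fall within the match threshold. A toy sketch, all values synthetic:

      ```python
      # Toy sketch with synthetic data: narrower within-group variation in
      # the measured features means more "template collisions", i.e. false
      # matches between distinct people.
      import numpy as np

      rng = np.random.default_rng(0)

      def false_match_rate(spread, n_people=500, n_features=8, threshold=1.0):
          # One template per distinct person; a false match is any pair of
          # different people whose templates fall within the match threshold.
          t = rng.normal(0.0, spread, size=(n_people, n_features))
          d = np.sqrt(((t[:, None, :] - t[None, :, :]) ** 2).sum(-1))
          return float(np.mean(d[np.triu_indices(n_people, k=1)] < threshold))

      print("wide feature spread:  ", false_match_rate(spread=1.0))  # near zero
      print("narrow feature spread:", false_match_rate(spread=0.3))  # far higher
      ```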

      I also seem to recall something about a machine learning algorithm being trained such that the images containing the type of subject intended to be identified all happened to also have copyright statements, and so it was inadvertently trained to look for those copyright statements rather than the intended subject. It's similar to the problem presented here.
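
      The second anecdote is the classic "shortcut learning" failure, and it's easy to reproduce on made-up data: give a classifier a spurious feature that perfectly tracks the label during training, and it will lean on that instead of the real, noisier signal. A sketch, everything synthetic:

      ```python
      # Shortcut learning on synthetic data: in training, a "watermark"
      # feature perfectly tracks the label, so the model learns the
      # watermark instead of the weak, noisy "subject" signal.
      import numpy as np

      rng = np.random.default_rng(0)

      def make_data(n, watermark_tracks_label):
          y = rng.integers(0, 2, n)
          subject = y + rng.normal(0, 1.5, n)  # weak true signal, heavy noise
          if watermark_tracks_label:
              watermark = y.astype(float)      # spurious, but perfect in training
          else:
              watermark = rng.integers(0, 2, n).astype(float)  # independent
          return np.column_stack([subject, watermark]), y

      def train_logreg(X, y, steps=2000, lr=0.1):
          w, b = np.zeros(X.shape[1]), 0.0
          for _ in range(steps):
              p = 1 / (1 + np.exp(-(X @ w + b)))
              w -= lr * X.T @ (p - y) / len(y)
              b -= lr * float(np.mean(p - y))
          return w, b

      X_tr, y_tr = make_data(5000, watermark_tracks_label=True)
      w, b = train_logreg(X_tr, y_tr)
      print("weights [subject, watermark]:", w)  # watermark weight dominates
      print("train accuracy:", np.mean(((X_tr @ w + b) > 0) == y_tr))

      # At test time the watermark no longer tracks the label.
      X_te, y_te = make_data(5000, watermark_tracks_label=False)
      print("test accuracy: ", np.mean(((X_te @ w + b) > 0) == y_te))
      ```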

      Unfortunately I don't have sources for these and a quick search isn't yielding any helpful results. If anyone wants to provide sources supporting or debunking this, please feel free to do so :)

      3 votes
  2. patience_limited

    The other troubling piece of this is the sheer number of people (predominantly people of color) who have mugshots in various criminal justice information systems in the first place. How many are photos of juveniles, how many of people who were never actually convicted of a crime, how many are so poorly transmitted or preserved that they're essentially too noisy for identification?

    With so many people to match against, how much greater is the likelihood of false matches?
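
    As a back-of-the-envelope answer (every number below is assumed): the expected count of false hits grows linearly with both the size of the database and the volume of searches run against it, so growth in either makes false matches routine.

    ```python
    # Rough sketch, all numbers assumed: expected false hits scale linearly
    # with database size and with how many searches are run against it.
    p = 1e-5                  # assumed per-comparison false-match rate
    searches_per_day = 1_000  # assumed search volume

    for db_size in (100_000, 1_000_000, 10_000_000):
        expected = searches_per_day * db_size * p
        print(f"database of {db_size:>10,}: ~{expected:>9,.0f} false hits/day")
    ```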

    I say this in part because I once spent a few hours isolated at a U.S./Canadian border checkpoint, being asked some very discomfiting questions ("have you ever handled explosives?"). I was told on release that I "matched" a dubiously general physical description and a very fuzzy faxed Interpol photo.

    At the time, it was an inconvenience, if a moderately chilling one. There weren't yet hair-trigger police and opaque background checks for everything - jobs, housing, credit, child custody.

    Facial recognition is far too risky, and far too open to abuse.

    4 votes
  3. SourceContribute

    Maybe it's just predictive policing? heh.