16 votes

Congress demands Jeff Bezos explain Amazon’s face recognition software

7 comments

  1. [6]
    Pugilistic Link

    I never knew about the racial bias in these systems. It makes sense, though, given the erroneous data pool these machines are being fed. That makes this tech even more terrifying for me. I wish these programs could be shut down, as the state doesn't need any more power than it already has. This is the type of technology that is just begging to be used maliciously.

    10 votes
    1. Neverland (edited) Link Parent

      This is the type of technology that is just begging to be used maliciously.

      It’s not just malicious use that I’m worried about, it’s inept use, which in this case is use with default settings.

      Amazon’s defense of the false positives was that the ACLU used the system “incorrectly.” Meaning that they just used the default setting of 80% confidence, and not the higher recommended setting for law enforcement. Maybe law enforcement should be banned from using it at settings that do not fit the task at hand? This type of decision being left in corporate hands scares the crap out of me.
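      A minimal sketch of what that setting changes (the match scores and names below are invented for illustration; the 80% default is from Amazon's own response, and Amazon's public guidance at the time recommended a 99% threshold for law-enforcement use):

      ```python
      # Hypothetical face-match results, as a list of (identity, confidence) records.
      matches = [
          {"name": "person_a", "confidence": 81.2},
          {"name": "person_b", "confidence": 93.7},
          {"name": "person_c", "confidence": 99.4},
      ]

      def filter_matches(matches, threshold):
          """Keep only matches at or above the given confidence threshold."""
          return [m for m in matches if m["confidence"] >= threshold]

      default_hits = filter_matches(matches, 80.0)  # default setting: all 3 count as "hits"
      strict_hits = filter_matches(matches, 99.0)   # recommended setting: only 1 "hit"

      print(len(default_hits), len(strict_hits))  # 3 1
      ```

      The point being: nothing in the system stops an agency from running at 80%, so three people get flagged instead of one, and that choice sits entirely with whoever types in the number.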

      While I applaud the ACLU for making this hearing happen, I have zero faith in the US Congress for doing anything with this information. Regulation, why that’s communist!

      I’m ready for the EU invasion of the USA as far as regulations go. It’s no wonder authoritarian wannabes across the world are trying to break up the EU.

      10 votes
    2. [4]
      PsychoPitcher Link Parent

      The computer just needs to be fed better (more diverse) and larger data sets to eliminate racial bias. But I agree with your second point that this could be very bad when used maliciously.

      1. [3]
        deadaluspark Link Parent

        What about when faced with using low-res, crappy cameras? You know, the type really designed from the ground up for people with white skin, because when you go to take a photo of a dark skinned person, suddenly everything looks wrong.

        I mean, this problem goes back to how cameras and film were originally produced. For quite a while, you just couldn't take a photo of a dark skinned person and have them look like they do in real life. This continues to be a consistent problem on cheap security cameras, in my personal experience. Seriously, look into the early history of cameras, taking photos of anyone who wasn't white was 100% an afterthought, and due to that, the technology was designed around working for people with white skin.

        So, the better (and more diverse) data will help the baseline system, but how much will it help when the camera source for the facial recognition is really crappy? Then the baseline system is still dealing with low-res, badly lit, un-white-balanced garbage that it somehow has to parse into a result.

        2 votes
        1. [2]
          PsychoPitcher Link Parent

          It will come up with a low % match. And thus not be used.

          1. deadaluspark Link Parent

            I think you're misunderstanding me.

            I'm saying the system has already been deployed; this is not about adding new images to the database.

            This is about using said database to try to recognize people based on grainy, pixelated, or otherwise crappy video sources. These are the kind of video sources law enforcement will be putting into the system, looking for a match.

            If it's the only video of the crime, do you think the cops will be willing to ignore a match, even a low-percentage one, if it's their only evidence?

            Cops have railroaded people with/for far less.

  2. Neverland Link

    Whoever planned this at the ACLU is a genius.

    3 votes