15 votes

US Congress demands Jeff Bezos explain Amazon’s face recognition software

5 comments

  1. [4]
    Pugilistic
    I never knew about the racial bias in these systems. It makes sense though given the erroneous data pool these machines are being fed. That makes this tech even more terrifying for me. I wish these programs could be shut down, as the state doesn't need any more power than it already has. This is the type of technology that is just begging to be used maliciously.

    9 votes
    1. Neverland
      (edited)
      This is the type of technology that is just begging to be used maliciously.

      It’s not just malicious use that I’m worried about, it’s inept use, which in this case is use with default settings.

      Amazon’s defense of the false positives was that the ACLU used the system “incorrectly.” Meaning that they just used the default setting of 80% confidence, and not the higher recommended setting for law enforcement. Maybe law enforcement should be banned from using it at settings that do not fit the task at hand? This type of decision being left in corporate hands scares the crap out of me.
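
      To make the threshold point concrete, here is a minimal sketch (the function and the similarity scores are made up for illustration; the real Amazon Rekognition API returns similarity percentages like these) of how the same candidate matches look under the 80% default versus a stricter 99% setting:

      ```python
      # Hypothetical illustration: how the match-confidence threshold changes
      # which candidates get reported as "hits". The scores are fictional.

      def matches_above_threshold(candidates, threshold):
          """Return (name, similarity) pairs at or above the threshold."""
          return [(name, sim) for name, sim in candidates if sim >= threshold]

      # Fictional similarity scores for one probe photo against a mugshot set.
      candidates = [("person_a", 81.2), ("person_b", 88.5), ("person_c", 99.1)]

      print(matches_above_threshold(candidates, 80))  # default-style setting: 3 "hits"
      print(matches_above_threshold(candidates, 99))  # stricter setting: 1 hit
      ```

      Under the default, two borderline scores become reported matches; under the stricter setting they are filtered out, which is exactly why the choice of threshold matters for law-enforcement use.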

      While I applaud the ACLU for making this hearing happen, I have zero faith in the US Congress for doing anything with this information. Regulation, why that’s communist!

      I’m ready for the EU invasion of the USA as far as regulations go. It’s no wonder authoritarian wannabes across the world are trying to break up the EU.

      9 votes
    2. [2]
      PsychoPitcher
      The computer just needs to be fed better (more diverse) and larger data sets to eliminate racial bias. But I agree with your second point that this could be very bad when used maliciously.

      1. [2]
        Comment deleted by author
        1. PsychoPitcher
          It will come up with a low % match, and thus not be used.

  2. Neverland
    Whoever planned this at the ACLU is a genius.

    3 votes