14 votes

Amazon releases "Halo" wearable which analyzes emotions in the user's voice

8 comments

  1. Fal

    I got this ad today for Amazon Halo, to which I reacted with a disconcerted "what the fuck". Last year they were caught sending data from Alexa to contractors, so it seems bizarre to me to do... this thing.

    10 votes
  2. Greg

    Weirdly, I'm less worried about the privacy concerns (perhaps I've just burned out completely on that front?), more worried about the Goodhart's law implications.

    We've already seen what optimisation for clicks, shares, and likes has done to online discourse. Chasing the most addictive elements of human behaviour leads to clickbait, manufactured outrage, and oversimplification; chasing the algorithms that present that information leads to hyper-targeted, siloed, yet oddly (paradoxically) homogenous output.

    Now put all of that into the context of your tone of voice. You're literally asked to moderate how you say things to meet a machine-defined image of what's acceptable. If this takes off, there will eventually be targets and incentives tied to this - whether through social pressure, if not directly from employers and insurers.

    Think about those painfully reductive surveys you get asking you to rank a customer service representative from 1-5 on four key metrics, all of which are subjective and three of which are totally outside their personal control. Think about the honest people who lose their jobs over this, the insane and abusive customer behaviour that gets tacitly approved in their name, and the "winners" who hang on by finding ever more creative ways to game the system. Now apply all of this to the literal way you fucking speak.

    10 votes
  3. [4]
    elcuello

    Amazon says all this data is kept safe and secure, with all the processing done locally on your phone, which then deletes the data. "Tone speech samples are never sent to the cloud, which means nobody ever hears them, and you have full control of your voice data," Majmudar wrote.

    As long as I live I will NEVER trust this.

    Kaltheuner said it's good that the Tone feature is opt-in, but anonymized data from Halo could still be shared in bulk with third parties.

    But they just said...?

    John Hancock announced Thursday it would be the first life insurer to integrate with Amazon Halo. "Starting this fall, all John Hancock Vitality customers will be able to link the Amazon Halo Band to the program to earn Vitality Points for the small, everyday steps they take to try to live a longer, healthier life," the insurance firm said in a press statement.

    Well, ain't that just marvelous. It's like we KNOW it's bad in the end and we KNOW it's coming but we just CAN'T do anything about it. Every unthinkable sci-fi movie is slowly becoming reality.

    8 votes
    1. [3]
      Greg

      But they just said...?

      Classic weasel wording. They said Tone speech samples are never sent to the cloud - so speech samples used for other purposes still may be. Not to mention location, movement, linked app usage, WiFi networks, discoverable Bluetooth devices, and a thousand other things I haven't even thought of yet.

      6 votes
      1. [2]
        GandalfTheGrey

        I interpret that to mean the sample recordings themselves are only saved locally, but the metadata on tone, content, transcription (?) are fair game for transmission.

        6 votes
        1. Greg

          That's a very good point - you can shift the implication depending on whether you emphasise Majmudar's first or second word. They really didn't rule out much at all there, while making it sound like they're making a major commitment to privacy.

          6 votes
  4. Elheffe

    At first I thought this would be good for someone like my kids, who have a hard time moderating what they are saying and how they are saying it. They are slightly autistic and this could be beneficial for them. However, there is no way in hell I would let them wear this. Privacy concerns aside, the very idea of data on how people speak being collected and fed into ML is terrifying.

    3 votes
  5. Whom

    I feel sick.

    4 votes