24 votes

Signal’s Meredith Whittaker: AI is fundamentally ‘a surveillance technology’

4 comments

  1. skybrian

    Surveillance applications are easy to imagine and scary but I don't agree with "fundamentally." Training on a public dataset is not surveillance. It can't learn from anything you didn't give it.

    It's sort of like saying databases are a "surveillance technology" since sometimes databases are used that way.

    Or how about cameras? Yes, of course surveillance cameras are a thing, and lots of people buy them.

    I think the surveillance argument is stronger for cameras, though. Privacy really did become harder once there were cameras everywhere.

    12 votes
    1. Pioneer

      It's a fair point when you think about complex analytical tasks. The amount of data we've got floating around, plus the cheap compute we now have, lets us create explicit and implicit data points about almost anyone or anything.

      Those data points can (and most likely will) be used to draw negative inferences about the people they describe. Liked too many posts from a specific poster? You get a tag that isn't accepted by society. Commented one too many times about frustration with x and y policy? There goes another tag. It won't just shape how you get advertised to; it'll shape how you get investigated and followed around by a hostile actor.
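      To make that concrete, here's a toy sketch of how an implicit tag could fall out of nothing but public engagement logs. Everything in it is invented for illustration: the data, the tag name, and the threshold.

      ```python
      # Toy illustration only: the engagement log, tag rule, and threshold
      # are all invented; no real platform or dataset is being described.
      from collections import Counter

      # (user, poster_they_liked) pairs harvested from public activity.
      likes = [
          ("alice", "poster_x"), ("alice", "poster_x"), ("alice", "poster_x"),
          ("bob", "poster_y"),
      ]

      TAG_RULES = {"poster_x": "flagged_affiliation"}  # hypothetical mapping
      THRESHOLD = 3  # the "liked too many" cutoff, also hypothetical

      counts = Counter(likes)
      tags = {
          user: TAG_RULES[poster]
          for (user, poster), n in counts.items()
          if poster in TAG_RULES and n >= THRESHOLD
      }
      print(tags)  # {'alice': 'flagged_affiliation'}
      ```

      Note that nobody declared anything: the tag is inferred purely from behavior, which is exactly the implicit-data-point problem.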

      Not to mention scary legislation like the "Online Safety Bill" in the UK, which challenges encryption (even though the technology to scan encrypted messages safely doesn't exist) and makes those data points and surveillance even easier to harvest or pursue.

      LLMs may not be designed for surveillance, but it's likely we'll see some hostile ones spring up in the not-too-distant future.

      8 votes
  2. raccoona_nongrata

    I agree with her, but I think people will continue making excuses for the tech until it's far too late, people won't get it until the next Cambridge Analytica type disaster and even then many probably still won't get it or be willing to admit the problem.

    I wish people could be convinced of the long-term, permanent harm being done and how many profoundly awful practices we're normalizing, but many simply aren't at a point where they're willing to understand. We're at an inflection point, and the future of humanity looks more and more like a perpetually fascist one, in which humanity's ability to organize resistance to government and corporate overreach will be so outmatched by the tools available to those entities that no one will even be able to begin resisting.

    11 votes
  3. Amun

    Devin Coldewey


    Why is it that so many companies that rely on monetizing the data of their users seem to be extremely hot on AI? If you ask Signal president Meredith Whittaker (and I did), she’ll tell you it’s simply because “AI is a surveillance technology.”

    Read on...

    Whittaker explained her perspective that AI is largely inseparable from the big data and targeting industry perpetuated by the likes of Google and Meta, as well as less consumer-focused but equally prominent enterprise and defense companies. (Her remarks lightly edited for clarity.)

    “It requires the surveillance business model; it’s an exacerbation of what we’ve seen since the late ’90s and the development of surveillance advertising. AI is a way, I think, to entrench and expand the surveillance business model,” she said. “The Venn diagram is a circle.”

    “And the use of AI is also surveillant, right?” she continued. “You know, you walk past a facial recognition camera that’s instrumented with pseudo-scientific emotion recognition, and it produces data about you, right or wrong, that says ‘you are happy, you are sad, you have a bad character, you’re a liar, whatever.’ These are ultimately surveillance systems that are being marketed to those who have power over us generally: our employers, governments, border control, etc., to make determinations and predictions that will shape our access to resources and opportunities.”

    Ironically, she pointed out, the data that underlies these systems is frequently organized and annotated (a necessary step in the AI dataset assembly process) by the very workers at whom it can be aimed.

    “There’s no way to make these systems without human labor at the level of informing the ground truth of the data — reinforcement learning with human feedback, which again is just kind of tech-washing precarious human labor. It’s thousands and thousands of workers paid very little, though en masse it’s very expensive, and there’s no other way to create these systems, full stop,” she explained. “In some ways what we’re seeing is a kind of Wizard of Oz phenomenon, when we pull back the curtain there’s not that much that’s intelligent.”
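    As an aside (not from the article), the "ground truth" work Whittaker describes is easiest to picture as preference data. Here's a toy sketch of such a record; every field name and value is invented, not any real pipeline's schema:

    ```python
    # Toy RLHF preference record: a human annotator ranks two model outputs,
    # and thousands of such judgments become the "ground truth" a reward
    # model is trained on. All field names and values are invented.
    preference_record = {
        "prompt": "Summarize this news article in one sentence.",
        "completion_a": "Signal's president argues AI entrenches surveillance.",
        "completion_b": "An article about some technology was published.",
        "annotator_choice": "a",        # the human judgment doing the real work
        "annotator_id": "worker_1042",  # one of thousands of low-paid workers
    }
    ```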

    Not all AI and machine learning systems are equally exploitative, though. When I asked if Signal uses any AI tools or processes in its app or development work, she confirmed that the app has a “small on-device model that we didn’t develop, we use it off the shelf, as part of the face blur feature in our media editing toolset. It’s not actually that good… but it helps detect faces in crowd photos and blur them, so that when you share them on social media you’re not revealing people’s intimate biometric data to, say, Clearview.”
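    For a sense of the mechanics, here's a minimal sketch of a face-blur step. This is not Signal's code; it substitutes OpenCV's stock Haar-cascade detector for whatever off-the-shelf on-device model the app actually uses:

    ```python
    # Minimal face-blur sketch using OpenCV's bundled Haar cascade; an
    # illustrative stand-in, not Signal's actual on-device model.
    import cv2

    def blur_faces(in_path: str, out_path: str) -> None:
        img = cv2.imread(in_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            # Replace each detected face region with a heavily blurred copy.
            roi = img[y:y + h, x:x + w]
            img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 30)
        cv2.imwrite(out_path, img)
    ```

    Detectors like this miss angled or partially occluded faces, which lines up with the "not actually that good" caveat.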

    “But here’s the thing. Like… yeah, that’s a great use of AI, and doesn’t that just disabuse us of all this negativity I’ve been throwing out onstage,” she added. “Sure, if that were the only market for facial recognition… but let’s be clear. The economic incentives that drive the very expensive process of developing and deploying facial recognition technology would never let that be the only use.”

    9 votes