31 votes

The battle inside Signal - The fast-growing encrypted messaging app is developing features that would make it more vulnerable to abuse. Current and former employees are sounding the alarm.

20 comments

  1. [4]
    petrichor
    Link

    I don't think this article has much merit.

    An application built around encrypted messaging cannot and should not have "enforcement mechanisms to identify and remove bad actors", because those will always be used to quash political dissent, full stop.

    It's a pretty straightforward tradeoff. The recent deluge of hit pieces against Signal, Telegram, and end-to-end encryption as a whole worries me; I've read three of them in the past week.

    15 votes
    1. [2]
      spit-evil-olive-tips
      (edited )
      Link Parent

      I think it's more nuanced than you're giving it credit for. The people at Signal understand that tradeoff, or they wouldn't be working at Signal in the first place.

      Employees worry that, should Signal fail to build policies and enforcement mechanisms to identify and remove bad actors, the fallout could bring more negative attention to encryption technologies from regulators at a time when their existence is threatened around the world.

      “The world needs products like Signal — but they also need Signal to be thoughtful,” said Gregg Bernstein, a former user researcher who left the organization this month over his concerns. “It’s not only that Signal doesn’t have these policies in place. But they’ve been resistant to even considering what a policy might look like.”

      Signal, in its original form as purely a messaging app, was able to sidestep the need for any sort of content policy, by virtue of their employees not being able to see any user content.

      The concern these employees have, as I read it, is that Signal is expanding beyond that original "just an extremely secure messaging app" goal, but retaining their original "technology is apolitical and we don't have to take a stand about anything" mindset.

      At the same time, employees said, Signal is developing multiple tools simultaneously that could be ripe for abuse. For years, the company has faced complaints that its requirement that people use real phone numbers to create accounts raises privacy and security concerns. And so Signal has begun working on an alternative: letting people create unique usernames. But usernames (and display names, should the company add those, too) could enable people to impersonate others — a scenario the company has not developed a plan to address, despite completing much of the engineering work necessary for the project to launch.

      Usernames implemented in this way will need to be publicly viewable and not end-to-end encrypted. Say Alice Smith is a journalist, and wants people to be able to message her using @AliceSmith instead of needing a phone number. She goes to register... and finds the name is already taken. Was it taken by another actual person named Alice Smith, or by some troll who thought it'd be funny to squat on the usernames that might be used by a bunch of journalists (and potentially, pretend to be that journalist, then leak the contact info / details of sources who thought they had confidentiality)?

      If Signal wants to say "yeah, that'll happen, and our official policy is that we don't offer any recourse or appeals process, sucks to be you for not registering the name first, deal with it", then...at least that's a stance. It's not a stance I agree with, but at least it's a stance. From the description given in the article, they haven't even gotten that far.

      Similarly, am I allowed to register "HeilHitler1488" as my username and go around spamming "HeilHitler1488 would like to message you" requests to everyone I can?

      There are debatable pluses and minuses to every policy that could be used to deal with these scenarios. But I think this "our policy is not to have a policy" thing is pure head-in-the-sand thinking:

      “The response was: if and when people start abusing Signal or doing things that we think are terrible, we'll say something,” said Bernstein, who was in the meeting, conducted over video chat. “But until something is a reality, Moxie's position is he's not going to deal with it.”

      The entire history of trolling on the internet shows that any system that can be abused by trolls will be abused by trolls. I don't think it's responsible to say "we're building this system, and we'll put it out there and then see if it gets any abuse before we think about how we want to deal with the abuse".

      10 votes
      1. petrichor
        (edited )
        Link Parent

        The concern these employees have, as I read it, is that Signal is expanding beyond that original "just an extremely secure messaging app" goal, but retaining their original "technology is apolitical and we don't have to take a stand about anything" mindset.

        I disagree. A messaging app is a messaging app whether it facilitates one-on-one conversations or group communication. If you can't see or extrapolate user content, you shouldn't need a content policy.

        Usernames implemented in this way will need to be publicly viewable and not end-to-end encrypted. Say Alice Smith is a journalist, and wants people to be able to message her using @AliceSmith instead of needing a phone number. She goes to register... and finds the name is already taken.

        I didn't touch on it above to be succinct, but all the talk of "usernames" within the article is baseless speculation. Signal has not given any indication of going this route, and it's very unlikely that they will.

        A phone number currently serves two purposes in Signal:

        1. to act as a semi-unique identifier, which when coupled with a passphrase, forms your Signal Profile (which is private and encrypted),
        2. and as a form of identity to initially connect to others.

        Whatever replacement system Signal implements is unlikely to depend on the former to solve the latter. A common solution is to require you to verify a friend's identity through other means of communication or by a QR code, like how Element / Matrix currently cross-signs devices.

        Similarly, am I allowed to register "HeilHitler1488" as my username and go around spamming "HeilHitler1488 would like to message you" requests to everyone I can?

        Sure. You'll find you can already do that by setting your display name, if you so choose. But "everyone you can" is limited - it's not like there's some public directory of Signal Profiles.

        Edit: To elaborate on this, you currently supply a phone number to Signal like you're going to text somebody. Signal grabs this, hashes it, finds the account associated with the hash, and starts up a conversation. The only way for, say, a Tildes member to contact me is through a mutual contact or by asking me for my number.
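        That lookup flow can be sketched roughly like this (a simplified illustration only; the function names and directory structure here are hypothetical, not Signal's actual API, and Signal's real contact discovery is considerably more involved):

        ```python
        import hashlib

        def hash_number(phone_number: str) -> str:
            """Hash a phone number so the server never sees it in the clear."""
            return hashlib.sha256(phone_number.encode("utf-8")).hexdigest()

        # The server's directory maps hashed numbers to account IDs; it never
        # stores the raw numbers themselves.
        server_directory = {
            hash_number("+15551234567"): "account-42",
        }

        def look_up(phone_number: str):
            """Client hashes the number locally, then queries the directory."""
            return server_directory.get(hash_number(phone_number))

        look_up("+15551234567")  # -> "account-42"
        look_up("+15550000000")  # -> None: no such user registered
        ```

        One caveat worth noting: because the space of valid phone numbers is small, plain hashes like this are brute-forceable, which is part of why Signal later moved contact discovery into secure enclaves on the server side.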

        The entire history of trolling on the internet shows that any system that can be abused by trolls will be abused by trolls. I don't think it's responsible to say "we're building this system, and we'll put it out there and then see if it gets any abuse before we think about how we want to deal with the abuse".

        I think that's a perfectly responsible approach. What kind of abuse are you expecting?

        5 votes
    2. GoingMerry
      Link Parent

      I agree. The article feels like it’s assuming all content should be moderated, which is a new-ish paradigm that actually runs contrary to the original spirit of the internet (IMO).

      Do creators of a tool have a responsibility when said tools are used to hurt others? Are some tools inherently “bad”? These are questions people have been struggling with for ages. Cryptography used to be illegal for export from the US for these reasons, but the good things it’s brought us in the years since seem to outweigh the bad.

      Personally I’m cheering Signal on. They’ve set up their system to get around the anti-privacy Five Eyes. I’m obviously anti-terrorism, and I think it’s possible for law enforcement to stop threats without requiring all citizens to give up their privacy. To say otherwise is a false dichotomy, in my opinion.

      6 votes
  2. [15]
    kfwyre
    Link

    I'm genuinely torn on what to think about things like Signal. I use it myself, and I consider end-to-end encryption of my private messages to be very valuable. I'm absolutely on board with the ideal that the only people who should be able to read my communications are the people for whom they're intended.

    On the other hand, I also think that platforming hate groups is wrong and dangerous, and I think speech changes from a form of "private" to "public" once its audience surpasses a certain size (the cutoff of which I can't pin down). A Signal user messaging 1 or 3 or 10 other Signal users feels "private" to me, but a Signal user messaging 999 other users in a group message feels "public" to me. I understand that these are not actually what "private" and "public" mean, but I don't have better words for how those examples feel at the moment -- the intangibles of how those two types of communication differ. The protections that I feel are essential to the smaller groups seem much more dangerous when they scale up to larger and larger numbers of people.

    I say this not to make any actual point but more to point out that I genuinely don't know how to feel about that aspect of Signal or other E2EE messaging platforms. We've seen, from Twitter and Facebook, what happens when scale outgrows moderation, and it seems like Signal and others are simultaneously chasing that scale while also removing the capabilities for moderation entirely.

    Two years from now are there going to be 100,000 person neo-Nazi groups organizing on Signal, and are spammers going to be able to mass message millions of users with impunity? It seems like limiting reach is a way of limiting damage while still enabling the sort of "private" conversations that tend to happen in real life -- one-to-one and small group discussions. That way, even if neo-Nazis or spammers do join and use the platform, their scope is far narrower and their harmful actions aren't passively enabled by the platform's lack of limits. However, it also seems like most platforms aren't particularly interested in these limits, and I do acknowledge that there are benefits to groups that size as well (the article mentions BLM organizers using it, for example).

    I don't know, and I feel like I'm not knowledgeable enough in all the requisite areas of this discussion to know in the first place. We have a lot of privacy-minded, community-minded, technologically savvy experts here who can probably illuminate this for me a lot better than I'm doing myself. Anyone have any thoughts?

    20 votes
    1. [7]
      Comment deleted by author
      Link Parent
      1. [3]
        kfwyre
        Link Parent

        Ah, this actually helps me close in a little bit on what I was trying to say earlier: 1000 people on a Signal group feels like people that are being "platformed" to me, but 10 people in a group doesn't feel like that. I realize I'm probably using the term "platform" wrong here (in the same way I did with "public" and "private" in my previous comment), but I genuinely don't have the right words here. The main idea is that I feel like there's some definable point at which the fundamental nature of communication changes once a group moves past a certain size, and communication of that type should be treated with different standards. In another sense: one feels more like talking, and the other feels more like broadcasting.

        12 votes
        1. [3]
          Comment deleted by author
          Link Parent
          1. [2]
            kfwyre
            Link Parent

            That’s definitely true, and picking on Signal can feel a little unfair when Telegram has been doing this more and longer (I think their group limit is 200K? But, to be fair, I also don’t think they’re E2EE?).

            1000 is relatively small in the grand scheme of things, but I was also thinking about it in more local terms. There’s a rather toxic Facebook group in my area that has a benign name but should really be called “Ungrateful Parents Talk Shit About Their Children’s Teachers”. It has way less than 1000 people but still does a ton of tangible damage to our community. Something like that would have no oversight at all on Signal, but then again, I’m probably picking on Signal unfairly, as it already has no oversight on Facebook anyway! In fact, we would probably be better off with it on Signal because then it wouldn’t be open to everyone.

            Let it be known that I’m confused enough about this that I successfully talked myself out of my own point.

            12 votes
            1. Grzmot
              Link Parent

              (I think their group limit is 200K? But, to be fair, I also don’t think they’re E2EE?).

              Telegram is encrypted, but only in a certain type of chat which is separate from the main chat. And most really large groups are more akin to someone's Twitter feed than to people really communicating.

              Telegram is open-source, but their encryption is homebrewed and has received a lot of criticism from the cryptography community.

              6 votes
      2. [3]
        skybrian
        Link Parent

        I'm not sure what platform means other than audience size, though? And it seems like audience size can grow exponentially once you can copy messages between group chats. The users can make their own platform on those underpinnings.

        But I haven't used Signal's groups. Does Signal allow that?

        2 votes
        1. [3]
          Comment deleted by author
          Link Parent
          1. [2]
            Cycloneblaze
            Link Parent

            Marlinspike is a jerk and it's quite unfortunate that Signal is The Encrypted Chat App that everyone recommends

            Do you have any links that talk about why that is? I haven't heard much criticism of Signal, besides the subject of this thread

            4 votes
            1. [2]
              Comment deleted by author
              Link Parent
              1. SpineEyE
                Link Parent

                I think his highest priority is to offer something that is more secure/private than the current market leaders.

                And with that goal comes the necessity to move quickly. Name an alternative app that users can install on their iPhone or Android phone with a similar breadth of features, or rate of feature growth, as Signal.

                He might think a little too highly of himself (I don't know him personally), but ultimately he offers us a better, more private/secure alternative than the current market leaders. Sure, it's not the pinnacle, and yeah, you need to trust him (and his team) not to destroy the service. Then again, you don't really have to trust him, as you can reproducibly compile the app or, in theory, even fork the project and use your own servers.

                But the important point here is that it is progress over the status quo, and therefore I don't understand why you should not recommend it to the mainstream, i.e. people currently using WhatsApp, Telegram, Viber, etc.

                4 votes
    2. [5]
      spit-evil-olive-tips
      Link Parent

      A Signal user messaging 1 or 3 or 10 other Signal users feels "private" to me, but a Signal user messaging 999 other users in a group message feels "public" to me. I understand that these are not actually what "private" and "public" mean, but I don't have better words for how those examples feel at the moment -- the intangibles of how those two types of communication differ.

      The critical distinction in my mind - the group chats I have in Signal are all people I actually know. If Signal disappeared overnight, we'd find another messaging app, and coordinate the move because we all have each others' phone numbers and other contact info. We don't actually need the privacy of Signal, but we're all geeky enough that we like sending each other memes and cat gifs using state-of-the-art encryption.

      The dividing line to me is once it's no longer an "everyone knows everyone" group. In my group chats with friends, I'm accountable. If I started sharing misinformation or whatever, they'd call me on my bullshit, and if I continued I'd risk losing those friends. That's not possible once it's a group chat with people whose opinion of me I don't care about.

      The most worrying detail to me is this - Signal is adding features that make these "too large to be accountable" groups easier to create:

      On October 28th, Signal added group links, a feature that has become increasingly common to messaging apps. With a couple of taps, users could begin creating links that would allow anyone to join a chat in a group as large as 1,000 people. And because the app uses end-to-end encryption, Signal itself would have no record of the group’s title, its members, or the image the group chose as its avatar. At the same time, the links make it easy for activists to recruit large numbers of people onto Signal simultaneously, with just a few taps.

      10 votes
      1. [3]
        skybrian
        Link Parent

        A counterexample might be Facebook, though, where misinformation is spread by people you know and when you tell them it's bullshit, they just get mad and insist on their right to post whatever they like.

        You can stop following them, and then it's nice to just talk to your friends and that means you don't see that stuff, but it's still going to be out there.

        6 votes
        1. [2]
          whispersilk
          Link Parent

          What you say about Facebook is true (or at least can be true—I know Facebook can also feed you things from pages and things from friends-of-friends that your friends have interacted with), but a lot of that stuff is those people seeing a "news" article or bad meme and hitting share, not actually presenting views they developed themselves or engaging in any real way. While it would be possible to dump a link or meme in a Signal group as well, the friction involved in doing so would be substantially higher.

          To put it more directly, while your view of Facebook is an "everyone knows everyone" group if you don't subscribe to any pages or groups, the Facebook ecosystem as a whole is very much not, and Facebook makes it frictionless for someone you know to signal-boost views that originate far outside of your circle.

          8 votes
          1. skybrian
            Link Parent

            On Facebook I also see cut-and-paste text memes and people sharing meme images they uploaded themselves. Meme sharing is popular because people like the memes. Not having a quick reshare might cut down on the volume a bit, and that’s helpful, but people will learn the workarounds.

            It is true that sharing a meme doesn’t necessarily mean they really believe it or have given it much thought, but it’s still shared. What does sharing a Bernie meme say about your beliefs?

            Simple tricks like having text that says “only true patriots will share this” seem to work really well. The text memes often explicitly instruct the reader to copy and paste them.

            I think the only real solution is having a host or mods that tell people not to share memes.

            3 votes
      2. kfwyre
        Link Parent

        The dividing line to me is once it's no longer an "everyone knows everyone" group.

        Yes! This is helpful and gets at what I was trying to get at. I feel like there’s a distinction between “social circle” and “organization” that’s primarily identified through group size.

        4 votes
    3. [2]
      joplin
      Link Parent

      A Signal user messaging 1 or 3 or 10 other Signal users feels "private" to me, but a Signal user messaging 999 other users in a group message feels "public" to me. I understand that these are not actually what "private" and "public" mean, but I don't have better words for how those examples feel at the moment -- the intangibles of how those two types of communication differ.

      This sounds to me similar to Dunbar's number:

      Dunbar's number is a suggested cognitive limit to the number of people with whom one can maintain stable social relationships—relationships in which an individual knows who each person is and how each person relates to every other person.

      Once you're in a group where you've surpassed that number you have some cover of anonymity with the people who don't know you, and you become unable to keep track of all the other people in the group, and the dynamics change. (And obviously Dunbar was talking about in-real-life relationships, not number of people in an internet chat, so it's not a 1:1 correspondence, but it seems related.)

      9 votes
      1. TeMPOraL
        Link Parent

        I've long maintained the belief that crossing Dunbar's number is the point at which a group has to split, or introduce some sort of hierarchical governance system, because it can no longer rely solely on interpersonal relationships to foster cooperation and police behavior. The hierarchies may appear earlier (particularly if the group is a named thing that people voluntarily join - the hierarchy starts with the people most active in building the group), but there's no need for any kind of formal process as long as everyone repeatedly interacts with everyone else.

        6 votes
    4. skybrian
      Link Parent

      I don't know either, but this feels to me like the rise of a new tech giant. Maybe not in terms of money, but in influence. At first everyone likes them (all the cool kids anyway), there is rapid growth, usage changes due to becoming mainstream in a way that the company can't govern (and at first didn't want to govern), and then everyone hates them, while still using them.

      If so, it will disprove certain theories about how being for-profit or depending on advertising is what makes this happen.

      6 votes
  3. [2]
    Comment deleted by author
    Link
    1. petrichor
      Link Parent
      "Content policy" plays into this if Signal can identify what content is spread on their networks, at which point it becomes no different than, say, Instagram. They've generally been unable to...

      "Content policy" plays into this if Signal can identify what content is spread on their networks, at which point it becomes no different than, say, Instagram.

      They've generally been unable to identify this in the past, due to end-to-end encryption working and metadata about who's connecting to whom being kept to the absolute minimum. But if they logged more, like who's connecting to whom, the general content of the messages could be extrapolated without the actual messages themselves being leaked. That's where a "content policy" would come in.

      The article argues that this could be used to boot off Nazis. More likely than not, this would be used to boot off journalists, or identify whistleblowers.

      7 votes