31 votes

EU Council has withdrawn the vote on Chat Control

14 comments

  1. [6]
    Protected
    Link
    Of course Hungary is going to try again.

    I did send the Portuguese mission a fairly thorough explanation of why this is a terrible idea (complete with bold highlights to make it easier to parse); I hope it was/is of any use. We just need to keep saying no.

    18 votes
    1. [5]
      Oslypsis
      Link Parent
      Would you mind posting it here? I'm curious to know exactly the reasons why it's a bad idea. I know it is, but idk why it is. Y'know?

      5 votes
      1. [3]
        CannibalisticApple
        Link Parent
        Not who you asked, but a basic summary: most data is encrypted so that only people with the keys can view it in an unscrambled state. For instance, data stored by Google would be indecipherable to a hacker unless they also get a key to decrypt it. Google employees do have keys though, and can view encrypted data if they have the right permissions.

        With end-to-end encryption, it's specifically set up so that only the sender and recipient of messages/data can read them. If you're using Facebook messenger, Facebook won't have a key to decrypt it, so your conversations are private. A random Facebook employee can't log in and see your conversations. If a hacker intercepts your messages, they can't see them either (well, unless they hacked your account to log into it).
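
        A toy sketch of the key-holder idea described above (the XOR "cipher" here is purely illustrative and not the actual algorithm Signal or Messenger use, which is far more sophisticated):

        ```python
        # Illustration only: whoever holds the key can unscramble the data;
        # anyone who intercepts only the ciphertext sees random-looking bytes.
        import secrets

        def xor_cipher(data: bytes, key: bytes) -> bytes:
            """Combine data with an equally long key, byte by byte."""
            return bytes(b ^ k for b, k in zip(data, key))

        message = b"meet at the usual place"
        key = secrets.token_bytes(len(message))  # only sender and recipient have this

        ciphertext = xor_cipher(message, key)           # what a server or hacker would see
        assert xor_cipher(ciphertext, key) == message   # the key holder recovers the message
        print(ciphertext.hex())                         # unreadable without the key
        ```

        With end-to-end encryption the service provider is in the same position as the hacker here: it relays the ciphertext but never holds the key.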

        This is crucial for privacy and security because some information is VERY sensitive. Think about legal communications, health, financial discussions that involve bank account details, discussions or sharing of classified materials, etc. There's a lot of information that, if intercepted by the wrong party, could have catastrophic consequences. Even outside of messengers and the like, a lot of password managers (if not all) use it because if those get hacked, you're screwed.

        A lot of governments dislike it because that security means they can't read the messages either. This means they can't force companies to turn over conversations with potential evidence of terrorism and other crimes, which is basically the main argument I see against end-to-end encryption. However, abolishing it entirely, as some insist, would put any information shared online at high risk, and would also create plenty of backdoors for rogue employees of major companies like Google to cause pretty serious havoc.

        15 votes
        1. [2]
          Macha
          (edited )
          Link Parent
          It's also worth mentioning that even the on-device scanning has issues:

          1. A lot of the proposals call for AI similarity checking against the types of illegal content, but this has a lot of problems. The German police already complain about a very high false positive rate even with systems that are supposed to be checking against known images. Think about the volumes of images on these platforms: hundreds of millions of totally innocent images a day, plus plenty of legal images shared by consenting adults with each other, alongside maybe hundreds or thousands of the images these systems are supposed to detect. You might think a 0.1% false positive rate on innocent images and a 10% false positive rate on legal but pornographic images means that only about 10% of reports waste time, but run the numbers: 100 million innocent images produce 100,000 false reports, another 100,000 legal adult images produce 10,000 more, against perhaps 100 genuine detections. With those numbers, well under 1% of reports are valid.
          2. Some amount of the illegality depends on the context too, which means the answer to fixing this is not just "well soon they'll make better models". If one parent sends a picture of their kid in a swimsuit to the other parent to show how excited the kid is for their first swimming lesson, that is not CSEM. Nor is it if that kid is in the background of someone else's beach photo. However, if someone is selling a collection of 1,000 kids in swimsuits, then it probably is, despite being an aggregation of the same photos.
          3. What's your approach for verifying those claims? A lot of the suggestions have the images being sent to human moderators/police before action is taken. Now think of how many police officers, moderators, etc. there are. It's not even a numbers game: at that scale there are going to be cases where some of them save a few copies for themselves. Certainly of the false positives, but possibly of the illegal content too. After all, there are a few dozen cases a year of police officers stalking their exes and the like.
          4. How do you stop e.g. a returning PiS or Orban deciding that photos of LGBT content are now to be detected by this system too? We've seen with China and other authoritarian regimes that once such a system exists, the tech companies will give access for other reasons.
          5. What's to stop mission creep of such a system? The MPAA hears how great it's been at detecting content with an AI model, so they'll want in on detecting copyright violations next. (After all, much of the infrastructure used for anti-piracy blocks at ISPs was originally introduced in the name of blocking terrorism, drug sales and the like.)
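
          The base-rate arithmetic from point 1 can be sketched in a few lines (all numbers are illustrative assumptions, not real platform statistics):

          ```python
          # What fraction of reports sent to reviewers are actually valid,
          # given realistic image volumes and per-category false positive rates?

          def report_precision(innocent, innocent_fpr,
                               legal_adult, legal_fpr,
                               actual_csam, detection_rate):
              false_reports = innocent * innocent_fpr + legal_adult * legal_fpr
              true_reports = actual_csam * detection_rate
              return true_reports / (true_reports + false_reports)

          p = report_precision(
              innocent=100_000_000,  # innocent images scanned per day (assumed)
              innocent_fpr=0.001,    # 0.1% false positive rate on them
              legal_adult=100_000,   # legal images shared between adults (assumed)
              legal_fpr=0.10,        # 10% false positive rate on legal nudity
              actual_csam=100,       # images the system is meant to find (assumed)
              detection_rate=1.0,    # generously assume perfect detection
          )
          print(f"{p:.4%}")  # well under 1% of reports are valid
          ```

          Even with a perfect detection rate, the sheer volume of innocent images means false reports swamp the genuine ones.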
          20 votes
          1. Protected
            Link Parent
            You two got most of it. I explained how this mechanism leaks supposedly secure communications data by intercepting it before the tunnel, and how this undermines said security and can put people and sensitive relationships at risk. I mentioned the fallibility of AI and how it would quickly result in the private data of innocents being sent to other humans. I also mentioned this would likely backfire, citing as examples companies blocking Europeans after the GDPR and aggregators blocking Canadian news agencies after C-18 (some companies have already threatened to pull out of the EU if Chat Control is enacted). Finally, I mentioned the incompatibility of the regulation with the fundamental rights of European citizens and its incongruence with national laws.

            6 votes
      2. caliper
        Link Parent
        ELI5: let’s assume all houses use the same brand of lock on the front door. The police ask the company that makes these locks to create a master key so they can enter all houses and catch more criminals. Not bad, right? But since the master key exists, you can be sure it will be copied and end up in the wrong hands. Not just burglars, but also corrupt policemen. Now nobody’s house is safe anymore.

        This will happen with these proposals too: a back door is created and history has already proven access will fall into the wrong hands. Just think how awesome this back door will be for a dictatorship. Or a country trying to overthrow a government in another country. Politicians use these types of communication too to talk about state secrets. Journalists use it to investigate wrongdoing. Privacy is a very important part of a free society.

        4 votes
  2. [4]
    tauon
    (edited )
    Link
    Discussion on PrivacyGuides.net

    Original source: in German, update: also available in English

    Primary source: EU council livestream

    The German article calls this a partial victory, but the vote has really “only” been called off as the (Belgian) council leadership seemingly realized that there wouldn’t be a majority, had this come to a vote today.

    The upcoming Hungarian council presidency, which takes over in July, has already announced its intent to further pursue this (IMHO awful) piece of legislation.


    Context: “Chat control” – which really is just a poor translation of the term coined by its vocal German opponents – is the EU’s attempt at, effectively, removing end-to-end encryption under the pretense of CSAM scanning.

    As mentioned in this article too, the privacy-focused messenger apps Signal and Threema had previously announced they would end their services within the EU should this come to pass (now or at a future point).

    13 votes
    1. [2]
      riQQ
      Link Parent
      Netzpolitik translates some of its articles to English. The one you posted was translated and is available in English here:
      https://netzpolitik.org/2024/victory-for-now-no-majority-on-chat-control-for-belgium/

      4 votes
      1. tauon
        Link Parent
        Ohh, I totally wasn’t aware of that, nice!! Thanks for the pointer.

        3 votes
    2. Nsutdwa
      Link Parent
      So does that mean that Signal (for example) would just stop working? Signal has to maintain servers, right? Would it be illegal to use a distributed network of self-hosted chat servers? Would it be illegal for me to exchange messages with my family that we encrypt using our own homebrew solutions, for example? This would be madness - because you can be sure that the actual child traffickers (for example) are not going to just go "Oh no! We can't use encryption. Guess we'll just have to organise our crimes in cleartext messages, then, you got us."

      4 votes
  3. [5]
    Comment removed by site admin
    Link
    1. [4]
      caliper
      Link Parent
      Unbelievable, right? I do not understand people not seeing through this paper thin argument. It’s unfortunate a lot of politicians are dumb as doorknobs and will never understand why this is a terrible idea.

      6 votes
      1. [3]
        winther
        Link Parent
        That this sort of proposal keeps coming up year after year makes me think they know exactly what they are doing. They want the fascist super surveillance on everyone through the backdoor and are just trying to find the right moment to get it in. I know the saying about not ruling out incompetence before assuming malice, and maybe 15-20 years ago I would have agreed: it could have been down to politicians lacking the technical expertise to understand what they were actually proposing, just a kneejerk reaction to try and do something about CSAM. But the tech-illiteracy explanation becomes harder to believe every year.

        8 votes
        1. [2]
          tauon
          Link Parent
          I was very glad to hear of this quote from the German Secretary of Justice Buschmann:

          “Germany is a country that has already experienced two dictatorships that had no regard for privacy. That is why we are particularly sensitive there and will take great care to ensure that privacy, especially private communication, remains protected.”

          (Translated with DeepL)

          Definitely the sort of thing you’re relieved to hear after reading that apparently France’s rejection of the legislation has been crumbling recently.

          But then again, the next federal government’s minister of a different party might do a 180 in favor of the abusable “super surveillance” (great phrase, btw).

          6 votes
          1. winther
            Link Parent
            That is true. My issue is not so much with the current leadership, but with a potential future government. If AfD came into power in Germany, we certainly wouldn't want them to have that kind of surveillance tool at their disposal (among a million other things). That the current government promises it will only be used for "good" is no guarantee for the future.

            2 votes