24 votes

Apple introduces expanded protections for children, including on-device scanning of images to detect child abuse imagery

19 comments

  1. [7]
    stu2b50
    (edited )
    Link

    I was quite alarmed when I read about it on Twitter (it sounded like Apple was scanning all images on the device and using AI to do the CSAM detection), but for the most part the actual implementation seems fine.

    • CSAM scanning is done only on iCloud images - seems fair to me; every other file drive, image host, etc. also does this. When you host things on someone else's server, they naturally get a say in what you're allowed to upload.

    • The AI nudity detection is for iMessage - on device, it will run the model on incoming images if the recipient's iCloud account belongs to a child, then blur the image and notify the parents. False positives seem pretty low stakes, so although I'm wary of how effective Apple's ML can be, it's pretty whatever in the end. Notifying parents is also optional and up to the child to request it. (Rough sketch of the flow below.)
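
    A minimal sketch of how that on-device gate could plausibly work. This is purely illustrative: Apple hasn't published the model or its operating point, so the classifier, threshold, and flags here are all made up.

    ```python
    from dataclasses import dataclass

    @dataclass
    class ScanResult:
        blurred: bool
        parent_notified: bool

    NUDITY_THRESHOLD = 0.9  # hypothetical operating point

    def classifier_score(image_bytes: bytes) -> float:
        """Stand-in for the on-device ML model; returns P(explicit)."""
        return 0.0  # placeholder -- a real model would run inference here

    def handle_incoming_image(image_bytes: bytes,
                              is_child_account: bool,
                              notify_parents_enabled: bool) -> ScanResult:
        # Adult accounts are untouched; the gate only runs for child accounts.
        if not is_child_account:
            return ScanResult(blurred=False, parent_notified=False)

        if classifier_score(image_bytes) < NUDITY_THRESHOLD:
            return ScanResult(blurred=False, parent_notified=False)

        # Blur locally; parental notification is a separate opt-in, so a
        # false positive costs at most an unnecessary blur.
        return ScanResult(blurred=True, parent_notified=notify_parents_enabled)
    ```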

    There’s some slippery slope about the iCloud scanning but I don’t really buy it yet, it’s just Apple meeting the other providers. Why would Apple do this if not to do full device scanning? Same reason YouTube scans for copyright violations on uploaded videos. Apple doesn’t want to host child porn. Fair enough reason.

    It seems mostly like slippery-slope fears. Those are valid, but they're also not certainties.

    23 votes
    1. skybrian
      Link Parent

      Yeah, it's a governance issue. We already crossed a line with a vendor-supplied OS that automatically applies security updates, but this gets a bit closer.

      I'm not sure what to do about it other than having open source code and some kind of transparency system for auditing security updates, at least after the fact. But transparent child-porn screening seems like a non-starter.

      What was that website that had infohazard fiction, sort of like the Paranoia role-playing game? I can't quite remember...

      7 votes
    2. [5]
      Bullmaestro
      Link Parent

      I don't like the slippery slope analogy. It looks like Apple specifically put this technology in to target child predators.

      4 votes
      1. aditya
        Link Parent

        That's exactly why the analogy works, though? The concern is precisely that this backdoor can be abused by unfriendly governments.

        4 votes
      2. [4]
        Comment deleted by author
        Link Parent
        1. [3]
          aditya
          Link Parent

          too dramatic about something that may never happen

          Sorry, but I don't think they're being too dramatic at all. Why are we okay with backdoors existing and convincing ourselves that more egregious abuse "may never happen"? What do we do if and when those abuses do happen? By then it's too late to walk the tech back.

          8 votes
          1. [2]
            joplin
            Link Parent

            Can you explain what you mean by "backdoor"? Usually a backdoor refers to a way that someone can get into your device. That's not what either of these features are.

            One happens when you voluntarily give your data to Apple to host on their servers. Apple is legally obligated not to host child pornography on their servers and to remove it if they find it put there by a customer.

            The other happens on your child's phone, and only happens when you, the parent, turn it on. It sends the data to you, the parent, to review, not to the government or Apple, as I understand it.

            At no point is Apple getting into your phone or allowing the government to get into your phone, so it's not a backdoor in any sense I've ever seen the word used before.

            6 votes
            1. aditya
              Link Parent

              I responded to some of this on another thread just now.

              Yes, Apple is currently limiting scanning of photos to those being uploaded to iCloud, and yes, this can be disabled by opting out of backing it up to iCloud. However, they're doing this scanning on-device, rather than on their servers, so the "front door is open" argument, while valid, never comes into the picture. The problem with this entire thing for me is that this is a small step to go from "scan all photos on this device being uploaded to iCloud" to "scan all photos on this device". The problem isn't necessarily what photos get scanned, but rather that the mechanism is being built into the devices, and can likely be expanded quite trivially.

              As for the definition of the word backdoor, I'm applying it here because I see this as a mechanism for Apple, and whoever is in a position to coerce Apple, to potentially get at least a boolean answer to whether some content is on a device. Another commenter on this thread pointed out that China requires Apple to maintain their servers in-country so that they can continue selling devices. Now imagine China saying "we want this scanning functionality, but in addition to these CSAM hashes, we also want you to scan for images of Tiananmen Square".

              Edit: perhaps I used the word "backdoor" somewhat incorrectly though. But in cases where a boolean answer--possible presence of some "objectionable" material--suffices, I think it holds for me. Happy to be corrected.
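
              To make the "boolean answer" point concrete: the matching machinery is content-agnostic, and only the supplied database decides what gets flagged. Apple's published design uses a perceptual hash (NeuralHash) plus private set intersection; the plain hash set below is a deliberately oversimplified stand-in, with made-up byte strings.

              ```python
              import hashlib

              def fingerprint(image_bytes: bytes) -> str:
                  return hashlib.sha256(image_bytes).hexdigest()

              def matches_database(image_bytes: bytes, database: set) -> bool:
                  # The matcher never knows *why* an entry is in the database.
                  return fingerprint(image_bytes) in database

              # Identical machinery either way; only the database differs.
              csam_db = {fingerprint(b"known-abuse-image")}             # the intended use
              censorship_db = {fingerprint(b"tiananmen-square-photo")}  # the feared use

              photo = b"tiananmen-square-photo"
              print(matches_database(photo, csam_db))        # False
              print(matches_database(photo, censorship_db))  # True
              ```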

              3 votes
  2. [2]
    Deimos
    (edited )
    Link

    This is causing a lot of uproar in privacy spaces:

    • EFF's response: Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life

    • A Twitter thread from Matthew Green, ...

    21 votes
    1. [2]
      Comment deleted by author
      Link Parent
      1. Deimos
        Link Parent

        That's exactly what the things I linked are about, read them for a lot of explanation.

        8 votes
  3. aditya
    Link

    Extremely slippery slope here.

    12 votes
  4. [7]
    markhurst
    Link

    From Jonathan Mayer:

    Unfortunately, while Apple has posted lots of detail on the crypto, there’s little on the computer vision—and no evaluation of predictive performance. That’s essential for evaluating privacy impact.

    Matthew Green quoted in a Vice article:

    It shows that Apple is willing to build and deploy this technology. I hope that they will never be asked to use it for other purposes.

    Interesting that when a new tech launches with built-in risks, what's invoked is "hope."

    7 votes
    1. [6]
      joplin
      Link Parent

      there’s little on the computer vision—and no evaluation of predictive performance.

      Given that the stakes are that a parent would see a non-pornographic image if the performance is poor, does it even matter? I don't know what the UI looks like, but I wouldn't be surprised if there were a button they could press to say (more or less) "false positive, let my child see it." If it has too many false positives, customers will complain and Apple will either adjust or lose customers.

      I hope that they will never be asked to use it for other purposes.

      Yeah, this keeps coming up. Given how well they've done fighting, for example, the FBI when they needed to, I'm far less worried about them and far more worried about other companies. Yeah, past performance is no predictor of future performance, but in general, they seem to have the right culture for these things. If that changes, then I'll change my stance.

      4 votes
      1. stu2b50
        Link Parent

        Yeah, this keeps coming up.

        What I don't understand is how this changes anything. At the moment, if Apple, say, gets a subpoena from the US government for someone's photos on iCloud, there is nothing they can do to avoid handing over the iCloud photos. iCloud photos are not E2E encrypted, and with the current feature set cannot be - they are not advertised as E2EE, and I don't see any reason you should think they are. You sign a ToS when you use iCloud Photos.

        Apple absolutely can, and does, access the photos you upload, take down photos they don't want to host, and would likely comply with government requests - including from countries like the PRC. How does this change that? They already had that power, and that potential for abuse. Them's the apples when you upload content to cloud services. I can understand if you're uncomfortable with that, but what does this change?


        I think there's some conflation going on with the iMessage feature, since iMessage is E2EE, and there's also some random FUD over Apple scanning all photos on your device.

        7 votes
      2. [4]
        aditya
        Link Parent

        Curious, but how has Apple done when taking on other governments? We keep hearing about the FBI, and I somewhat do buy it: I use an iPhone, etc. But what about requests from other countries? Do they fight back?

        3 votes
        1. [2]
          Wes
          Link Parent

          They do maintain servers in China as is required by the Chinese government. There's not much they can do about it if they wish to sell phones in China.

          2 votes
          1. aditya
            Link Parent

            Now imagine China expands these requirements and wants Apple to scan against their own list of images, say photos of Tiananmen Square. Will Apple back out?

            1 vote
        2. joplin
          Link Parent

          That's a fair question. I don't know, as I haven't seen much about it in the press. Being a US company, I'm not sure how much jurisdiction other governments would have over them for data stored in the US. I know that Microsoft was fighting the US government over turning over data stored overseas, and I imagine Apple would be in a similar situation with foreign governments.

          1 vote
  5. [2]
    Macil
    Link

    I think it's relatively fine for content to be scanned and policed when it's published online or otherwise mass-shared, but I think it crosses a line when it gets to people's private files. ... Technically in this case it's about people's personal cloud-hosted iCloud photos, but those are supposed to be end-to-end encrypted, so I'd think Apple wouldn't have responsibility for their contents. And if iCloud Photos is the default place for users to keep private photos on some devices, then it's weird. Devices should have storage that's recognized as the owner's own responsibility.

    If it makes a difference in reducing violence that wouldn't be prevented without it, then I can see the value, as long as no slipping down the slippery slope happens. Though... could content that someone browsed without knowing what it was ever trigger the system? PornHub just purged tons of videos because they thought too many featured underage people. Could someone save a picture from a site like PornHub without knowing the person was actually underage, and then get flagged? Could someone merely browse the site and have an image end up in their cache that then gets flagged?

    I'm really skeptical of the idea that the mere presence of a downloaded file on someone's device, without the context of how and why it was obtained, without evidence of the person's knowledge and intent, and without evidence of further redistribution, should be enough to flag the owner to the authorities. Something I would not mind so much: if the browser detected a file as it was downloaded, blocked the download, and anonymously reported the website it was downloaded from. At least in that case, the entity being reported to the authorities would be one that has demonstrated wrongdoing (more than mere possession) by redistributing an illegal thing.

    Also, the potential for a slippery slope seems real. In the UK, an ISP blocklist system originally developed for child abuse images was later extended to block knock-off watches. Could device scanning be extended in the future to other things, like pirated content or drawn sexualized images of fictional child characters? (Wish that last one were made up; it was a weird case, but it doesn't seem like a big stretch for any imagery that someone has ever gone to jail for to end up in such a database.)

    7 votes
    1. stu2b50
      (edited )
      Link Parent

      iCloud Photos are not E2EE. I mean, it can't really be, given the feature set. But it's definitely not, and as far as I know it was never advertised as such. This is right in the ToS for the service:

      You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law.

      Could someone save a picture from a site like PornHub without knowing the person was actually underage, and then get flagged?

      Unless it's more extensive than what Apple says, no, because images are fingerprinted on device only as they're being uploaded to iCloud Photos, and the fingerprints are matched against a database of known images.
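
      For what "fingerprinted" means here: it's a perceptual hash matched against a database of known images, not an AI judgment about what a photo depicts. Apple's NeuralHash is a learned version of the idea; the classic average-hash below is only a rough stand-in to illustrate it.

      ```python
      from PIL import Image  # pip install Pillow

      def average_hash(path: str, size: int = 8) -> int:
          """64-bit aHash: downscale to 8x8 grayscale, threshold at the mean."""
          img = Image.open(path).convert("L").resize((size, size))
          pixels = list(img.getdata())
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | (p > mean)
          return bits

      def hamming_distance(a: int, b: int) -> int:
          """Number of differing bits between two fingerprints."""
          return bin(a ^ b).count("1")

      # A "match" is a hash within a few bits of a database entry, so resizing
      # or recompressing a known image rarely breaks detection, while an
      # unrelated photo essentially never lands near a specific entry.
      ```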

      4 votes