24 votes

Apple’s decision to kill its CSAM photo-scanning tool sparks fresh controversy

4 comments

  1. unkz
    Link

    I can only see this as good news. Apple listened to the numerous security experts who were, as I recall, virtually unanimous in condemning this as a well-intentioned plan but a security and privacy nightmare.

    It’s also an interesting contrast to the Kiwi Farms case, where a tech company went in the other direction in response to a similar weighing of values: the rights of harassment victims versus network infrastructure freedoms.

    26 votes
    1. vord
      Link Parent

      Agreed. There are a lot of parallels to be drawn. Technology, especially mass-deployed technology, is a powerful, powerful thing.

      Ending child abuse is near the top of every well-meaning person's social goals. Being more careful with your friends, neighbors, and community leaders will be 10x more impactful at eliminating child porn than any technological deployment.

      3 votes
  2. aditya
    Link

    This was debated quite a bit on Tildes back in 2021 when Apple first announced their plans to scan on devices.

    Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

    “Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

    “Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

    9 votes