10 votes

iPhone keyboard for blind to shut down as maker cites Apple abuse of developers

1 comment

  1. Comment deleted by author
    1. teaearlgraycold
      (edited)

      I thought the CSAM scanning was exclusively done when uploading to iCloud. Having heard an explanation it seems like it’s less invasive than other cloud hosts. Apple isn’t scanning through your iCloud library on their servers. They’re just checking image fingerprints locally as part of the upload process.

      Apple wants maximum control (for maximum profits), so it's to their advantage to build a system that doesn't adapt well to state coercion.

      Honestly, their system is shockingly, inappropriately lax. They claim it would take 30 flagged uploads before any alarms are raised. Maybe the false positive rate is really high? Or worse - would a lower threshold produce too many true positives to handle? The best answer I can think of is that they're trying to lull offenders into a false sense of security. There's no reason they couldn't claim it takes 30 flagged uploads when in reality they'd look into each one.
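      The flow described above - hash each image locally during upload, compare against a known-fingerprint database, and only raise an alarm past a threshold of 30 matches - can be sketched roughly like this. This is a hypothetical illustration, not Apple's actual implementation (the real system uses a NeuralHash-style perceptual hash and a private set intersection protocol, not a plain hash lookup); the function and database names are made up:

      ```python
      import hashlib

      REPORT_THRESHOLD = 30  # the claimed number of flagged uploads before review

      # Stand-in for the on-device fingerprint database (hypothetical values)
      KNOWN_BAD_FINGERPRINTS = {"fp_abc123", "fp_def456"}

      def fingerprint(image_bytes: bytes) -> str:
          """Stand-in fingerprint. A real perceptual hash tolerates resizing and
          re-encoding; a cryptographic hash like this one does not."""
          return "fp_" + hashlib.sha256(image_bytes).hexdigest()[:6]

      def should_report(uploaded_images: list[bytes]) -> bool:
          """Check fingerprints locally as part of upload; only report once the
          number of matches reaches the threshold."""
          matches = sum(
              fingerprint(img) in KNOWN_BAD_FINGERPRINTS
              for img in uploaded_images
          )
          return matches >= REPORT_THRESHOLD
      ```

      The point of the sketch is the threshold check: nothing in the client-visible logic forces Apple to actually wait for 30 matches before a human looks, which is the commenter's worry.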

      2 votes