13 votes

Google's new app will help warn you about nude images in Messages

10 comments

  1. [6]
    arch
    Link
    I honestly don't know how I feel about this. My only interaction with Google and their nudity filters was when Photos automatically backed up a picture of my kid playing in a bathtub (waist up).

    11 votes
    1. [5]
      goose
      Link Parent
      Assuming it's done on-device, it doesn't bother me so much, although I also am not the target demographic for this feature. I have neither taken, nor received, any pictures that would be affected by this.

      Out of curiosity, what was your interaction/experience with your aforementioned photo? I probably have a few of the same pics of my kids as infants/toddlers, but I've never noticed anything different about them in my photo library.

      1. [3]
        JXM
        Link Parent
        Even if it's processed on device, the results could still be sent to Google (or anyone else).

        In 2022, the New York Times ran an article about someone who was investigated by law enforcement because of Google's scanning.

        14 votes
        1. pallas
          Link Parent
          What was perhaps more horrifying about that article than the criminal investigation itself was that Google seemed to suggest its view was above that of law enforcement. The police investigated, determined the situations were not crimes (and thus presumably not CSAM), and closed the investigations. Yet when presented with the cases by the NY Times and asked for comment, Google not only didn't back down or, as might be expected, decline to comment on the cases: their spokesperson disagreed, gave private details of the users' content suggesting they thought the images were CSAM and not as described, and said Google stood by their decisions to permanently ban the users and deny access to any of their content. That the police disagreed did not matter. That the NY Times was writing an article with details about the cases did not matter. Google's scanning, in Google's view, was right.

          13 votes
        2. goose
          Link Parent
          That's an interesting article, thanks for linking.

          What I'm referring to is the "on-device" processing the Tensor chip has allowed, and how, over time, more features have seemingly moved from cloud computing to on-device computing on Google's Android devices. More specifically, given that the Messages app is separate from the Google Photos app, the engines behind them are fundamentally different. In the article you linked, it seems the photo/video in question came to Google's attention through the cloud backup feature of Google Photos.

          It's definitely a bit of a double-edged sword. On the one hand, who wouldn't support stronger protections against exploitation of minors? But on the other, the invasion of privacy can't be negated simply in the name of "big brother keeping an eye on things". It's also no secret that Google scans files saved/shared in their Drive service for illegal or copyrighted content. While there's certainly an invasion-of-privacy argument to be made, I have a difficult time weighing that over the fact that it's inspection of a service by the service provider whose infrastructure is providing it. People should be entitled to privacy, but I can also see why Google would want to be entitled to see what kind of data they're trafficking. That isn't to say they're totally objective or the single source of truth; there are more than enough examples out there of what many would describe as "the wrong call". But ultimately there is no one answer or point of view that's going to be absolutely correct here; it's very much a gradient requiring nuance and a multifaceted approach.

          In this specific instance, I see the bigger problem being that Google is a giant, nameless, faceless organization that can do whatever it wants with a service people depend on in their day-to-day lives. It almost makes me think it should be treated more like a utility, with rules put in place about gaining/losing access to that utility, given how heavily so many people in 2024 depend on internet-based services for daily life.

          6 votes
      2. arch
        Link Parent
        If I'm remembering correctly (it was maybe 4 or more years ago), they blacked out and/or deleted the picture and flagged it as potentially harmful, in a way that insinuated the potential for child porn.

        3 votes
  2. [3]
    ahatlikethat
    Link
    I never trust Google, that's just my way. I looked at those links and read:
    Understand data collection & data sharing
    Data collection

    Developers do not need to disclose data accessed by an app as "collected" in the Data safety section if:

    An app accesses the data only on your device and it is not sent off your device. For example, if you provide an app permission to access your location, but it only uses that data to provide app functionality on your device and does not send it to its server, it does not need to disclose that data as collected.
    Your data is sent off the device but only processed ephemerally. This means the developer accesses and uses your data only when it is stored in memory, and retains the data for no longer than necessary to service a specific request. For example, if a weather app sends your location off your device to get the current weather at your location, but the app only uses your location data in memory and does not store the data for longer than necessary to provide the weather.
    Your data is sent using end-to-end encryption. This means the data is unreadable by anyone other than the sender and recipient. For example, if you send a message to a friend using a messaging app with end-to-end encryption, only you and your friend can read the message.
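
    Read literally, the "ephemeral processing" carve-out describes something like the sketch below: a hypothetical Python illustration of the policy's own weather example, where none of the function names come from any real Google API.

    ```python
    # Hypothetical illustration of the "processed ephemerally" carve-out:
    # the location is held only in memory for the duration of one request
    # and is never written to disk or a log, so under these rules the
    # developer would not have to disclose it as "collected".

    def lookup_forecast(location: str) -> str:
        # stand-in for a real weather-service call; purely illustrative
        return f"Forecast for {location}: sunny"

    def handle_weather_request(location: str) -> str:
        forecast = lookup_forecast(location)  # location used transiently
        # intentionally no logging or persistence of `location` here
        return forecast
    ```

    The moment that handler appended `location` to a log file or database, the same data would count as "collected" and would have to be disclosed.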
    

    So they can say they are not collecting data, even if they are collecting data, as long as they use end-to-end encryption?

    7 votes
    1. [2]
      Gummy
      Link Parent
      This reads more like they can't hold onto the information for longer than it takes to provide the related service. If they're actually storing the data long-term or logging anything, wouldn't that be outside the scope of what's being said here?

      6 votes
      1. ahatlikethat
        Link Parent
        Well, there's no indication that ALL of those criteria have to be met, so I'm surmising that they're stating that any one of those conditions being met is sufficient.

        3 votes
  3. Tiraon
    Link
    Android System SafetyCore, an AI-powered content-analyzing (at least) system app, is set to become available. It seems it will be opt-out for minors and opt-in for others; how will they tell?

    Additional links: https://developers.google.com/android/binary_transparency/google1p/overview, https://play.google.com/store/apps/details?id=com.google.android.safetycore

    6 votes