32 votes

The FBI’s new tactic: Catching American suspects with push alerts

6 comments

  1. skybrian
    Link

    From the article:

    A foreign law enforcement officer got TeleGuard [an encrypted Swiss messaging app] to hand over a small string of code the company had used to send push alerts — the pop-up notifications that announce instant messages and news updates — to the suspect’s phone.

    An FBI agent then got Google to quickly hand over a list of email addresses this month linked to that code, known as a “push token,” and traced one account to a man in Toledo, an affidavit shows. The man, Michael Aspinwall, was charged with sexual exploitation of minors and distribution of child pornography and arrested within a week of the Google request.

    The data has become prized evidence for federal investigators, who have used push tokens in at least four cases across the country to arrest suspects in cases related to child sexual abuse material and a kidnapping that led to murder, according to a Washington Post review of court records. And law enforcement officials have defended the technique by saying they use court-authorized legal processes that give officers a vital tool they need to hunt down criminals.

    Joshua Stueve, a spokesman for the Justice Department, said, “After determining that non-content push notification metadata may help arrest offenders or stop ongoing criminal conduct, federal law enforcement investigators fully comply with the U.S. Constitution and applicable statutes to obtain the data from private companies.”

    The Post found more than 130 search warrants and court orders in which investigators had demanded that Apple, Google, Facebook and other tech companies hand over data related to a suspect’s push alerts or in which they noted the importance of push tokens in broader requests for account information.

    Unlike normal app notifications, push alerts, as their name suggests, have the power to jolt a phone awake — a feature that makes them useful for the urgent pings of everyday use. Many apps offer push-alert functionality because it gives users a fast, battery-saving way to stay updated, and few users think twice before turning them on.

    But to send that notification, Apple and Google require the apps to first create a token that tells the company how to find a user’s device. Those tokens are then saved on Apple’s and Google’s servers, out of the users’ reach.

    In a June hearing in the case, Assistant U.S. Attorney Christopher Rawsthorne cited the push-notification data as a critical way of identifying the defendant.

    “It used to be that Wickr was something where it was impossible to figure out the identity … of the person using it,” Rawsthorne said. “And it’s only recently been that we’ve been able to figure it out.”

    Google has said it requires court orders to hand over the push-related data. Apple said in December that it, too, would start requiring court orders, a change from its previous policy of requiring only a subpoena, which police and federal investigators can issue without a judge’s approval.

    But in three of the four cases reviewed by The Post, Apple and Google handed over the data without a court order — probably as a result of the requests being made on an emergency, expedited or exigent basis, which the companies fulfill under different standards when police claim a threat of imminent harm.

    17 votes
  2. balooga
    Link

    It’s not clear to me from the article if push tokens are automatically associated with all user accounts on iOS and Android devices, or if they only get assigned on transmission of a push notification. In other words, is everyone susceptible to this method of deanonymization, or can it be prevented by disabling push notifs on your phone?

    16 votes
    1. rkcr
      Link Parent

      Android developer here (so I can't speak to iOS at all). Also, it's been many years since I've implemented push notifications so I could be a little fuzzy (aka wrong) here.

      tl;dr - everyone is susceptible, disabling push notifications will likely not help.


      There's a point of confusion here: the word "notification" is overloaded.

      The "notification" users think of is the little message that pops up while using the phone. Apps determine when those show up, and they can be disabled at will. However, in this context, a "push notification" is a low-energy way of communicating between the server and the device.

      While UI notifications and push notifications are often linked, there's no reason they have to be! The app can show a UI notification without having received a push notification, and a push notification can be received without the app showing a UI notification.


      With that clarified, let's dig into push notifications.

      Android uses Firebase Cloud Messaging as the basis of all push notifications. The lifecycle flow shows push notifications are a two-step process: first you register your device, then you can send/receive messages.
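
      To make the registration step concrete, here's a rough Kotlin sketch of step one using the standard FirebaseMessaging API (from memory, so treat it as illustrative; the log tag is arbitrary):

      ```kotlin
      import android.util.Log
      import com.google.firebase.messaging.FirebaseMessaging

      fun fetchRegistrationToken() {
          // Step 1 of the lifecycle: ask FCM for this app's registration token.
          // This is the "push token" the article describes; Google keeps a copy
          // on its servers so it can route messages to this device.
          FirebaseMessaging.getInstance().token.addOnCompleteListener { task ->
              if (task.isSuccessful) {
                  val token = task.result
                  Log.d("FCM", "Registration token: $token")
                  // Step 2 would be uploading this token to the app's own server.
              } else {
                  Log.w("FCM", "Fetching FCM token failed: ${task.exception}")
              }
          }
      }
      ```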

      As noted above, this whole process works separately from UI notifications. So even if your notifications are disabled, the app may still register your device, and might even run the logic to try to send a notification - you just won't see it. Why? Because apps might be using push notifications for more than just UI notifications - for example, a push notification could tell an app that there's new data to sync in the background, so that the next time you open your app it has fresh data.
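
      As a sketch of that decoupling (the class name and data key below are made up for illustration), a messaging service can consume a data-only push without ever posting anything visible:

      ```kotlin
      import com.google.firebase.messaging.FirebaseMessagingService
      import com.google.firebase.messaging.RemoteMessage

      // Hypothetical service; it would be declared in the manifest like any
      // other FirebaseMessagingService.
      class SyncMessagingService : FirebaseMessagingService() {

          override fun onMessageReceived(message: RemoteMessage) {
              // A data-only push lands here even if the user has disabled
              // notifications; nothing is shown on screen.
              if (message.data["sync_needed"] == "true") {
                  // Kick off a background refresh (e.g. enqueue a WorkManager job) here.
              }
          }

          override fun onNewToken(token: String) {
              // FCM issued a new registration token; apps typically forward it
              // to their own server so pushes keep arriving.
          }
      }
      ```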

      If the app only uses push notifications to show UI notifications, then a privacy-focused app developer could prevent FCM from initializing (and thus registering) your device. However, that's extra work and I doubt most developers would go that extra mile; the main reason developers would do this work is in cases where the same app is delivered in contexts where FCM is not available (e.g. Amazon Kindle).
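
      For the curious, the hook for this is FCM's auto-init setting: there's a manifest flag (firebase_messaging_auto_init_enabled) that stops token generation before any code runs, plus a runtime switch. If I'm remembering the API right, the runtime side looks roughly like this:

      ```kotlin
      import com.google.firebase.messaging.FirebaseMessaging

      fun disablePushRegistration() {
          // Stops FCM from automatically generating a registration token for
          // this install. Pairs with firebase_messaging_auto_init_enabled=false
          // in AndroidManifest.xml to avoid registering at first launch.
          FirebaseMessaging.getInstance().isAutoInitEnabled = false
      }
      ```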


      One last point: I don't think the individual apps can use the push notification token they receive to figure out your identity. However, since Google knows all the push notification tokens, they could figure out that the same device that uses push notifications for app XYZ is also the same device that has push notifications for Gmail (and thus can find the user's identity that way).

      10 votes
    2. slambast
      (edited)
      Link Parent

      iOS is the one I have familiarity with, so I can answer that at least: when you give an app permission to send push notifications, that app is then allowed to get a push token for that user. Generally, the app then sends that to a server somewhere that handles notifications and talks to APNs. So AFAIK, if you don't give push notification permissions for an app, this de-anonymization won't work.

      (Honestly, it's news to me that all apps get the same push token—I thought it was unique per-app)

      Edit: At least for iOS, push tokens are unique per-app. See here:

      This address takes the form of a device token unique to both the device and your app.

      So this seems slightly more involved than just looking for string matches.

      Edit 2: The existence of provisional authorization for trial notifications suggests this attack might not be mitigated by simply not allowing notifications. Unsure how those work.

      5 votes
    3. FarraigePlaisteach
      Link Parent

      I couldn’t figure this out either, and I would really like to know. If disabling notifications increases privacy, then it’s something that privacy groups haven’t advocated for yet. At least, not for this reason.

      3 votes