On the privacy of push notifications

In December 2023, Reuters, TechCrunch, 404 Media, and others reported on the surveillance of Apple and Google users through push notifications. We were involved in the original investigation by netzpolitik.org. These findings have been confused in various places with an older privacy issue. In this blog post, we want to examine the problem and address potential misconceptions.

What is all this about?

Push notifications are a mechanism through which applications can send and display notifications to users of smartphones. Such notifications can announce a new email or a message from a messaging app. The user interface of push notifications is familiar to any smartphone user; the infrastructure that drives them in the background, however, is a complex mechanism and not without privacy issues.

Push notifications are delivered via the Apple Push Notification service (APNs) on Apple devices, or via Google’s Firebase Cloud Messaging (FCM) or Huawei Mobile Services (HMS) on Android devices. To use push notifications, an application must request a push token from the respective platform. A push token is an identifier that references the installation of an application on a specific device, and it is used to route messages from the internet to that device.

Push notifications come with two distinct privacy problems:

  1. The existing problem: Cleartext content of push notifications
    The content of push notifications is fully visible to platforms like Google or Apple.
  2. The new problem: The linkability of push tokens
    There is overwhelming evidence that push tokens are linked to a Google/Huawei account or an Apple ID. Push tokens can therefore be used to tie the accounts that users hold within applications to those platform accounts.

Cleartext content of push notifications in detail

The content of push notifications is fully visible to platforms like Google or Apple, and this is a deliberate design choice. While the problem is not new, it is fortunately relatively easy to address:

In essence, it is enough to synchronize secret key material between the application server and the user device over TLS and to subsequently use this key material to encrypt the content of push notifications. By design, the key material is not accessible to push notification platforms like Google or Apple, and therefore the content remains protected. In addition, application developers can minimize the amount of data shared over push notifications in the first place.
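
To illustrate the idea, here is a minimal Python sketch using AES-GCM from the pyca/cryptography library. The key provisioning step, payload format, and function names are hypothetical; a real deployment would also need per-device keys and key rotation.

    import json
    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Hypothetical provisioning: a secret key agreed between the application
    # server and the device over TLS (in practice per-device and rotated).
    notification_key = AESGCM.generate_key(bit_length=256)

    def encrypt_payload(payload: dict, key: bytes) -> dict:
        """Server side: encrypt the notification content before handing it to APNs/FCM."""
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, json.dumps(payload).encode(), None)
        # The push platform only ever sees opaque bytes.
        return {"nonce": nonce.hex(), "ciphertext": ciphertext.hex()}

    def decrypt_payload(envelope: dict, key: bytes) -> dict:
        """Client side: decrypt the content once the push arrives, then render it locally."""
        plaintext = AESGCM(key).decrypt(
            bytes.fromhex(envelope["nonce"]), bytes.fromhex(envelope["ciphertext"]), None
        )
        return json.loads(plaintext)

    envelope = encrypt_payload({"title": "New message", "body": "Hello"}, notification_key)
    assert decrypt_payload(envelope, notification_key) == {"title": "New message", "body": "Hello"}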

To summarize, this issue is not new, it has already been addressed by a few applications, and it is not the focus of this blog post.

The linkability of push tokens in detail

To show notifications to users, applications request push tokens from either Google’s, Huawei’s, or Apple’s push notification service. Applications require the push token whenever a notification is to be displayed on a specific user device and must therefore hold on to that token for as long as users use the application.

The easiest way to do so is to store the push token on the application server as part of a user account. This way, whenever a notification is to be sent out, the application server looks up the user’s profile, finds the associated push token, and uses it to route the push notification to the user’s device through the platform’s push notification service.
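
To make the linkage concrete, the following simplified Python sketch shows what such a server-side data model typically looks like; the user record, token value, and platform delivery call are hypothetical stubs. The point is that the push token sits right next to identifying account data.

    # Hypothetical, simplified data model: the push token is stored alongside
    # identifying account data, so a legal request for the account also yields the token.
    users = {
        "user-1234": {
            "email": "alice@example.com",
            "push_token": "fcm:dXNlci0xMjM0...",  # opaque token issued by FCM/APNs/HMS
        },
    }

    def notify(user_id: str, message: str) -> None:
        """Look up the user's push token and hand the notification to the platform."""
        token = users[user_id]["push_token"]
        send_via_platform(token, message)  # stub for the APNs/FCM/HMS API call

    def send_via_platform(token: str, message: str) -> None:
        print(f"would deliver {message!r} to {token}")

    notify("user-1234", "You have a new message")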

Associating push tokens with a user account seems necessary for functionality, but it means the token becomes part of the user data that can be requested in the context of a legal warrant.

Because the push token is also associated with either a Google/Huawei account or an Apple ID on the respective platform, a subsequent legal request can now result in a wealth of information about a particular user. 

Even when a user never provided any identifying information to the application, their anonymity can nonetheless be compromised via the push token.

What apps are affected?

In principle, the issue affects all applications that use push notifications as provided by Google Play Services on Android or by Apple on iOS. The issue is, however, particularly salient for social networks and especially for secure messengers. When those apps promise to collect very little data about their users, and users don’t volunteer any data that makes them identifiable, for example by

  • using a throwaway number for Signal,
  • picking a completely random username for Wire/Matrix,
  • or being given a random username with Threema/Session,

they may not realize that push tokens can potentially deanonymize them nonetheless.

What users are affected?

The issue affects users who have a legitimate need for anonymity. Especially at-risk users, such as journalists, political dissidents, persecuted minorities, activists, or human rights defenders, often rely on staying anonymous.

Depending on their threat model, users decide which messaging service is the right one for them. For some, this choice is vital. The lack of transparency and awareness regarding the risk of deanonymization is therefore highly problematic.

What can Google, Huawei, and Apple do?

How Google, Huawei, and Apple handle push notifications internally is not fully known, but there is strong evidence of a tight connection between push tokens and user accounts, i.e. Google/Huawei accounts or Apple IDs.

Having such a strong connection might be efficient for handling high volumes of notifications, but it constitutes a fundamental privacy concern. As of now, nothing suggests that Google and Apple would be incapable of transitioning to a more privacy-friendly alternative. In fact, simple techniques, such as cryptographic hash functions, can create one-way links that cannot be traced back to an account.
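
As an illustration of that direction (our own sketch, not a description of how the platforms actually work), a device could derive a pseudonymous routing identifier from a secret that never leaves the device. The platform would then only ever see the derived value, and its account records would contain nothing that can be mapped back to a push token:

    import hashlib
    import hmac
    import os

    # Illustrative only: this is NOT how APNs/FCM/HMS work today.
    device_secret = os.urandom(32)       # generated and kept on the device only
    account_id = "platform-account-42"   # Google/Huawei account or Apple ID

    # One-way derivation: without device_secret, the platform cannot recompute
    # routing_id from the account, so account data alone does not reveal the token.
    routing_id = hmac.new(device_secret, account_id.encode(), hashlib.sha256).hexdigest()

    # The device registers routing_id directly with the push service, which maps it
    # to the device's delivery channel; the account database stores nothing derived from it.
    print(routing_id)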

The main goal is straightforward yet crucial: In scenarios where Google or Apple are compelled to divulge information, by legal warrants or otherwise, their systems should be architecturally incapable of linking push tokens to specific user data. This goal fits well with their other privacy efforts, like encrypting Android messages and iCloud backups.

What can app developers do?

In general, app developers cannot easily address the push token privacy issue themselves. At the time of writing, we are also not aware of any app that has deployed a technology that works alike for iOS users and for Android users who use Google Play Services.

At a fundamental level, only the platforms Google, Huawei, and Apple can address the push token privacy issue. For their Android users who don’t use Google Play Services, developers can offer alternatives like UnifiedPush.

We did, however, develop a solution that can be used in certain use cases, particularly with secure messengers. On a high level, it works as follows:

  • Push tokens are only stored in encrypted form on the application servers
  • The key to decrypt them is stored on client devices (users’ phones and computers)
  • The key is sent from the client to the server whenever a push notification is sent out and the server discards the key right after the notification is sent

This solution puts application developers in a position where they can no longer produce push tokens in response to legal requests, since the servers only hold on to the decryption key for what is typically a fraction of a second, as sketched below.
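
The following Python sketch illustrates this flow under the assumptions above; the token format, field names, and the platform delivery call are hypothetical, and error handling is omitted.

    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # --- On the client device (hypothetical setup step) ---
    token_key = AESGCM.generate_key(bit_length=256)  # stays on the user's devices
    push_token = b"apns:4c1d...example-token"
    nonce = os.urandom(12)
    encrypted_token = AESGCM(token_key).encrypt(nonce, push_token, None)
    # encrypted_token and nonce are uploaded to the application server; token_key is not.

    # --- On the application server ---
    stored = {"nonce": nonce, "encrypted_token": encrypted_token}  # all the server keeps

    def send_push(record: dict, key: bytes, payload: bytes) -> None:
        """Called when a client triggers a notification and sends the key along."""
        token = AESGCM(key).decrypt(record["nonce"], record["encrypted_token"], None)
        deliver_via_platform(token, payload)  # stub for the APNs/FCM/HMS call
        del token, key  # the server discards the key material right after sending

    def deliver_via_platform(token: bytes, payload: bytes) -> None:
        print(f"would deliver {len(payload)} bytes to {token[:12]!r}...")

    send_push(stored, token_key, b"encrypted notification content")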

This approach, however, has two prerequisites:

  • Push notifications can only be sent out when the action is triggered on a user’s device. This is not a very limiting restriction for messengers, because usually push notifications are sent out whenever a user sends a message or initiates a call. Edge cases like the initial connection between two users are subject to this limitation.
  • There needs to be a key management system that involves layers of key wrapping and a practical group key agreement mechanism.

We designed a prototype to ascertain the practical viability of the solution for messengers that use Messaging Layer Security (MLS) as an encryption protocol. In our solution, MLS serves as the key negotiation mechanism to easily obtain group keys in group chats, since MLS excels at that. We will keep working on this solution and look forward to continuing the exchange with the community.
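
For example, assuming the messenger can export a secret for the current MLS epoch (MLS defines an exporter for exactly this kind of application-level use), a per-group key for encrypting push tokens could be derived with HKDF. The label, key length, and epoch handling below are illustrative and not part of any specification.

    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Assumption: the MLS stack exposes an exported secret for the current epoch.
    # We stand in a random value here.
    mls_exporter_secret = os.urandom(32)

    def derive_push_token_key(exporter_secret: bytes, epoch: int) -> bytes:
        """Derive a per-epoch key for encrypting push tokens from the MLS exporter secret."""
        return HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=b"push-token-encryption/epoch/" + str(epoch).encode(),
        ).derive(exporter_secret)

    key = derive_push_token_key(mls_exporter_secret, epoch=7)
    print(key.hex())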

What can users do?

In general, users don’t have any way to mitigate the privacy issue when they are users of either iOS or Android with Google Play Services installed (which is the default for most Android distributions).

There is no evidence that suggests the privacy issue with push tokens is mitigated when users refuse to allow push notifications when prompted by an application. For completeness, it is worth mentioning that certain Android distributions avoid bundling Google Play Services with the operating system and leave it to users to install them. These distributions, however, don’t have a large market share.

Conclusion

The privacy guarantees of push notifications are misaligned with other decisions previously taken by both Google and Apple. The connection between push tokens and user accounts is a serious privacy issue that should be addressed. The best way to do this would be for Google, Huawei, and Apple to improve their push notification infrastructure. Until then, app developers whose apps are used by at-risk users should attempt to mitigate the risk, or at least be more transparent about it.