Can contact tracing protect your privacy? Don’t count on it

Taylor Armerding
8 min read · May 4, 2020


Photo by Victor He on Unsplash

Once upon a time, the slogans of two of the tech world’s most gigantic giants were “Don’t Be Evil” (Google) and “It Just Works” (Apple).

Maybe those should be resurrected. Because what better guiding principles to create what the two companies have said they are rolling out: a platform for smartphones to do “contact tracing” of those who have been exposed to the coronavirus, while protecting personal privacy?

Google and Apple say they are creating an application programming interface (API) designed for apps to collect enough data to tell when users get close to anybody else, but also “with user privacy and security central to the design.” The rollout began last week.

There are multiple ongoing efforts to do the same thing, but as the Electronic Frontier Foundation (EFF) notes, given that Apple and Google are “an effective duopoly in the mobile operating system space, their plan carries special weight.”

The pitch has several components. By using Bluetooth, an app won’t collect location data. It will preserve anonymity, the companies say, by encrypting metadata and, every 10–20 minutes, generating a random key that isn’t tied to the identity of the smartphone. And the data collected will be decentralized — it will remain on individual devices and won’t become part of a massive database on a central server.
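The rotating-key idea can be sketched in a few lines. This is an illustrative toy, not the actual Apple/Google cryptography (which derives rolling proximity identifiers from a daily tracing key); it only shows why rotation makes long-term tracking harder: an observer cannot link broadcasts across intervals to one device.

```python
import secrets
import time

ROTATION_INTERVAL = 15 * 60  # rotate every ~15 minutes (the spec says 10-20)

class ProximityBeacon:
    """Toy sketch of a rotating random identifier.

    Not the real Exposure Notification crypto -- just a demonstration
    that each interval's identifier is fresh randomness, tied to
    nothing about the phone or its owner.
    """
    def __init__(self):
        self._id = secrets.token_bytes(16)
        self._rotated_at = time.monotonic()

    def current_id(self):
        # Generate a fresh random ID once the interval has elapsed.
        if time.monotonic() - self._rotated_at >= ROTATION_INTERVAL:
            self._id = secrets.token_bytes(16)
            self._rotated_at = time.monotonic()
        return self._id

beacon = ProximityBeacon()
print(beacon.current_id().hex())
```

Because consecutive identifiers share no mathematical relationship visible to an eavesdropper, a passive listener sees only disconnected random strings, which is the property the companies are selling as anonymity.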

So far, some high-profile privacy advocates have offered some support, but with caveats. The Electronic Privacy Information Center (EPIC) said the initiative “could be an important first step, but the technological, legal, and policy impacts of such a system must be closely scrutinized.”

EFF said the Google/Apple model is “engineered to reduce the privacy risks of Bluetooth proximity tracking, and it’s preferable to other strategies that depend on a central server,” but still comes with “unavoidable privacy tradeoffs.”

An open letter from more than 300 academics in 27 countries said the use of Bluetooth in any contact-tracing program was “strongly preferred,” but warned of “mission creep” that could still allow “unprecedented surveillance of society at large.”

Theater, not substance

But to some experts, the caveats are the point. They say the privacy provisions, while not worthless, are more theater than substance.

Sammy Migues, principal scientist at Synopsys, said that while Bluetooth makes “deanonymization a little harder,” neither Bluetooth nor “any other kind of magic ‘trust me’ dust” will deliver real anonymity.

“It’s not anonymous. Period,” he said. “Store cameras also record faces. The laser thermometer also has facial recognition technology. I’ll match that location data with red light cameras, toll road data, car GPS data, credit card transactions in the store, and so on. It’s anonymous to us but it’s not anonymous to those who can misuse the data.”

Ross Anderson, author and professor of security engineering at Cambridge University, in a post last month also said contact tracing in the real world isn’t anonymous. “COVID-19 is a notifiable disease so a doctor who diagnoses you must inform the public health authorities, and if they have the bandwidth they call you and ask who you’ve been in contact with. They then call your contacts in turn,” he said.

Beyond that are other, almost certain problems. The meaning of words like “minimal” when applied to the collection of personally identifiable information (PII) can get fuzzy. People can abuse just about any online platform. As Anderson notes, any voluntary app is “wide open to trolling. The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.”

Then there is the fact that while it may seem like it, not everybody has a smartphone. Millions don’t, and of those who do, perhaps millions more have older phones that won’t support Bluetooth proximity tracking. So coverage could be spotty.

Not to mention that there is no guarantee that when the current pandemic finally winds down and there is no longer a need for that kind of tracking, the system, and all the data, will simply be discarded — that it will be as if it had never existed. There is no guarantee that a government won’t build on a well-intentioned tool to stop the spread of a lethal disease and turn it into an intrusive surveillance tool.

Other contact-tracing apps are already being used for intrusive surveillance. It took an order by the Israeli Supreme Court to stop that government from enforcing quarantines by monitoring everybody’s phones so it could flag infected people who left home. And even that may return — the court order simply said it had to stop until there was legislation in place governing the practice.

Not new, but not at this scale

Not that contact tracing is new. It has been used for generations to map, and try to slow, the spread of contagions. But it has mostly been done manually. Even now, Massachusetts has about 1,000 people making phone calls to people who were reportedly close to anyone who has tested positive.

But compared to what is being proposed, manual tracing is haphazard, slow and incomplete. It depends on people remembering everywhere they went and who might have been less than six feet from them over the previous several days. How will they know who else was in grocery aisles 1–15? Or at the deli counter or the produce section? And some might not want to mention somebody they were close to.

A smartphone app is vastly more efficient and more comprehensive — it can collect interactions without the participants needing to remember a thing.

However, as noted, it also raises the specter of Big Brother. In some countries, that is the direction things seem to be heading. In the U.K., the initial version of a contact-tracing app expected to be rolled out in the next week or so won’t collect location data. But those who use it could be “asked” to “contribute” location data in future versions.

And as anyone who follows government knows, “ask” is almost always a euphemism. When politicians say they will “ask” the wealthy to pay more in taxes, they don’t mean it will be a request. It will be an order.

But according to Google, Apple and advocates of the pending platform, it doesn’t have to be that way. A FAQ from the two companies contends it is possible to get the benefits without invading the privacy of users.

Core privacy features

The core features of such a platform would include:

  • Bluetooth technology, which would not collect location data.
  • Anonymization: The system would use a “random, rotating identification scheme” for devices, designed to avoid collecting PII or identifying the device owner.
  • Decentralized data storage: The data would stay on users’ devices.
  • Voluntary: Users would have the choice to opt in or not. That means no coercion, such as “conditioning access to public benefits or services on consent to invasive surveillance,” in EPIC’s words.
  • Limited use: Contact tracing and nothing else. No extra features or services. No setting up an account. No use of data for analytics or advertising.
  • Limited time: Shut it down, completely, when the pandemic is over. No repurposing to track other diseases or to help law enforcement hunt for witnesses to a crime.

While such provisions are worthwhile and may bring some comfort to the app’s users, they come with some important caveats. Keep in mind that Google and Apple are building the API, not the app. It is possible that the governments in some countries will be less interested in privacy than others. It remains to be seen how much the API will control the privacy “features,” or lack of them, in those apps.

Also, just because the data storage is decentralized, that doesn’t mean only individual users will have access to it. The Google/Apple proposal makes numerous references to “public health authorities” having access. Which may be necessary, but it means the storage of the data will not be airtight.
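The decentralized matching model can be sketched as follows. This is a hypothetical simplification (the real system matches derived cryptographic keys, not raw identifiers): each phone keeps its log of heard identifiers locally, and the only thing downloaded from the health authority is the set of identifiers volunteered by diagnosed users, so the server never learns who met whom.

```python
# Identifiers heard over Bluetooth; this log never leaves the device.
observed_locally = {"aa11", "bb22", "cc33"}

def check_exposure(diagnosis_ids):
    """Match published diagnosis identifiers against the local log.

    The comparison happens on the phone itself -- the central server
    only publishes identifiers, it never receives the local log.
    """
    return observed_locally & set(diagnosis_ids)

# The health authority publishes identifiers of users who tested positive.
published = ["bb22", "dd44"]
matches = check_exposure(published)
print("possible exposure" if matches else "no exposure found")
```

The design choice is that the sensitive join between "who I met" and "who is infected" is computed client-side, which is precisely what distinguishes this scheme from the centralized-database approaches EFF criticizes.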

Joshua Berry, associate principal security consultant at Synopsys, said participants “should understand that these applications could expose some details about when and where they have been in the recent past with other users of the system. Even if a contact tracing application does not collect and share GPS location data, this data could be shared with other people as part of the contact tracing process.”

Another reality is that Bluetooth is not foolproof. There are limits to its “intelligence.” Just one example is that it might log a “contact” between two people who are within six feet of each other, but in adjacent apartments on opposite sides of a wall. Or sitting in traffic in separate cars with the windows closed.

It will also miss things. If an infected person coughs on a surface at the grocery, and another person comes along a minute later and touches it, that won’t be logged as a contact.

Privacy requires security

Then there is security. If data is to be kept private, it has to be kept secure. And the EFF blog post lists a number of ways skilled hackers could defeat the “rolling proximity identifiers” (RPIDs) that are meant to keep devices anonymous.

One example: The current version of the platform offers no way to verify that the device sending an RPID is actually the one that generated it, so trolls could collect and then rebroadcast RPIDs as their own.

“Imagine a network of Bluetooth beacons set up on busy street corners that rebroadcast all the RPIDs they observe. Anyone who passes by a ‘bad’ beacon would log the RPIDs of everyone else who was near any one of the beacons. This would lead to a lot of false positives,” EFF said.
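EFF's replay scenario is easy to simulate. The sketch below uses invented identifiers and a toy log to show the mechanism: because nothing authenticates that the device broadcasting an RPID is the one that generated it, a beacon can replay identifiers heard elsewhere and manufacture phantom contacts.

```python
import secrets

def rpid():
    # Stand-in for a rolling proximity identifier: 16 random bytes.
    return secrets.token_bytes(16).hex()

# Honest devices at two distant street corners broadcast their RPIDs.
corner_a = [rpid() for _ in range(3)]
corner_b = [rpid() for _ in range(3)]

# A malicious beacon at corner B replays everything it heard at corner A.
replayed = list(corner_a)

# A passer-by at corner B logs every identifier in radio range...
passerby_log = set(corner_b) | set(replayed)

# ...so if anyone from corner A later tests positive, the passer-by gets
# an exposure match despite never having been near them.
phantom_contacts = passerby_log & set(corner_a)
print(f"{len(phantom_contacts)} phantom contacts logged")
```

Nothing in the passer-by's log distinguishes a replayed identifier from a genuine nearby broadcast, which is why EFF predicts a flood of false positives from such beacons.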

And Migues notes that even if the system is “shut down” once the crisis has passed, “it doesn’t matter because there’s no new technology here. All the invasive techniques being proposed have almost certainly existed, and been used, for months or years. This event is simply a way to legitimize their introduction into the mainstream. If we take them out of the mainstream, they’ll just go back to the shadows, smarter and better able to hide themselves after this field trial.”

Also, as is demonstrated daily, no app or API is invulnerable. Just this past week in the U.K., the Sheffield City Council’s automatic number-plate recognition (ANPR) system exposed 8.6 million records of vehicle trips made by thousands of people.

Surveillance Camera Commissioner Tony Porter described the security lapse as “both astonishing and worrying.” To which he could easily have added, “almost routine.”

And that may be one reason why, at least so far in the U.S., people are wary of contact tracing, no matter how well intentioned. A survey by the Washington Post and University of Maryland found that nearly 60% of respondents said they would be unable or unwilling to participate. Experts say about 60% participation would be required for a system like this to work.

The reality, Migues said, is that “it will absolutely not be possible to prevent someone from eventually downloading all the data even if they have to do it one record at a time.”

“It’s never private and it’s not anonymous. Ever.”



Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.