Recently, Apple and Google, two of the world’s biggest tech firms, jointly devised a system of contact tracing for COVID-19. This contact tracing does not involve analyzing centralized stores of personal data. Rather, it leverages a proximity technology most often seen in retail stores and shopping centers, plus a peer-to-peer (P2P) communications concept that parallels methods explored for connected vehicles. The Apple-Google design is a fascinating departure from the conventional model of central collection and processing of personal data.

Coincidence… or Bluetooth?

You may have encountered mobile applications that ask for Bluetooth access, or received what seems like a strangely coincidental promotional email as you walked through the door of a store. This is no coincidence: retailers frequently use Bluetooth, among other methods, to determine where a customer is standing in a store and to trigger promotions. The practice is unregulated in most of the United States. We normally think of Bluetooth as a way for a “master” device (typically a computer, car, or audio source) to communicate with an “accessory” such as a keyboard, mouse, headphones, or hands-free set. As most users encounter the technology, it is a matter of “pairing” one device with another. But Bluetooth can run under numerous profiles that transmit a variety of data types. GPS-free location tracking was largely enabled by Bluetooth LE, which allows the radio to run on a mobile device without excessive battery drain. This eliminated a major inconvenience of prior versions of Bluetooth, and the practical effect is that it can remain “on” all the time. Many implementations of Bluetooth 4.0/LE allow range estimation between a transmitter and a receiver, typically by measuring received signal strength. A store, for example, can determine where a customer is standing by measuring the distances from the visitor’s phone to sensors in the store.
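To make the range-finding idea concrete, here is a minimal sketch of how a receiver can turn a Bluetooth signal-strength reading (RSSI) into a rough distance using the log-distance path-loss model. The calibration constants are illustrative assumptions, not values from any Apple or Google specification; real-world accuracy varies widely with devices and environment.

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Convert a received signal strength (dBm) into an approximate
    distance in meters via the log-distance path-loss model.

    tx_power_dbm is the expected RSSI at 1 meter and path_loss_exponent
    models the environment; both are illustrative assumptions here.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(round(estimate_distance_m(-59), 1))  # 1.0 (signal at calibrated power)
print(round(estimate_distance_m(-79), 1))  # 10.0 (weaker signal, farther away)
```

In practice, implementations average many readings and bucket the result into coarse ranges (“immediate,” “near,” “far”) rather than trusting a single point estimate.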

Contact Tracing – Proximity to the Problem

Many state governors have made effective contact tracing a prerequisite for transitioning from a COVID-19 lockdown to a more normal public life. Contact tracing has been widely used in Asia to track exposures to the coronavirus. The process determines, by proximity, contacts between infected and uninfected people. This can be done in many ways, the most primitive being mapping contact networks by hand.

The problem in some of the jurisdictions hardest hit by COVID-19 (such as the United States and Europe) is that techniques used in Asia, including massive trawling of mobile device data at points of concentration such as carriers, skip disclosures and consent and go straight to processing non-anonymized personal information. An example of such information is the absolute location of a user at a point in time, as derived from device GPS or cell tower triangulation. These techniques would be difficult for private parties to implement under the EU General Data Protection Regulation or the California Consumer Privacy Act; if done by the government in the United States, they could implicate Fourth Amendment concerns; and they could be a hard sell in cultures that put a high premium on freedom from surveillance.

Apple and Google recently found a workaround: eliminate the analysis of absolute location information derived from GPS and use P2P communications to track exposure. This resulted in a method of determining exposure and warning of exposures without:

(a) accumulating personal information that is regulated,

(b) creating a data store that would be attractive for law enforcement, or

(c) divulging the identity or health status of people who are nearby.

The solution turns out to be roughly analogous to the children’s card game “Go Fish,” in which neither player reveals what is in their hand, and information (a card) is handed over only after intentional action and a match. Even then, the Apple/Google solution goes a step further: delays and anonymization prevent a player from knowing which game in a day-long tournament produced a match, or even who the relevant opponent was. It just tells the player there was a match.

Based on Apple’s published specifications, the system works like this (and this is a simplification; there are multiple encryption and decryption stages in this process):

  • Person A and Person B each have a phone app that broadcasts rolling, anonymized proximity identifiers (“keys”) via Bluetooth to nearby phones, at a recommended interval of four to five times a second (every 200-270 ms). The keys change every 15 minutes (or, on some devices, at a random interval of 10-20 minutes). If Persons A and B come within a preset distance threshold of each other, Person A’s phone records the keys it receives from Person B’s phone, and vice versa. This creates a datastore on each phone of keys sent and keys received.
  • If Person A is later diagnosed with the coronavirus, they can voluntarily authorize the app on their phone to transmit the last 14 days of their outbound keys to a “diagnosis server.”
  • Person B’s phone periodically downloads information from the diagnosis server, breaks it out into individual keys, and determines whether any key matches one it previously received (such as a key from Person A’s phone). If there is a match, Person B receives a warning message about having been exposed, but is not told it was Person A.
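The three steps above can be sketched in a few lines of Python. This is an illustrative simplification, not the actual protocol: real rolling proximity identifiers are derived cryptographically from daily keys, whereas here each key is simply random bytes, and the “server” is a plain list.

```python
import secrets

class Phone:
    """A toy model of a participating device."""

    def __init__(self):
        self.sent_keys = []         # rolling keys this phone has broadcast
        self.received_keys = set()  # keys heard from nearby phones

    def broadcast(self):
        # Emit a fresh rolling proximity identifier (random here; derived
        # cryptographically from a daily key in the real design).
        key = secrets.token_bytes(16)
        self.sent_keys.append(key)
        return key

    def hear(self, key):
        self.received_keys.add(key)

    def report_diagnosis(self, server):
        # Step 2: voluntarily upload recent outbound keys after a diagnosis.
        server.extend(self.sent_keys)

    def check_exposure(self, server):
        # Step 3: download diagnosis keys and match locally, on-device.
        return any(k in self.received_keys for k in server)

diagnosis_server = []  # stores only opaque keys, never identities
a, b, c = Phone(), Phone(), Phone()

# Step 1: A and B come within range; each records the other's rolling key.
b.hear(a.broadcast())
a.hear(b.broadcast())

# A is later diagnosed and self-reports.
a.report_diagnosis(diagnosis_server)

print(b.check_exposure(diagnosis_server))  # True: B was near A
print(c.check_exposure(diagnosis_server))  # False: C never met A
```

Note that the matching happens entirely on Person B’s device; the server never learns who was exposed, only which opaque keys belong to self-reporting users.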

Anonymized data is kept on decentralized servers for 14 days, which in theory minimizes the impact of data breaches and discovery by law enforcement. The system will first be rolled out as an API in May for health care authorities and then as a service in the iOS and Android operating systems. It is unclear whether Apple or Google would provide any servers that would store data.
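The 14-day retention window can be sketched as a simple pruning rule, assuming keys are stored alongside timestamps (the actual server-side storage format is not public):

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=14)

def prune_keys(stored):
    """Drop any stored (timestamp, key) pair older than the 14-day
    retention window. A sketch of the retention policy only; the real
    servers' storage and deletion mechanics are not publicly specified."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [(ts, key) for ts, key in stored if ts >= cutoff]

now = datetime.now(timezone.utc)
keys = [(now - timedelta(days=20), b"expired-key"), (now, b"recent-key")]
print([key for _, key in prune_keys(keys)])  # [b'recent-key']
```

Short retention limits both the blast radius of a breach and the usefulness of the data to anyone who might subpoena it.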

What does it mean for privacy law?

The Apple-Google solution is elegant. In its purest state, no personal information is ever actually collected by a diagnosis server. The only collected information is essentially a string of keys that the server receives and then transmits back down to devices using the app. Matching is done on the participating device. Because the data is de-identified before it ever leaves the device of a person who self-reports an infection, it is not subject to GDPR and CCPA data protections. It is clear from Apple’s cryptography white paper that it would be difficult or impossible for law enforcement (or anyone else) to make much use of the information. And it might be attractive to users who worry that COVID-19 is a stepping stone to a government panopticon and would otherwise avoid participation.

But would it remain pure? It would be important in implementation that diagnosis servers capture only the keys of users who self-report, and not any other type of data that could identify a user or a household. Further, as the New York Times has reported, it is becoming increasingly easy to aggregate disparate data sources to determine the identity of an individual. For example, if a carrier could detect an outbound packet from the app to a diagnosis server during self-reporting, it might not be a stretch to figure out which subscriber had just self-reported. Even though this requires combining data sources, it makes careful implementation extremely important. Statutes such as the CCPA define de-identified (anonymous) information as that which “cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.” Cal. Civ. Code § 1798.140(h). Even if something is compliant today, technological developments may make it non-compliant in the future.

There are practical aspects of anonymity as well. If a person who receives a warning was in proximity to only a very small number of people, all of them known, the solution works from a legal standpoint but might not meet the privacy expectations of a self-reporting user. The API is capable of generating proximity reporting, such as the time spent in proximity and the time at which the contact occurred. Depending on how an app reports this (the specification states that the time “may have reduced precision, such as within one day of the actual time”), it may help the recipient of a warning narrow down the list of “suspects.”
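A hypothetical example shows why even day-level precision can de-anonymize: if the recipient of a warning keeps, or simply remembers, a sparse contact diary, a single exposure date may be fully identifying. The names and dates below are invented for illustration.

```python
# Hypothetical contact diary of a user who receives an exposure warning.
# On a day with only one recorded contact, day-level precision in the
# warning is enough to identify the self-reporting person.
diary = {
    "2020-05-04": ["Alice"],
    "2020-05-05": ["Bob", "Carol"],
}

def suspects(exposure_day):
    """Return everyone the user met on the reported exposure day."""
    return diary.get(exposure_day, [])

print(suspects("2020-05-04"))  # ['Alice'] - the warning is fully identifying
print(suspects("2020-05-05"))  # ['Bob', 'Carol'] - narrowed to two people
```

The legal definition of de-identification may be satisfied, yet the self-reporter’s anonymity collapses in practice whenever the recipient’s contact set is small.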

Aside from privacy law, there are practical considerations such as participation rate, false reporting, and unexpectedly long incubation periods. It is also not clear how effective the system would be if it were delivering millions of “positive” keys to every participating device every day. That said, in a public health crisis, the perfect cannot always be the enemy of the good.

Conclusion

The collision of COVID-19 and Western privacy expectations has presented some unique challenges for contact tracing. Real-time solutions, though preferable in an avoidance-of-infection use case, are not practical where a virus can be spread by asymptomatic people who have not been diagnosed. The Apple-Google solution is retrospective and helps users identify exposures they might not have been aware of. That said, the tool is being published for the use of third-party developers, who could implement it in ways that preserve its inherent privacy protections or degrade them.

For more information regarding this article, please contact Dante Stella.

For information regarding Dykema’s Privacy and Data Security Team, please contact Cindy Motley.

To sign up for Dykema’s Privacy and Data Security Blog e-mail updates, please click here.


As part of our service to you, we regularly compile short reports on new and interesting developments and the issues the developments raise. Please recognize that these reports do not constitute legal advice and that we do not attempt to cover all such developments. Rules of certain state supreme courts may consider this advertising and require us to advise you of such designation. Your comments are always welcome. ©2020 Dykema Gossett PLLC.