This year may prove to be one in which the concepts of privacy vis-à-vis the government and private concerns converge. In 2018, the United States Supreme Court ruled in Carpenter v. United States, 138 S. Ct. 2206 (2018), that individuals have an expectation of privacy in cell-site location records, and consequently, the government must obtain a warrant to retrieve that location data from a carrier. The 5-4 decision held that cell-tower data is subject to Fourth Amendment protections because it implicates an individual’s “legitimate expectation of privacy in the record of his physical movements.” The Court also noted that the data is “detailed, encyclopedic, and effortlessly compiled,” id. at 2216, and that functioning in modern society does not allow people simply to opt out of using mobile devices:

In the first place, cell phones and the services they provide are “such a pervasive and insistent part of daily life” that carrying one is indispensable to participation in modern society. Second, a cell phone logs a cell-site record by dint of its operation, without any affirmative act on the part of the user beyond powering up. Virtually any activity on the phone generates CSLI [cell site location information], including incoming calls, texts, or e-mails and countless other data connections that a phone automatically makes when checking for news, weather, or social media updates. Apart from disconnecting the phone from the network, there is no way to avoid leaving behind a trail of location data. As a result, in no meaningful sense does the user voluntarily “assume[ ] the risk” of turning over a comprehensive dossier of his physical movements.

Id. at 2220 (internal citations omitted).

Although future Supreme Court decisions may alter or limit Carpenter, the decision’s effectiveness in protecting individual rights could be in more immediate danger where law enforcement is not issuing legal process to obtain personal information but instead buying it on the open market. Two such mechanisms have come to light in the past few months, and both threaten to spill over into civil liability for businesses.

Public Purchases of Private Data – Places

The first phenomenon impacting Carpenter is the federal government’s purchases, for several years now, of cellphone location data for law enforcement. As the Wall Street Journal reported last month, federal agencies have purchased access from a company called Venntel to a virtual trove that charts the movements of millions of citizens’ cellphones. This practice was occurring as early as 2017 and continued after Carpenter was decided, raising the question of whether such purchases now serve as an end run around the decision’s warrant requirement. Venntel’s source of location data is supposedly anonymized location information of the type that analytics companies typically sell to advertisers, with consents usually given on an app-by-app basis. According to the WSJ report, the Department of Homeland Security acknowledged purchasing $190,000 worth of data in 2018 alone, but maintained that the data was “anonymous.” The report also noted that this location data was used in at least one immigration bust in San Luis, Arizona.

That “anonymous” data could be so useful to law enforcement strongly suggests that “anonymized” data may not remain anonymous forever. The New York Times published a special report on December 10 concluding that anonymous location data does not always remain anonymous; one user, for example, had her location tracked 8,600 times in four months, tracing her daily path repetitively between the same points. The report noted that the very patterns that make this data valuable for advertising, when linked through a single ID or profile and combined over time with other data such as public records, can serve as a means of identification.
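
To make the re-identification mechanics concrete, consider the minimal sketch below. The data rows, the advertising ID, and the day/night heuristics are all hypothetical; real feeds contain millions of rows, but the inference works the same way: a persistent ID plus repeated visits yields a probable home and workplace, which can then be matched against property or employment records.

```python
from collections import Counter
from datetime import datetime

# Hypothetical feed: (ad_id, timestamp, lat, lon).
# Rounded coordinates stand in for map cells.
pings = [
    ("a1b2", "2019-11-02T01:10", 42.331, -83.045),  # overnight -> likely home
    ("a1b2", "2019-11-02T23:40", 42.331, -83.045),
    ("a1b2", "2019-11-03T02:05", 42.331, -83.045),
    ("a1b2", "2019-11-04T10:15", 42.355, -83.070),  # workday -> likely office
    ("a1b2", "2019-11-05T11:30", 42.355, -83.070),
    ("a1b2", "2019-11-06T14:00", 42.355, -83.070),
]

def top_location(rows, night):
    """Most frequent coordinate seen overnight (home) or during the workday (office)."""
    def in_window(ts):
        hour = datetime.fromisoformat(ts).hour
        return (hour >= 21 or hour < 6) if night else (8 <= hour < 18)
    counts = Counter((lat, lon) for _, ts, lat, lon in rows if in_window(ts))
    return counts.most_common(1)[0][0]

home = top_location(pings, night=True)
office = top_location(pings, night=False)
# A probable home/office pair, joined against public records, is often
# enough to put a name on the "anonymous" device.
print(f"ad_id a1b2: home={home}, office={office}")
```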

This calls into question whether a business relationship between private concerns and the government can sidestep expectations of privacy, both under the Fourth Amendment and under civil statutes like the California Consumer Privacy Act (“CCPA”). Carpenter was decided on the basis that mobile phone use, and therefore sharing location via cell-tower connections, is essentially nonconsensual. If (or more likely, when) the Venntel situation comes under judicial review, a likely point of contention will be whether the sometimes-vague disclosures in the privacy policies of seemingly innocuous apps (weather or shopping, for example) are sufficient to serve as consent to share (identifiable) location information. Data aggregators like Venntel collect information from the very types of activities identified in Carpenter: weather, social media, and news. And although the data comes from multiple sources (apps), its combination can be just as “detailed, encyclopedic, and effortlessly compiled” as the cell-tower data in Carpenter.

Even in civil litigation, the ability to reconstruct individual identities, probable individual identities, or household identities from “anonymous” information could become a focus under the CCPA, a statute whose touchstone for regulated personal information is whether the information in question “could reasonably be linked, directly or indirectly, with a particular consumer or household.” Cal. Civ. Code § 1798.140(o)(1). The same reasonableness-of-association test applies to “deidentified” (anonymized) information under section 1798.140(h). Legislatures are also acting: last year, Utah enacted a law that requires law enforcement agencies to obtain a warrant for location information, among other things.

Businesses would be well advised to examine their privacy disclosures as well as any underlying arrangements by which location data may be sold or transferred to third parties, particularly where the third parties reserve the right to use “anonymized” data for the third party’s own purposes. Not all data can be made equally anonymous, and not all methods of anonymization are equally effective. Aggregating location data across multiple individuals and timeframes may be a solution, albeit one that could significantly diminish the usefulness of the data. How data is aggregated is also a fair question to put to service providers, since the method of aggregation and the extent of the underlying data set affect how readily the output can be re-linked to individuals; a simple illustration follows.
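
For businesses weighing aggregation as a mitigation, the sketch below illustrates one common approach, threshold-based aggregation: suppressing any location cell observed for fewer than k distinct devices. The k value, the cell labels, and the input rows are hypothetical, and real programs tune these parameters to the data, but the sketch shows the trade-off described above: the higher the threshold and the coarser the cells, the safer, and the less useful, the output.

```python
from collections import defaultdict

K = 5  # suppress any cell observed for fewer than K distinct devices

# Hypothetical feed: (ad_id, map_cell, hour_bucket).
pings = [
    ("a1", "cell-14", "08"), ("a2", "cell-14", "08"), ("a3", "cell-14", "08"),
    ("a4", "cell-14", "08"), ("a5", "cell-14", "08"),  # 5 devices: reportable
    ("a1", "cell-97", "02"),                           # 1 device: suppressed
]

devices_per_cell = defaultdict(set)
for ad_id, cell, hour in pings:
    devices_per_cell[(cell, hour)].add(ad_id)

# Publish only device counts, and only for cells that clear the threshold;
# the lone 2 a.m. ping (one person's trail) never appears in the output.
report = {key: len(ids) for key, ids in devices_per_cell.items() if len(ids) >= K}
print(report)  # {('cell-14', '08'): 5}
```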

Public Purchases of Private Data – Faces

The second story also involves public-private relationships. Clearview AI is a secretive private concern that collects massive quantities of facial-recognition data (reported to be three billion photos so far) and makes it available for sale to law enforcement (and to certain “special friends” in the civilian population). Clearview AI’s activities came to light in a New York Times report on January 18, and the company has since suffered a massive data breach in which its entire client list was stolen. Clearview AI’s core business has not only raised the eyebrows of members of Congress, but also the ire of Google, Facebook, and Microsoft, which have sent the company cease-and-desist letters demanding that it stop scraping pictures and data from their platforms.

Although this does not present as obvious a Fourth Amendment issue as Venntel does, the Clearview AI business model exploits publicly available web information that was put on the open internet only by virtue of consents given to platforms (such as Facebook); neither those platforms nor their users have any contractual privity with Clearview AI. The technology can be paired with augmented-reality (AR) glasses that would allow the user to identify strangers in real time. This represents a significant step up from the facial-recognition technology previously available to law enforcement, which was limited to static images.

The type of information generated by Clearview AI presents a world of possibilities and dangers: faster resolution of police investigations, automated dystopian surveillance, improper use by stalkers, or intriguing uses by super-spies. Some jurisdictions, such as the City of San Francisco, have banned the use of facial-recognition technology by law enforcement. The ACLU has argued that failure to disclose the “negatives” (rejected possible matches) used in facial recognition is a denial of the accused’s right to favorable evidence, and has separately demonstrated the technology’s fallibility in a test that falsely matched 28 members of Congress to mugshot photos. The proper and improper uses of facial recognition, like fingerprint and DNA evidence before it, will likely be a subject of constitutional litigation for some time.

But at a more basic level, enterprises that deal with photos may be unwittingly exposing their customers to data aggregators like Clearview AI, and businesses that purchase services like Clearview AI’s could find themselves regulated by, or sued under, the CCPA and the Illinois Biometric Information Privacy Act.

The Upshot

The law governing personal information in the United States is still developing rapidly, major data aggregators have broad service offerings, and emerging technologies are genuinely impressive in their ability to locate and identify people. Despite the fascinating promise, businesses should proceed with caution in developing data-sharing arrangements and consumer disclosures. Data is increasingly collected, combined, and used in novel and untested ways, and seemingly distant law enforcement activities could unexpectedly expose private concerns to civil litigation.

For more information regarding this article, please contact Dante Stella.

For information regarding Dykema’s Privacy and Data Security Team, please contact Cindy Motley.

To sign up for Dykema’s Privacy and Data Security Blog e-mail updates, please click here.


As part of our service to you, we regularly compile short reports on new and interesting developments and the issues the developments raise. Please recognize that these reports do not constitute legal advice and that we do not attempt to cover all such developments. Rules of certain state supreme courts may consider this advertising and require us to advise you of such designation. Your comments are always welcome. ©2020 Dykema Gossett PLLC.