Category: Privacy (Electronic Surveillance)


The Vanishing Distinction Between Real-time and Historical Location Data

A congressional inquiry recently revealed that cell phone carriers disclose a huge amount of subscriber information to the government, heightening concern that Big Brother is tracking our cell phones. The New York Times reported that, in 2011, carriers responded to 1.3 million law enforcement demands for cell phone subscriber information, including text messages and location information. Because each request can acquire information on multiple people, law enforcement agencies have clearly obtained such information about many more of us than could possibly be worthy of suspicion. Representative Markey, who spearheaded the inquiry, has followed up with a thorough letter to Attorney General Holder asking how the Justice Department could possibly protect privacy and civil liberties while acquiring such a massive amount of information.

Among many important questions, Representative Markey’s letter asks whether the DOJ continues to distinguish legally between historical cell site location information (produced from carrier records) and real-time information (produced after an order is issued), and what legal standard it meets for each. Traditionally, courts have accorded less protection to historical location data, a practice I have criticized as a matter of Fourth Amendment law in my amicus briefs and in my scholarship. The government’s applications for historical data in the Fifth Circuit case, which is currently considering whether agents seeking historical location data must obtain a warrant, provide additional evidence that the distinction between real-time and historical location data makes no sense.

Some background. Under the current legal rules for location acquisition by law enforcement, which are complex, confusing, and contested, law enforcement agents have generally been permitted to acquire historical location data without establishing probable cause and obtaining a warrant. Instead, they have had to demonstrate that the records are relevant to a law enforcement investigation, which can dramatically widen the scope of an inquiry beyond those actually suspected of criminal activity and yield the large number of disclosures that the recent congressional inquiry revealed. Generally, prospective (real-time) location information has required a higher standard, often a warrant based on probable cause, which has made it more burdensome to acquire and therefore more protected against excessive disclosure.

Some commentators and judges have questioned whether historical location data should be available on an easier-to-satisfy standard, positing the hypothetical that law enforcement agents could wait just a short time for real-time information to become a record, and then request it under the lower standard. Doing so would clearly be an end run around both the applicable statute (ECPA) and the Fourth Amendment, which arguably accord less protection to historical information because it is stored as an ordinary business record, not because of the fortuity that it is stored for a short period of time.

It turns out that this hypothetical is more than just the product of concerned people’s imagination. The three applications in the Fifth Circuit case requested that stored records be created on an ongoing basis. For example, just after a paragraph that requests “historical cell-site information… for the sixty (60) days prior” to the order, one application requests “For the Target Device, after receipt and storage, records of other information… provided to the United States on a continuous basis contemporaneous with” the start or end of a call, or during a call if that information is available. The other two applications clarify that “after receipt and storage” is “intended to ensure that the information” requested “is first captured and recorded by the provider before being sent.” In other words, the government is asking the carrier to create stored records and then send them on as soon as they are stored.

To be clear, only one of the three applications sought a mere relevance-based court order to obtain the continuously created stored data. That court order, used for historical data, has never been deemed sufficient for forward-looking data (as the continuously created data surely would be, since it would be generated after the order issued). The other two applications used a standard less than probable cause but more than mere relevance. It is not clear whether the request for forward-looking data under the historical standard was an inadvertent mistake or an attempt to mislead. But applications in other cases have asked much more clearly for forward-looking prospective data, without requiring that the data be momentarily stored. Why would the applications in this case request temporary storage if not, at least in part, to encourage the judge considering the application to grant it on a lower standard?

I am optimistic that the DOJ’s response to Representative Markey’s letter will yield important information about current DOJ practices and will further spur reform. In the meantime, the government’s current practice of using this intrusive tool to gather too much information about too many people cries out for formal legal restraint. Congress should enact a law requiring a warrant based on probable cause for all location data. It should not codify a meaningless distinction between historical and real-time data that further confuses judges and encourages manipulative behavior by the government.


Social Media and Chat Monitoring

Suppose a system could help alert people to online sexual predators. Many might like that. But suppose that same system could be used to look for gun purchasers, government critics, or activists of any sort; what would we say then? The tension between these possibilities is before us. Mashable reports that Facebook and other platforms are now monitoring chats for suspected criminal activity. The article focuses on the child predator use case. Words are scanned for danger signals. Then “The software pays more attention to chats between users who don’t already have a well-established connection on the site and whose profile data indicate something may be wrong, such as a wide age gap. The scanning program is also ‘smart’ — it’s taught to keep an eye out for certain phrases found in the previously obtained chat records from criminals including sexual predators.” After a flag is raised, a person decides whether to notify police. The article does not discuss other uses of such a system. Yet again, we smash our heads against the walls of speech, security, and privacy. I expect some protests and some support for the move. Blood may spill on old battlegrounds. Nonetheless, I think the problems the practice creates merit the fight. Given the privacy harms and the speech harms, even if false positives in the sexual predator realm are rare, we should sort out why a company gets to decide whether to notify police, how the system might be co-opted for other uses, and the effect on people’s ability to talk online before social platforms implement monitoring systems at scale.
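To make the mechanics concrete, here is a minimal sketch of the kind of rule-based monitor the article describes: a phrase scan weighted by relationship heuristics, with a human left to make the final call. Every phrase, threshold, and function name here is my invention for illustration; the actual systems are proprietary and doubtless far more sophisticated.

```python
# Hypothetical sketch of keyword-plus-heuristics chat flagging.
# All phrases, weights, and thresholds are invented for illustration.

FLAGGED_PHRASES = {"meet me alone", "don't tell your parents"}  # illustrative only

def risk_score(message: str, age_gap_years: int, prior_chats: int) -> float:
    """Combine phrase matches with metadata heuristics into a crude score."""
    score = 0.0
    text = message.lower()
    # Phrases drawn from prior chat records carry the most weight.
    score += sum(2.0 for phrase in FLAGGED_PHRASES if phrase in text)
    if age_gap_years > 10:   # a wide age gap raises suspicion
        score += 1.0
    if prior_chats < 3:      # no well-established connection on the site
        score += 1.0
    return score

def needs_human_review(message: str, age_gap_years: int,
                       prior_chats: int, threshold: float = 3.0) -> bool:
    """The system only flags; a person decides whether to notify police."""
    return risk_score(message, age_gap_years, prior_chats) >= threshold
```

Note that repurposing this machinery for the other uses the post worries about would take nothing more than swapping in a different phrase list, which is precisely the concern.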


Lend me your ears, no really. I need them to ID you.

Researcher Mark Nixon at the University of Southampton “believes that using photos of individual ears matched against a comparative database could be as distinctive a form of identification as fingerprints.”

According to the University’s news site the claim is that: “Using ears for identification has clear advantages over other kinds of biometric identification, as, once developed, the ear changes little throughout a person’s life. This provides a cradle-to-grave method of identification.”

OK, so they are not taking ears. The method involves cameras, scans, and techniques you may know from facial recognition. This article has a little more detail. As an A.I. system it is probably pretty cool. Still, it sounds odd enough that I wonder whether this work has considered the whole piercing and large-gauge trend. I can imagine security checkpoints that now require removing ear decorations, whatever they are made of. And if the technique is really used for less invasive ID, will wearing earmuffs be cause to think someone is hiding something, or should we remember that folks get cold? For the sci-fi inclined, I bet a movie will entail cutting off an ear to fake an identity, just as past films have involved cutting off fingers and hands.


Big Data Brokers as Fiduciaries

In a piece entitled “You for Sale,” Sunday’s New York Times raised important concerns about the data broker industry.  Let us add some more perils and seek to reframe the debate about how to regulate Big Data.

Data brokers like Acxiom (and countless others) collect and mine a mind-boggling array of data about us, including Social Security numbers, property records, public-health data, criminal justice sources, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, online musings, browsing habits culled by behavioral advertisers, and the gold mine of drug- and food-store records.  They scrape our social network activity, which, with a little mining, can reveal our undisclosed sexual preferences, religious affiliations, political views, and other sensitive information.  They may integrate video footage of our offline shopping.  With the help of facial-recognition software, data mining algorithms factor into our dossiers the over-the-counter medicines we pick up, the books we browse, and the pesticides we contemplate buying for our backyards.  Our social media influence scores may make their way into the mix.  Companies such as Klout measure our social media influence, usually on a scale from one to 100.  They use variables like the number of our social media followers, the frequency of our updates, and the number of likes, retweets, and shares we attract.  What’s being tracked and analyzed about our online and offline behavior is accelerating – with no sign of slowing down and no assured way to find out.
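For readers curious what a score built from those variables might look like, here is a toy sketch. The weights, the logarithmic scaling of followers, and the clamping to a 1-to-100 range are all my assumptions; actual scoring models like Klout's are proprietary and undisclosed.

```python
import math

def influence_score(followers: int, updates_per_week: float,
                    likes: int, retweets: int, shares: int) -> float:
    """Toy social-influence score on a 1-100 scale, built from the
    variables the post lists. Weights are invented for illustration."""
    raw = (math.log1p(followers)              # diminishing returns on followers
           + 0.5 * updates_per_week           # reward frequent updates
           + 0.1 * (likes + retweets + shares))  # engagement signals
    return max(1.0, min(100.0, raw))          # clamp to the 1-100 scale
```

The opacity is the point: because neither the inputs nor the weights are disclosed, there is no assured way to find out why one's score is what it is.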

As the Times piece notes, businesses buy data-broker dossiers to classify those consumers worth pursuing and those worth ignoring (so-called “waste”).  More often those already in an advantaged position get better deals and gifts while the less advantaged get nothing.  The Times piece rightly raised concerns about the growing inequality that such use of Big Data produces.  But far more is at stake.

Government is a major client for data brokers.  More than 70 fusion centers mine data-broker dossiers to detect crimes, “threats,” and “hazards.”  Individuals are routinely flagged as “threats.”  Such classifications make their way into the “information-sharing environment,” with access provided to local, state, and federal agencies as well as private-sector partners.  Troublingly, data-broker dossiers have no quality assurance.  They may include incomplete, misleading, and false data.  Let’s suppose a data broker has amassed a profile on Leslie McCann.  Social media scraped, information compiled, and videos scanned about “Leslie McCann” might include information about jazz artist “Les McCann” as well as information about a criminal with a similar name and age.  Inaccurate Big Data has led to individuals’ erroneous inclusion on watch lists, denial of immigration applications, and loss of public benefits.  Read More


Cybersecurity Legislation and the Privacy and Civil Liberties Oversight Board

Along with many other privacy folks, I have a lot of concerns about the cybersecurity legislation moving through Congress.  I had an op-ed in The Hill yesterday going through some of those concerns, notably the problems with the overbroad “information sharing” provisions.

Writing the op-ed, though, prompted me to highlight one positive step that should happen in the course of the cybersecurity debate.  The Privacy and Civil Liberties Oversight Board was designed in large part to address information sharing.  This past Wednesday, the Senate Judiciary Committee held a hearing to consider the bipartisan slate of five nominees.

Here’s the point.  The debate on CISPA and other cybersecurity legislation has highlighted all the information sharing that is going on already and that may be going on in the near future.  The PCLOB is the institution designed to oversee problems with information sharing.  So let’s confirm the nominees and get the PCLOB up and running as soon as possible.

The quality of the nominees is very high.  David Medine, nominated to be Chair, helped develop the FTC’s privacy approach in the 1990s and has worked on privacy compliance since, so he knows what should be done and what is doable.  Jim Dempsey has been at the Center for Democracy and Technology for over 15 years and is a world-class expert on government, privacy, and civil liberties.  Pat Wald is the former Chief Judge of the DC Circuit.  Her remarkably distinguished career includes major experience on international human rights issues.  I don’t have experience with the other two nominees, but the hearing exposed no red flags for any of them.

The debates about cybersecurity legislation show the centrality of information sharing to how government will respond to cyber-threats.  So we should have the institution in place to make sure that the information sharing is done in a lawful and sensible way, to be effective and also to protect privacy and civil liberties.


Stanford Law Review Online: The Dead Past

Stanford Law Review

The Stanford Law Review Online has just published Chief Judge Alex Kozinski’s Keynote from our 2012 Symposium, The Dead Past. Chief Judge Kozinski discusses the privacy implications of our increasingly digitized world and our role as a society in shaping the law:

I must start out with a confession: When it comes to technology, I’m what you might call a troglodyte. I don’t own a Kindle or an iPad or an iPhone or a Blackberry. I don’t have an avatar or even voicemail. I don’t text.

I don’t reject technology altogether: I do have a typewriter—an electric one, with a ball. But I do think that technology can be a dangerous thing because it changes the way we do things and the way we think about things; and sometimes it changes our own perception of who we are and what we’re about. And by the time we realize it, we find we’re living in a different world with different assumptions about such fundamental things as property and privacy and dignity. And by then, it’s too late to turn back the clock.

He concludes:

Judges, legislators and law enforcement officials live in the real world. The opinions they write, the legislation they pass, the intrusions they dare engage in—all of these reflect an explicit or implicit judgment about the degree of privacy we can reasonably expect by living in our society. In a world where employers monitor the computer communications of their employees, law enforcement officers find it easy to demand that internet service providers give up information on the web-browsing habits of their subscribers. In a world where people post up-to-the-minute location information through Facebook Places or Foursquare, the police may feel justified in attaching a GPS to your car. In a world where people tweet about their sexual experiences and eager thousands read about them the morning after, it may well be reasonable for law enforcement, in pursuit of terrorists and criminals, to spy with high-powered binoculars through people’s bedroom windows or put concealed cameras in public restrooms. In a world where you can listen to people shouting lurid descriptions of their gall-bladder operations into their cell phones, it may well be reasonable to ask telephone companies or even doctors for access to their customer records. If we the people don’t consider our own privacy terribly valuable, we cannot count on government—with its many legitimate worries about law-breaking and security—to guard it for us.

Which is to say that the concerns that have been raised about the erosion of our right to privacy are, indeed, legitimate, but misdirected. The danger here is not Big Brother; the government, and especially Congress, have been commendably restrained, all things considered. The danger comes from a different source altogether. In the immortal words of Pogo: “We have met the enemy and he is us.”

Read the full article, The Dead Past by Alex Kozinski, at the Stanford Law Review Online.


Fifth Circuit Considers Constitutionality of Cell Site Location Data

Department of Justice litigators just filed a reply brief in an exciting but complex case in the Fifth Circuit that concerns law enforcement access to cell site location data.  As amicus curiae, I hope to deepen readers’ understanding of the basic issues in the case and also to provide some insider’s insights.  This blog post will furnish the background that later postings will draw upon.

The litigation began when Magistrate Judge Smith rejected three government applications for cell site location data that did not purport to satisfy probable cause.  I highly recommend Judge Smith’s thoughtful opinion, which holds that agents must obtain a warrant to compel service providers to disclose a target subscriber’s stored records of cell phone location data.  Justice Department lawyers appealed Judge Smith’s denial, as well as the District Court’s order agreeing with him, because they claim the right to compel disclosure whenever they satisfy the “relevance standard” under 18 U.S.C. § 2703(d) (a “D order”).

My amicus brief argues that the Fourth Amendment requires a probable cause warrant for all location data, which is similar to the argument in EPIC’s amicus brief.  EFF and ACLU made that argument as well, and they also suggested that the Fifth Circuit could find that the Stored Communications Act gives magistrate judges the discretion to require either a warrant or a D order.  EFF and ACLU previously advocated the discretionary approach in the Third Circuit, and the Third Circuit recently adopted it in the only federal appellate decision on the matter.  Orin Kerr’s amicus brief argued that magistrate judges lack the authority to deny government applications on the grounds of unconstitutionality.

The case’s importance derives from the lack of appellate guidance on law enforcement acquisition of cell site location data, which has become commonplace, according to the ACLU’s recent release of numerous public records.  The ACLU’s report reveals a wide array of procedures, with some practices clearly lacking appropriate protections against misuse.  Congress currently sits on bills that would clarify the standards; one would require a warrant for access to all location data; another would require a warrant only for prospective location data and not for stored data.

In future postings I will discuss how low procedural hurdles, as well as a lack of notice and transparency, make location data acquisition a threat to civil liberties.  I will also discuss the continued use of arbitrary distinctions (such as between historical and prospective data) that unduly complicate the law and limit privacy protections.  I will argue, for example, that the Supreme Court’s Jones case concerning real-time GPS tracking governs historical location data as well.  With luck, I will even shed some light on just what location data is.  Please stay tuned.


Ravi Trial Verdict for Invading the Privacy of Clementi

Dharun Ravi was found guilty of invasion of privacy for using a webcam to watch, and broadcast online, his roommate Tyler Clementi’s intimate activities with another man in their shared dorm room.  From CNN:

A former Rutgers University student accused of spying on and intimidating his gay roommate by use of a hidden webcam was found guilty on all counts, including invasion of privacy and the more severe charges of bias intimidation, in a case that thrust cyberbullying into the national spotlight.

Dharun Ravi, 20, could now face up to 10 years in jail and deportation to his native India. He was also found guilty of witness tampering, hindering apprehension and tampering of physical evidence.

The jury was confronted with a series of questions on each charge. Though it found Ravi not guilty on several questions within the verdict sheet, because he was found guilty on at least one question on each main count, he could now face the maximum penalty.

From ABC News:

A New Jersey jury today found former Rutgers student Dharun Ravi guilty on all counts for using a webcam to spy on his roommate, Tyler Clementi, having a gay sexual encounter in 2010.

Ravi, 20, was convicted of invasion of privacy, bias intimidation, witness tampering and hindering arrest, stemming from his role in activating the webcam to peek at Clementi’s date with a man in the dorm room on Sept. 19, 2010. Ravi was also convicted of encouraging others to spy during a second date, on Sept. 21, 2010, and intimidating Clementi for being gay.

Ravi was found not guilty of some subparts of the 15 counts of bias intimidation, attempted invasion of privacy, and attempted bias intimidation, but needed only to be found guilty of one part of each count to be convicted.

I blogged about this case here and here and here.

Here is New Jersey’s invasion of privacy statute:

Read More

Symposium on Configuring the Networked Self: Cohen’s Methodological Contributions

Julie Cohen’s extraordinarily illuminating book Configuring the Networked Self makes fundamental contributions to the field of law and technology. In this post, I’d like to focus on methodology and theory (a central concern of Chapters 1 to 4). In another post, I hope to turn to the question of realizing Cohen’s vision of human flourishing (a topic Chapters 9 and 10 address most directly).

Discussions of rights and utility dominate the intellectual property and privacy literatures. Cohen argues that their appeal can be more rhetorical than substantive. As she has stated:

[T]he purported advantage of rights theories and economic theories is neither precisely that they are normative nor precisely that they are scientific, but that they do normative work in a scientific way. Their normative heft derives from a small number of formal principles and purports to concern questions that are a step or two removed from the particular question of policy to be decided. . . . These theories manifest a quasi-scientific neutrality as to copyright law that consists precisely in the high degree of abstraction with which they facilitate thinking about processes of cultural transmission.

Cohen notes “copyright scholars’ aversion to the complexities of cultural theory, which persistently violates those principles.” But she feels they should embrace it, given that it offers “account[s] of the nature and development of knowledge that [are] both far more robust and far more nuanced than anything that liberal political philosophy has to offer. . . . [particularly in understanding] how existing knowledge systems have evolved, and how they are encoded and enforced.”

A term like “knowledge system” may itself seem very abstract and formal. But Cohen’s work insists on a capacious view of network-enabled forms of knowing. Rather than naturalizing and accepting as given the limits of copyright and privacy law on the dissemination of knowledge, she can subsume them into a much broader framework of understanding where “knowing” is going. That framework includes cultural practices, norms, economics, and bureaucratic processes, as well as law.
Read More


Stanford Law Review Online: The Privacy Paradox 2012 Symposium Issue

Stanford Law Review

Our 2012 Symposium Issue, The Privacy Paradox: Privacy and Its Conflicting Values, is now available online:


The text of Chief Judge Alex Kozinski’s keynote is forthcoming.