Category: Privacy (Electronic Surveillance)


FAN (First Amendment News, Special Series): Newseum Institute to Host Event on Cell Phone Privacy vs. National Security Controversy


Starting today and continuing through mid-June, I will post a special series of occasional blogs related to the Apple iPhone national security controversy and the ongoing debate surrounding it, even after the FBI gained access to the phone used by one of the attackers in the December 2015 shooting in San Bernardino, California.

Gene Policinski

This special series is done in conjunction with the Newseum Institute and a major program the Institute will host on June 15, 2016 in Washington, D.C.

I am pleased to be working with Gene Policinski (the chief operating officer of the Newseum Institute) and Nan Mooney (a D.C. lawyer and former law clerk to Chief Judge James Baker of the U.S. Court of Appeals for the Armed Forces) in organizing the event.

The June 15th event will be a moot court with seven Supreme Court Justices and two counsel for each side. The focus will be on the First Amendment issues raised in the case. (See below for links to the relevant legal documents.)

→ Save the Date: Wednesday, June 15, 2016 @ 2:00 p.m., Newseum, Washington, D.C. (more info forthcoming).

The Apple-FBI clash was the first significant skirmish — and probably not much more than that — of the Digital Age conflicts we’re going to see in this century around First Amendment freedoms, privacy, data aggregation and use, and even the extent of religious liberty. As much as the eventual outcome, we need to get the tone right, from the start — freedom over simple fear. –Gene Policinski

Newseum Institute Moot Court Event

It remains a priority for the government to ensure that law enforcement can obtain crucial digital information to protect national security and public safety, either with cooperation from relevant parties, or through the court system when cooperation fails. — Melanie Newman (spokeswoman for the Justice Department, March 28, 2016)

As of this date, the following people have kindly agreed to participate as Justices for a seven-member Court:

The following two lawyers have kindly agreed to serve as the counsel (2 of 4) who will argue the matter:

→ Two additional Counsel to be selected.  

Nan Mooney and I will say more about both the controversy and the upcoming event in the weeks ahead in a series of special editions of FAN. Meanwhile, below is some relevant information, which will be updated regularly.

Apple vs FBI Director James Comey

President Obama’s Statement

Congressional Hearing



Last Court Hearing: March 22, 2016, before Judge Sheri Pym



News Stories & Op-Eds


  1. Pierre Thomas & Mike Levine, “How the FBI Cracked the iPhone Encryption and Averted a Legal Showdown With Apple,” ABC News, March 29, 2016
  2. Bruce Schneier, “Your iPhone just got less secure. Blame the FBI,” Washington Post, March 29, 2016
  3. Katie Benner & Eric Lichtblau, “U.S. Says It Has Unlocked Phone Without Help From Apple,” New York Times, March 28, 2016
  4. John Markoff, Katie Benner & Brian Chen, “Apple Encryption Engineers, if Ordered to Unlock iPhone, Might Resist,” New York Times, March 17, 2016
  5. Jesse Jackson, “Apple Is on the Side of Civil Rights,” Time, March 17, 2016
  6. Katie Benner & Eric Lichtblau, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” New York Times, March 15, 2016
  7. Kim Zetter, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” Wired, March 15, 2016
  8. Alina Selyukh, “Apple On FBI iPhone Request: ‘The Founders Would Be Appalled,’” NPR, March 15, 2016
  9. Howard Mintz, “Apple takes last shot at FBI’s case in iPhone battle,” San Jose Mercury News, March 15, 2016
  10. Russell Brandom & Colin Lecher, “Apple says the Justice Department is using the law as an ‘all-powerful magic wand‘,” The Verge, March 15, 2016
  11. Adam Segal & Alex Grigsby, “3 ways to break the Apple-FBI encryption deadlock,” Washington Post, March 14, 2016
  12. Seung Lee, “Former White House Official Says NSA Could Have Cracked Apple-FBI iPhone Already,” Newsweek, March 14, 2016
  13. Tim Bajarin, “The FBI’s Fight With Apple Could Backfire,” PC, March 14, 2016
  14. Alina Selyukh, “U.S. Attorneys Respond To Apple In Court, Call Privacy Concerns ‘A Diversion’,” NPR, March 10, 2016
  15. Dan Levine, “San Bernardino victims to oppose Apple on iPhone encryption,” Reuters, Feb. 22, 2016
  16. “Apple, The FBI And iPhone Encryption: A Look At What’s At Stake,” NPR, Feb. 17, 2016

Unequal Exposure

Towards the end of the breathless and impassioned tour through privacy, surveillance, carcerality, and desire that is Exposed, Bernard Harcourt writes that “the emphasis on what we must do as ethical selves, each and every one of us – us digital subjects – may be precisely what is necessary for us to begin to think of ourselves as we. Yes, as that we that has been haunting this book since page one” (283). The call for unity and solidarity is seductive: if “we” are all exposed and vulnerable, then “we” can all resist and demand change. But that “we” – that reassuring abstraction of humanity and human experience – is not in fact what haunts this book. That “we” – unquestioned, undifferentiated, unmarked – is taken for granted and treated as the universal subject in this book. What truly haunts this book is everything that this “we” obscures and represses. Harcourt’s “we” is remarkably undifferentiated. Nearly every point Harcourt makes about how “we” experience digital subjectivity, surveillance, and exposure would and should be contested by women, people of color, the poor, sexual minorities (and those who belong to more than one of the categories in this non-exhaustive list). It is unfair, of course, to expect any one book or any one author to capture the full complexity of human experience on any topic. One writes about what one knows, and nuance must sometimes be sacrificed for the sake of broad theory. But there is a difference between falling short of conveying the diversity of human experience and barely acknowledging the existence of differentiation. If one of Harcourt’s goals is to lead us to “think of ourselves as we,” it is vital to recognize that  “we” in the digital age are not equally represented, equally consenting or resisting, or equally exposed.

Let’s begin with Harcourt’s characterization of the digital age as a study in shallow positivity: “We do not sing hate, we sing praise. We ‘like,’ we ‘share,’ we ‘favorite.’ We ‘follow.’ We ‘connect.’ ‘We get LinkedIn.’ Ever more options to join and like and appreciate. Everything today is organized around friending, clicking, retweeting, and reposting. … We are appalled by mean comments – which are censored if they are too offensive”(41). This is a picture of the digital world that will be  unrecognizable to many people. There is no mention of online mobs, targeted harassment campaigns, career-destroying defamation, rape and death threats, doxxing, revenge porn, sex trafficking, child porn, online communities dedicated to promoting sexual violence against women, or white supremacist sites. No mention, in short, of the intense, destructive, unrelenting hatred that drives so much of the activity of our connected world. Harcourt’s vision of our digital existence as a sunny safe space where occasional “mean comments” are quickly swept from view is nothing short of extraordinary.

Next, consider Harcourt’s repeated insistence that there are no real distinctions between exposer and exposed, the watcher and the watched: “There is no clean division between those who expose and those who surveil; surveillance of others has become commonplace today, with nude pictures of celebrities circulating as ‘trading fodder’ on the more popular anonymous online message boards, users stalking other users, and videos constantly being posted about other people’s mistakes, accidents, rants, foibles, and prejudices. We tell stories about ourselves and others. We expose ourselves. We watch others” (129). There are, in fact, important divisions between exposers and the exposed. With regard to sexual exposure, it is overwhelmingly the case that women are the subjects and not the agents of exposure. The nude photos to which Harcourt refers weren’t of just any celebrities; they were, with few exceptions, female celebrities. The hacker in that case, as in nearly every other case of nude photo hacking, is male, as is nearly every revenge porn site owner and the majority of revenge porn consumers. The “revenge porn” phenomenon itself, more accurately described as “nonconsensual pornography,” is overwhelmingly driven by men exposing women, not the other way around. Many of Harcourt’s own examples of surveillance point to the gender imbalance at work in sexual exposure. The LOVEINT scandal, the CCTV cameras pointed into girls’ toilets and changing rooms in UK schools (229), and Edward Snowden’s revelations of how NSA employees share naked pictures (230) primarily involve men doing the looking and women and girls being looked at. The consequences of sexual exposure are also not gender-neutral: while men and boys may suffer embarrassment and shame, girls and women suffer these and much more, including being expelled from school, fired from jobs, tormented by unwanted sexual propositions, and threatened with rape.

There are also important distinctions to be made between those who voluntarily expose themselves and those who are exposed against their will. In the passage above, Harcourt puts nude photos in the same list as videos of people’s “rants, foibles, and prejudices.” The footnote to that sentence provides two specific examples: Jennifer Lawrence’s hacked photos and video of Michael Richards (Seinfeld’s Kramer) launching into a racist tirade as he performed at a comedy club (311). That is a disturbing false equivalence. The theft of private information is very different from a public, voluntary display of racist hatred. In addition to the fact that naked photos are in no way comparable to casual references to lynching and the repeated use of racial slurs, it should matter that Jennifer Lawrence was exposed against her will and Michael Richards exposed himself.

It’s not the only time in the book that Harcourt plays a bit fast and loose with the concepts of consent and voluntariness. In many places he criticizes “us” for freely contributing to our own destruction: “There is hardly any need for illicit or surreptitious searches, and there is little need to compel, to pressure, to strong-arm, or to intimidate, because so many of us are giving all our most intimate information and whereabouts so willingly and passionately – so voluntarily” (17). And yet Harcourt also notes that in many cases, people do not know that they are being surveilled or do not feel that they have any practical means of resistance. “The truth is,” Harcourt tells us with regard to the first, “expository power functions best when those who are seen are not entirely conscious of it, or do not always remember. The marketing works best when the targets do not know that they are being watched” (124). On the second point, Harcourt observes that “when we flinch at the disclosure, most of us nevertheless proceed, feeling that we have no choice, not knowing how not to give our information, whom we would talk to, how to get the task done without the exposure. We feel we have no other option but to disclose” (181-2). But surely if people are unaware of a practice or feel they cannot resist it, they can hardly be considered to have voluntarily consented to it.

Also, if people often do not know that they are under surveillance, this undermines one of the more compelling concerns of the book, namely, that surveillance inhibits expression. It is difficult to see how surveillance could have an inhibiting effect if the subjects are not conscious of the fact that they are being watched. Surreptitious surveillance certainly creates its own harms, but if subjects are truly unaware that they are being watched – as opposed to not knowing exactly when or where surveillance is taking place but knowing that it is taking place somewhere somehow, which no doubt does create a chilling effect – then self-censorship is not likely to be one of them.

Harcourt suggests a different kind of harm when he tells us that “[i]nformation is more accessible when the subject forgets that she is being stalked” (124). That is, we are rendered more transparent to the watchers when we falsely believe they are not watching us. That seems right. But what exactly is the harm inflicted by this transparency? Harcourt warns that we are becoming “marketized subjects – or rather subject-objects who are nothing more than watched, tracked, followed, profiled at will, and who in turn do nothing more than watch and observe others” (26). While concerns about Big Data are certainly legitimate (and have been voiced by many scholars, lawyers, policymakers, and activists), Harcourt never paints a clear picture of what he thinks the actual harm of data brokers and targeted Target advertisements really is. In one of the few personal and specific examples he offers of the harms of surveillance, Harcourt describes the experience of being photographed by a security guard before a speaking engagement. Harcourt is clearly unsettled by the experience: “I could not resist. I did not resist. I could not challenge the security protocol. I was embarrassed to challenge it, so I gave in without any resistance. But it still bothers me today. Why? Because I had no control over the dissemination of my own identity, of my face. Because I felt like I had no power to challenge, to assert myself” (222). While one sympathizes with Harcourt’s sense of disempowerment, it is hard to know what to think of it in relation to the sea of other surveillance stories: women forced to flee their homes because of death threats, parents living in fear because the names of their children and the schools they attend have been published online, or teenaged girls committing suicide because the photo of their rape is being circulated on the Internet as a form of entertainment.

Harcourt uses the term “stalk” at least eight times in this book, and none of these references are to actual stalking, the kind that involves being followed by a particular individual who knows where you live and work and means you harm, the kind that one in six women in the U.S. will experience in her lifetime, the kind that is encouraged and facilitated by an ever-expanding industry of software, gadgets, and apps that openly market themselves to angry men as tools of control over the women who have slipped their grasp. What a privilege it is to be able to treat stalking not as a fact of daily existence, but as a metaphor.

Harcourt’s criticism of what he considers to be the Supreme Court’s lack of concern for privacy adds a fascinating gloss to all of this. Harcourt takes particular aim at Justice Scalia, asserting that even when Scalia seems to be protecting privacy, he is actually disparaging it: “Even in Kyllo v. United States…. where the Court finds that the use of heat-seeking technology constitutes a search because it infringes on the intimacies of the home, Justice Scalia mocks the humanist conception of privacy and autonomy.” The proof of this assertion supposedly comes from Scalia’s observation that the technology used in that case “might disclose, for example, at what hour each night the lady of the house takes her daily sauna and bath – a detail that many would consider ‘intimate.’” Harcourt assumes that Scalia’s reference to the “lady of the house” is an ironic expression of contempt. But Scalia is not being ironic. Elsewhere in the opinion, he emphatically states that “[i]n the home… all details are intimate details,” and many of his other opinions reinforce this view. Scalia and many members of the Court are very concerned about privacy precisely when it involves the lady of the house, or the homeowner subjected to the uninvited drug-sniffing dog on the porch (Florida v. Jardines, 2013), or the federal official subjected to the indignity of a drug test (Treasury Employees v. Von Raab, 1989 (dissent)). These same members of the Court, however, are remarkably unconcerned about privacy when it involves a wrongfully arrested man subjected to a humiliating “squat and cough” cavity search (Florence v. Burlington, 2012), a driver searched after being racially profiled (Whren v. United States, 1996), or a pregnant woman tricked into a drug test while seeking prenatal care (Ferguson v. Charleston, 2001 (dissent)). In other words, the problem with the Supreme Court’s views on privacy and surveillance is not that the Court does not care; it is that it tends to care only when privacy affects interests its members share or people they resemble.

The world is full of people who do not have the luxury of worrying about a growing addiction to Candy Crush or whether Target knows they need diapers before they do. They are too busy worrying that their ex-husband will hunt them down and kill them, or that they will be stopped and subjected to a humiliating pat down for the fourth time that day, or that the most private and intimate details of their life will be put on public display by strangers looking to make a buck. These people are not driven by a desire to expose themselves. Rather, they are being driven into hiding, into obscurity, into an inhibited and chilled existence, by people who are trying to expose them. If “we” want to challenge surveillance and fight for privacy, “they” must be included.


The Fragility of Desire

In his excellent new book Exposed, Harcourt offers a seductive analysis of the role of desire in what he calls the “expository society” of the digital age. We are not characters in Orwell’s 1984, or prisoners of Bentham’s Panopticon, but rather enthusiastic participants in a “mirrored glass pavilion” that is addictive and mesmerizing. Harcourt offers a detailed picture of this pavilion and also shows us the seamy side of our addiction to it. Recovery from this addiction, he argues, requires acts of disobedience, but therein lies the great dilemma and paradox of our age: revolution requires desire, not duty, yet our desires are what have ensnared us.

I think that this is both a welcome contribution and a misleading diagnosis.

There have been many critiques of consent-based privacy regimes as enabling surveillance rather than protecting privacy. The underlying tenor of many of these critiques is that consent fails as a regulatory tool because it is too difficult to make consent truly informed. Harcourt’s emphasis on desire shows why there is a deeper problem than this: our participation in the platforms that surveil us is rooted in something deeper than misinformed choice. And this makes the “what to do?” question all the more difficult to answer. Even those of us who see a stronger role for law than Harcourt outlines in this book (I agree with Ann Bartow’s comments on this) should pause here. Canada, for example, has strong private sector data protection laws with oversight from excellent provincial and federal privacy commissioners. And yet these laws are heavily consent-based. Such laws are able to shift practices to a stronger emphasis on things like opt-in consent, but Harcourt leaves us with a disquieting sense that this might be just an example of a Pyrrhic victory, legitimizing surveillance through our attempts to regulate it because we still have not grappled with the more basic problem of the seduction of the mirrored glass pavilion.

The problem with Harcourt’s position is that, in exposing this aspect of the digital age in order to complicate our standard surveillance tropes, he risks ignoring other sources of complexity that are also important for both diagnosing the problem and outlining a path forward.

Desire is not always the reason that people participate in new technologies. As Ann Bartow and Olivier Sylvain point out, people do not always have a choice about their participation in the technologies that track them. The digital age is not an amusement park we can choose to visit or to boycott; it is deeply integrated into our daily practices and needs, including the ways in which we work, bank, and access government services.

But even when we do actively choose to use these tools, it is not clear that desire captures the why of all such choices. If we willingly enter Harcourt’s mirrored glass pavilion, it is sometimes because of some of its very useful properties — the space- and time-bending nature of information technology. For example, Google Calendar is incredibly convenient because multiple people can access shared calendars from multiple devices in multiple locations at different times, making the coordination of schedules remarkably easy. This is not digital lust, but digital convenience.

These space- and time-bending properties of information technology are important for understanding the contours of the public/private nexus of surveillance that so characterizes our age. Harcourt does an excellent job at pointing out some of the salient features of this nexus, describing a “tentacular oligarchy” where private and public institutions are bound together in state-like “knots of power,” with individuals passing back and forth between these institutions. But what is strange in Harcourt’s account is that this tentacular oligarchy still appears to be bounded by the political borders of the US. It is within those borders that the state and the private sector have collapsed together.

What this account misses is the fact that information technology has helped to unleash a global private sector that is not bounded by state borders. In this emerging global private sector, large multinational corporations often operate as “metanationals” or stateless entities. The commercial logic of information is that it should cross political borders with ease and be stored wherever it makes the most economic sense.

Consider some of the rhetoric surrounding the e-commerce chapter of the recent TPP agreement. The Office of the US Trade Representative indicates that one of its objectives is to keep the Internet “free and open,” an objective it has pursued through rules that favour cross-border data flows and prevent data localization. It is easy to see how this idea of “free” might be confused with political freedom, for an activist in an oppressive regime is better able to exercise freedom of speech when that speech can cross political borders or when the details of their communications can be stored in a location beyond the reach of their state. A similar rationale has been offered by some in the current Apple encryption debate — encryption protects American business people communicating within China, and we can see why that is important.

But this idea of freedom is the freedom of a participant in a global private sector with weak state control; freedom from the state control of oppressive regimes also involves freedom from the state protection of democratic regimes.

If metanationals pursue a state-free agenda, the state pursues an agenda of rights-protectionism. By rights protectionism I mean the claim that states do, and should, protect the constitutional rights of their own citizens and residents but not others. Consider, for example, a Canadian citizen who resides in Canada and uses US cloud computing. That person could be communicating entirely with other Canadians in other Canadian cities and yet have all of their data stored in the US-based cloud. If the US authorities wanted access to that data, the US constitution would not apply to regulate that access in a rights-protecting manner because the Canadian is a non-US person.

Many see this result as flowing from the logic of the Verdugo-Urquidez case. Yet that case concerned a search that occurred in a foreign territory (Mexico), rather than within the US, where the law of that territory continued to apply. The Canadian constitution does not apply to acts of officials within the US. The data at issue falls into a constitutional black hole where no constitution applies (and perhaps even an international human rights black hole, according to some US interpretations of extraterritorial obligations). States can then collect information within this black hole free of the usual liberal-democratic constraints and share it with other allies, a situation Snowden documented within the EU and likened to a “European bazaar” of surveillance.

Rights protectionism is not rights protection when information freely crosses political boundaries and state power piggybacks on top of this crossing and exploits it.

This is not a tentacular oligarchy operating within the boundaries of one state, but a series of global alliances – between allied states and between states and metanationals who exert state-like power — exploiting the weaknesses of state-bound law.

We are not in this situation simply because of a penchant for selfies. But to understand the full picture we do need to look beyond “ourselves” and get the global picture in view. We need to understand the ways in which our legal models fail to address these new realities and even help to mask and legitimize the problems of the digital age through tools and rhetoric that are no longer suitable.

Lisa Austin is an Associate Professor at the University of Toronto Faculty of Law.


Surveillance and Our Addiction to Exposure

Bernard Harcourt’s Exposed: Desire and Disobedience in the Digital Age (Harvard University Press 2015) is an indictment of our contemporary age of surveillance and exposure — what Harcourt calls “the expository society.” Harcourt passionately deconstructs modern technology-infused society and explains its dark implications with an almost poetic eloquence.

Harcourt begins by critiquing the metaphor of George Orwell’s 1984 to describe the ills of our world today.  In my own previous work, I critiqued this metaphor, arguing that Kafka’s The Trial was a more apt metaphor to capture the powerlessness and vulnerability that people experience as government and businesses construct and use “digital dossiers” about their lives.  Harcourt critiques Orwell in a different manner, arguing that Orwell’s dystopian vision is inapt because it is too drab and gray:

No, we do not live in a drab Orwellian world.  We live in a beautiful, colorful, stimulating, digital world that is online, plugged in, wired, and Wi-Fi enabled.  A rich, bright, vibrant world full of passion and jouissance–and by means of which we reveal ourselves and make ourselves virtually transparent to surveillance.  In the end, Orwell’s novel is indeed prescient in many ways, but jarringly off on this one key point.  (pp. 52-53)


Orwell’s Vision


Life Today

Harcourt notes that the “technologies that end up facilitating surveillance are the very technologies we crave.” We desire them, but “we have become, slowly but surely, enslaved to them.” (p. 52)

Harcourt’s book reminds me of Neil Postman’s Amusing Ourselves to Death, originally published about 30 years ago — back in 1985.  Postman also critiqued Orwell’s metaphor and argued that Aldous Huxley’s Brave New World was a more apt metaphor to capture the problematic effects new media technologies were having on society.

Read More


How CalECPA Improves on its Federal Namesake

Last week, Governor Brown signed the landmark California Electronic Communications Privacy Act (CalECPA) into law, updating California privacy law for modern communications. Compared to ECPA, CalECPA requires warrants for more types of investigations, places tighter restrictions on those warrants, provides more notice to targets, and furnishes as remedies both court-ordered data deletion and statutory suppression. Moreover, CalECPA’s approach is comprehensive and uniform, eschewing the often irrational distinctions that have made ECPA one of the most confusing and under-protective privacy statutes of the Internet era.

Extended Scope, Enhanced Protections, and Simplified Provisions

CalECPA regulates investigative methods that ECPA did not anticipate. Under CalECPA, government entities in California must obtain a warrant based on probable cause before they may access electronic communications contents and metadata from service providers or from devices. ECPA makes no mention of device-stored data, even though law enforcement agents increasingly use StingRays to obtain information directly from cell phones. CalECPA subjects such techniques to its warrant requirement. While the Supreme Court’s recent decision in Riley v. California required that agents either obtain a warrant or rely on an exception to the warrant requirement to search a cell phone incident to arrest, CalECPA requires a warrant for physical access to any device, not just a cell phone, that “stores, generates, or transmits electronic information in electronic form.” CalECPA clearly defines the exceptions to the warrant requirement by specifying what counts as an emergency, who can give consent to the search of a device, and related questions.

ECPA’s 1986-drafted text only arguably covers the compelled disclosure of location data stored by a service provider, and it does not clearly require a warrant for such investigations. CalECPA explicitly includes location data in the “electronic communication information” that is subject to the warrant requirement when a government entity accesses it from either a device or a service provider (broadly defined). ECPA makes no mention of location data gathered in real time or prospectively, but CalECPA requires a warrant both for those investigations and for stored data investigations. Whenever a government entity compels “the production of or access to” location information, including GPS data, from a service provider or from a device, CalECPA requires a warrant.

Read More


5 Great Novels About Privacy and Security

I am a lover of literature (I teach a class in law and literature), and I also love privacy and security, so I thought I’d list some of my favorite novels about privacy and security.

I’m also trying to compile a more comprehensive list of literary works about privacy and security, and I welcome your suggestions.

Without further ado, my list:

Franz Kafka, The Trial

Kafka’s The Trial begins with a man being arrested but not told why. In typical Kafka fashion, the novel begins badly for the protagonist . . . and then it gets worse! A clandestine court system has compiled a dossier about him and officials are making decisions about him, but he is left in the dark. This is akin to how Big Data can operate today. The Trial captures the sense of helplessness, frustration, and powerlessness when large institutions with inscrutable purposes use personal data and deny people the right to participate. I wrote more extensively about how Kafka is an apt metaphor for privacy in our times in a book called The Digital Person about 10 years ago.



Read More

The Black Box Society: Interviews

My book, The Black Box Society, is finally out! In addition to the interview Lawrence Joseph conducted in the fall, I’ve been fortunate to complete some radio and magazine interviews on the book. They include:

New Books in Law

Stanford Center for Internet & Society: Hearsay Culture

Canadian Broadcasting Corporation: The Spark

Texas Public Radio: The Source

WNYC: Brian Lehrer Show.

Fleishman-Hillard’s True.

I hope to be back to posting soon, on some of the constitutional and politico-economic themes in the book.


On Privacy, Free Speech, & Related Matters – Richard Posner vs David Cole & Others

I’m exaggerating a little, but I think privacy is primarily wanted by people because they want to conceal information to fool others. — Richard Posner

Privacy is overrated. — Richard Posner (2013)

Much of what passes for the name of privacy is really just trying to conceal the disreputable parts of your conduct. Privacy is mainly about trying to improve your social and business opportunities by concealing the sorts of bad activities that would cause other people not to want to deal with you. — Richard Posner (2014)

This is the seventh installment in the “Posner on Posner” series of posts on Seventh Circuit Judge Richard Posner. The first installment can be found here, the second here, the third here, the fourth here, the fifth here, and the sixth one here.

Privacy has been on Richard Posner’s mind for more than three-and-a-half decades. His views, as evidenced by the epigraph quotes above, have sparked debate in a variety of quarters, academic and policy alike. In some ways those views seem oddly consistent with his persona: on the one hand, he is a very public man, as revealed by his many writings; on the other hand, he is a very private man, and we know little of his life outside the law save for a New Yorker profile of him thirteen years ago.

On the scholarly side of the privacy divide, his writings include:

  1. “The Right of Privacy,” 12 Georgia Law Review 393 (1978)
  2. “Privacy, Secrecy, and Reputation,” 28 Buffalo Law Review 1 (1979)
  3. “The Uncertain Protection of Privacy by the Supreme Court,” 1979 Supreme Court Review 173
  4. “The Economics of Privacy,” 71 The American Economic Review 405 (1981)
  5. “Privacy,” Big Think (video clip, n.d.)
  6. “Privacy is Overrated,” New York Daily News, April 28, 2014

For a sampling of Judge Posner’s opinions on privacy, go here (and search “Privacy”).

(Note: Some links will only open in Firefox or Chrome.)


Privacy – “What’s the big deal?”

Privacy interests should really have very little weight when you’re talking about national security. The world is in an extremely turbulent state – very dangerous. — Richard Posner (2014)

Recently, Georgetown Law Center held a conference entitled “Cybercrime 2020: The Future of Online Crime and Investigations” (full C-SPAN video here). In the course of that event, Judge Posner joined with others in government, private industry, and in the legal academy to discuss privacy, the Fourth Amendment, and free speech, among other things. A portion of the exchange between Judge Posner and Georgetown law professor David Cole was captured on video.

Judge Richard Posner

Scene: The Judge sitting in his office, speaking into a video conference camera — As he rubbed his fingers across the page and looked down, Posner began: “I was thinking, listening to Professor Cole, what exactly is the information that he’s worried about?” Posner paused, as if to set up his next point: “I have a cell phone – iPhone 6 – so if someone drained my cell phone, they would find a picture of my cat [laughter], some phone numbers, some e-mail addresses, some e-mail texts – so what’s the big deal?”

He then glanced up from the text he appeared to be reading and spoke with a grin: “Other people must have really exciting stuff. [laughter] Could they narrate their adulteries or something like that?” [laughter] He then waved his hands in the air before posing a question to the Georgetown Professor.

“What is it that you’re worrying about?” Posner asked as if truly puzzled.

At that point, Cole leaned into his microphone and looked up at the video screen bearing the Judge’s image next to case reports on his left and the American flag on his right.

Cole: “That’s a great question, Judge Posner.”

Professor Cole continued, adding his own humor to the mix: “And I, like you, have only pictures of cats on my phone. [laughter] And I’m not worried about anything from myself, but I’m worried for others.”

On a more substantive note, Cole added: “Your question, which goes back to your original statement, . . . value[s] . . . privacy unless you have something to hide. That is a very, very shortsighted way of thinking about the value [of privacy]. I agree with Michael Dreeben: Privacy is critical to a democracy; it is critical to political freedom; [and] it is critical to intimacy.”

The sex video hypothetical

And then with a sparkle in his spectacled eye, Cole stated: “Your question brings to mind a cartoon that was in the New Yorker, just in the last couple of issues, where a couple is sitting in bed and they have video surveillance cameras over each one of them trained down on the bed [Cole holds his hands above his head to illustrate the peering cameras]. And the wife says to the husband: ‘What are you worried about if you’ve got nothing to hide, you’ve got nothing to fear.’”

Using the cartoon as his conceptual springboard, Cole moved on to his main point: “It seems to me that all of us, whether we are engaged in entirely cat-loving behavior, or whether we are going to psychiatrists, or abortion providers, or rape crisis centers, or Alcoholics Anonymous, or have an affair – all of us have something to hide. Even if you don’t have anything to hide, if you live a life that could be entirely transparent to the rest of the world, I still think the value of that life would be significantly diminished if it had to be transparent.”

Without missing a beat, Cole circled back to his video theme: “Again you could say, ‘if you’ve got nothing to hide, and you’re not engaged in criminal activity, let’s put video cameras in every person’s bedroom. And let’s just record the video, 24/7, in their bedroom. And we won’t look at it until we have reason to look at it. You shouldn’t be concerned because . . .’”

At this point, Posner interrupted: “Look, that’s a silly argument.”

Cole: “But it’s based on a New Yorker cartoon.”

The Judge was a tad miffed; he waved his right hand up and down in a dismissive way: “The sex video, that’s silly!” Waving his index finger to emphasize his point, he added: “What you should be saying, [what] you should be worried about [are] the types of revelation[s] of private conduct [that] discourage people from doing constructive things. You mentioned Alcoholics Anonymous . . .”

Cole: “I find sex to be a constructive thing.”

Obviously frustrated, Posner raised his palms up high in protest: “Let me finish, will you please?”

Cole: “Sure.”

Posner: “Look, that was a good example, right? Because you can have a person who has an alcohol problem, and so he goes to Alcoholics Anonymous, but he doesn’t want this to be known. If he can’t protect that secret,” Posner continued while pointing, “then he’s not going to go to Alcoholics Anonymous. That’s gonna be bad. That’s the sort of thing you should be concerned about rather than with sex videos. . . . [The Alcoholics Anonymous example] is a good example of the kind of privacy that should be protected.”

Professor David Cole

Privacy & Politics 

Meanwhile, the audience listened and watched on with its attention now fixed on the Georgetown professor.

Cole: “Well, let me give you an example of sex privacy. I think we all have an interest in keeping our sex lives private. That’s why we close doors into our bedroom, etc. I think that’s a legitimate interest, and it’s a legitimate concern. And it’s not because you have something wrong you want to hide, but because intimacy requires privacy, number one. And number two: think about the government’s use of sex information with respect to Dr. Martin Luther King. They investigated him, intruded on his privacy by bugging his hotel rooms to learn [about his] affair, and then sought to use that – and the threat of disclosing that affair – to change his behavior. Why? Because he was an active, political, dissident fighting for justice.”

“We have a history of that,” he added. “Our country has a history of that; most countries have a history of that; and that’s another reason the government will use information – that doesn’t necessarily concern [it] – to target people who [it is] concerned about . . . – not just because of their alcohol problem [or] not just because of their sexual proclivities – but because they have political views and political ideas that the government doesn’t approve of.”

At this point the moderator invited the Judge to respond.

Posner: “What happened to cell phones? Do you have sex photos on your cell phones?”

Cole: “I imagine if Dr. Martin Luther King was having an affair in 2014, as opposed to the 1960s, his cell phone, his smart phone, would have quite a bit of evidence that would lead the government to that affair. He’d have call logs; he might have texts; he might have e-mails – all of that would be on the phone.”

The discussion then moved onto the other panelists.

Afterwards, and writing on the Volokh Conspiracy blog, Professor Orin Kerr, who was one of the participants in the conference, summed up his views of the exchange this way:

“I score this Cole 1, Posner 0.”

The First Amendment — Enter Glenn Greenwald

Read More


The Flawed Foundations of Article III Standing in Surveillance Cases (Part IV)

In my first three posts, I’ve opened a critical discussion of Article III standing for plaintiffs challenging government surveillance programs by introducing the 1972 Supreme Court case of Laird v. Tatum. In today’s post, I’ll examine the Court’s decision itself, which held that chilling effects arising “merely from the individual’s knowledge” of likely government surveillance did not constitute adequate injury to meet Article III standing requirements.

The Burger Court

It didn’t take long for courts to embrace Laird as a useful tool to dismiss cases where plaintiffs sought to challenge government surveillance programs, especially where the complaints rested on a First Amendment chill from political profiling by law enforcement. Some judges took exception to a broad interpretation of Laird, but objections largely showed up in dissenting opinions. For the most part, early interpretations of Laird sympathized with the government’s view of surveillance claims.

Read More


The Flawed Foundations of Article III Standing in Surveillance Cases (Part I)

I’m grateful for the opportunity to be a Concurring Opinions guest blogger this month. My posts will largely concentrate on the history of Article III standing for plaintiffs seeking to challenge government surveillance programs, and the flawed foundations upon which our federal standing jurisprudence rests. 


Then-Secretary of Defense Melvin Laird Sharing a Light Moment With President Nixon (Wikimedia Commons)

Plaintiffs seeking to challenge government surveillance programs have faced long odds in federal courts, due mainly to a line of Supreme Court cases that have set a very high bar to Article III standing in these cases. The origins of this jurisprudence can be directly traced to Laird v. Tatum, a 1972 case where the Supreme Court considered the question of who could sue the government over a surveillance program, holding in a 5-4 decision that chilling effects arising “merely from the individual’s knowledge” of likely government surveillance did not constitute adequate injury to meet Article III standing requirements. Federal courts have since relied upon Laird to deny standing to plaintiffs in surveillance cases, including the 2013 Supreme Court decision in Clapper v. Amnesty Int’l USA. But the facts behind Laird illuminate a number of important reasons why it is a weak basis for surveillance standing doctrine. It is therefore a worthwhile endeavor, I think, to reexamine Laird in a post-Snowden context in order to gain a deeper understanding of the Court’s flawed standing doctrine in surveillance cases.

Read More