
FAN (First Amendment News, Special Series #2) FBI to Continue Working with Hackers to Fight Terrorism . . . & Crime?


The F.B.I. defended its hiring of a third party to break into an iPhone used by a gunman in last year’s San Bernardino, Calif., mass shooting, telling some skeptical lawmakers on Tuesday that it needed to join with partners in the rarefied world of for-profit hackers as technology companies increasingly resist their demands for consumer information. — New York Times, April 19, 2016

__________________

This is the second FAN installment concerning the ongoing controversy over national security and cell-phone privacy. As with the first installment, the legal focus here is on First Amendment issues. It is against that backdrop that the Newseum Institute in Washington, D.C. will host a public event on June 15, 2016.

I am pleased to be working with Gene Policinski (the chief operating officer of the Newseum Institute) and Nan Mooney (a D.C. lawyer and former law clerk to Chief Judge James Baker of the U.S. Court of Appeals for the Armed Forces) in organizing the event.

Information concerning that upcoming event is set out below, but first a few news items.

Recent News Items

“FBI Director James Comey said the U.S. paid more than he will make in salary over the rest of his term to secure a hacking tool to break into a mobile phone used by a dead terrorist in the San Bernardino . . . . The law enforcement agency paid ‘more than I will make in the remainder of this job, which is 7 years and 4 months,’ Comey said . . . at the Aspen Security Forum in London. . . . Comey’s pay this year is $185,100, according to federal salary tables, indicating the tool cost the agency more than $1.3 million. FBI directors are appointed to 10-year terms.”
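For perspective, here is a quick back-of-the-envelope check of the reported figure, sketched below. The salary and remaining term come from the quote above; the straight pro-rating of an annual salary is my own simplifying assumption, so the result is only a rough floor.

```python
# Rough check of the implied cost of the iPhone hacking tool.
# Inputs come from the quote above; the simple pro-rating is an assumption.
annual_salary = 185_100            # Comey's reported pay for the year
remaining_term_years = 7 + 4 / 12  # "7 years and 4 months" left in his term

implied_cost_floor = annual_salary * remaining_term_years
print(f"Implied minimum cost: ${implied_cost_floor:,.0f}")
# Prints roughly $1,357,000, consistent with the "more than $1.3 million" estimate.
```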

“[Ms. Amy Hess, the Federal Bureau of Investigation’s executive assistant director for science and technology,] did not answer directly when asked about whether there were ethical issues in using third-party hackers but said the bureau needed to review its operation ‘to make sure that we identify the risks and benefits.’ The F.B.I. has been unwilling to say whom it paid to demonstrate a way around the iPhone’s internal defenses, or how much, and it has not shown Apple the technique.”

“Bruce Sewell, Apple’s general counsel, told a House commerce oversight subcommittee that the company already works with law enforcement regularly and would help develop the FBI’s capability to decrypt technology itself, but won’t open ‘back doors’ to its iPhones due to the security risk that would pose to all users. . . . What the FBI wants, Hess said, is ‘that when we present an order, signed by an independent federal judge, that (tech companies) comply with that order and provide us with the information in readable form.’ How they do that is up to them, she said.”

“The leaders of the Senate Intelligence Committee have introduced a bill that would mandate those receiving a court order in an encryption case to provide ‘intelligible information or data’ or the ‘technical means to get it’ — in other words, a key to unlock secured data. ‘I call it a “follow the rule of law bill,” because that’s what it does: It says nobody’s exempt from a court order issued by a judge on the bench,’ said Committee Chairman Richard Burr, a North Carolina Republican. The top Democrat on the committee, California’s Dianne Feinstein, is a co-sponsor.”

Senate Bill Introduced

Here are a few excerpts from the proposed Senate Bill:

(1) GENERAL. Notwithstanding any other provision of law and except as provided in paragraph (2), a covered entity that receives a court order from a government for information or data shall —

(A) provide such information or data to such government in an intelligible format; or

(B) provide such technical assistance as is necessary to obtain such information or data in an intelligible format or to achieve the purpose of the court order.

(2) SCOPE OF REQUIREMENT. A covered entity that receives a court order referred to in paragraph (1)(A) shall be responsible only for providing data in an intelligible format if such data has been made unintelligible by a feature, product, or service owned, controlled, created, or provided, by the covered entity or by a third party on behalf of the covered entity.

(3) COMPENSATION FOR TECHNICAL ASSISTANCE. . . .

(b) DESIGN LIMITATIONS. Nothing in this Act shall be construed to authorize any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity.

(4) DEFINITIONS . . . .

Non-Terrorist Crimes & Demands for Cell-Phone Access

Upcoming: Newseum Institute Moot Court Event


FAN (First Amendment News, Special Series) Newseum Institute to Host Event on Cell Phone Privacy vs National Security Controversy


Starting today and continuing through mid-June, I will post a special series of occasional blog posts related to the Apple iPhone national security controversy and the debate that continues even after the FBI gained access to the phone used by the terrorist gunman in the December 2015 shooting in San Bernardino, California.


This special series is done in conjunction with the Newseum Institute and a major program the Institute will host on June 15, 2016 in Washington, D.C.

I am pleased to be working with Gene Policinski (the chief operating officer of the Newseum Institute) and Nan Mooney (a D.C. lawyer and former law clerk to Chief Judge James Baker of the U.S. Court of Appeals for the Armed Forces) in organizing the event.

The June 15th event will be a moot court argued before a seven-member bench of Justices, with two counsel on each side. The focus will be on the First Amendment issues raised in the case. (See below for links to the relevant legal documents.)

→ Save the Date: Wednesday, June 15, 2016 @ 2:00 p.m., Newseum, Washington, D.C. (more info forthcoming).

The Apple-FBI clash was the first significant skirmish — and probably not much more than that — of the Digital Age conflicts we’re going to see in this century around First Amendment freedoms, privacy, data aggregation and use, and even the extent of religious liberty. As much as the eventual outcome, we need to get the tone right, from the start — freedom over simple fear. –Gene Policinski

Newseum Institute Moot Court Event

It remains a priority for the government to ensure that law enforcement can obtain crucial digital information to protect national security and public safety, either with cooperation from relevant parties, or through the court system when cooperation fails. — Melanie Newman (spokeswoman for the Justice Department, 3-28-16)

As of this date, the following people have kindly agreed to participate as Justices for a seven-member Court:

The following two lawyers have kindly agreed to serve as the counsel (2 of 4) who will argue the matter:

→ Two additional Counsel to be selected.  

Nan Mooney and I will say more about both the controversy and the upcoming event in the weeks ahead in a series of special editions of FAN. Meanwhile, below is some relevant information, which will be updated regularly.

Apple vs FBI Director James Comey

President Obama’s Statement

Congressional Hearing

Documents


Last Court Hearing: 22 March 2016, before Judge Sheri Pym

Podcast

Video

News Stories & Op-Eds


  1. Pierre Thomas & Mike Levine, “How the FBI Cracked the iPhone Encryption and Averted a Legal Showdown With Apple,” ABC News, March 29, 2016
  2. Bruce Schneier, “Your iPhone just got less secure. Blame the FBI,” Washington Post, March 29, 2016
  3. Katie Benner & Eric Lichtblau, “U.S. Says It Has Unlocked Phone Without Help From Apple,” New York Times, March 28, 2016
  4. John Markoff, Katie Benner & Brian Chen, “Apple Encryption Engineers, if Ordered to Unlock iPhone, Might Resist,” New York Times, March 17, 2016
  5. Jesse Jackson, “Apple Is on the Side of Civil Rights,” Time, March 17, 2016
  6. Katie Benner & Eric Lichtblau, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” New York Times, March 15, 2016
  7. Kim Zetter, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” Wired, March 15, 2016
  8. Alina Selyukh, “Apple On FBI iPhone Request: ‘The Founders Would Be Appalled,’” NPR, March 15, 2016
  9. Howard Mintz, “Apple takes last shot at FBI’s case in iPhone battle,” San Jose Mercury News, March 15, 2016
  10. Russell Brandom & Colin Lecher, “Apple says the Justice Department is using the law as an ‘all-powerful magic wand‘,” The Verge, March 15, 2016
  11. Adam Segal & Alex Grigsby, “3 ways to break the Apple-FBI encryption deadlock,” Washington Post, March 14, 2016
  12. Seung Lee, “Former White House Official Says NSA Could Have Cracked Apple-FBI iPhone Already,” Newsweek, March 14, 2016
  13. Tim Bajarin, “The FBI’s Fight With Apple Could Backfire,” PC, March 14, 2016
  14. Alina Selyukh, “U.S. Attorneys Respond To Apple In Court, Call Privacy Concerns ‘A Diversion’,” NPR, March 10, 2016
  15. Dan Levine, “San Bernardino victims to oppose Apple on iPhone encryption,” Reuters, Feb. 22, 2016
  16. “Apple, The FBI And iPhone Encryption: A Look At What’s At Stake,” NPR, Feb. 17, 2016

FAN 102 (First Amendment News) Len Niehoff on Hulk Hogan’s $140.1M Award Against Gawker

The magnitude of Hogan’s $100 million damage claim could have a serious chilling effect on all media who report on public figures and their lifestyles. — Len Niehoff (3-16-16)

Will there be a chilling effect on journalists? I hope not. I guess editors will have to address that. — Erwin Chemerinsky (3-21-16)


Recently, a Florida jury rendered a $115 million verdict (YouTube video here) against Gawker in connection with its 2012 posting of a snippet of a video of Hulk Hogan (Terry G. Bollea) having sex with a friend’s wife. Subsequently, that jury awarded an additional $25.1 million in punitive damages. Gawker has said it will appeal.

The controversy arose when Gawker posted a 13-year-old, secretly recorded sex video involving Mr. Hogan. He sued and prevailed on claims of invasion of privacy, intentional infliction of emotional distress, and economic harm.

Given the verdict, I invited Len Niehoff (professor at the University of Michigan Law School and of counsel at Honigman Miller Schwartz & Cohn) to comment on the $140.1 million award against Gawker and the First Amendment issues it raises.

* * * * 

Last Friday, a Florida jury awarded Hulk Hogan $115 million in damages against Gawker based upon its publication of a brief and grainy videotape of the former professional wrestler having sex. That verdict exceeded the $100 million requested by Hogan and was purportedly compensatory, although the punitive message was tough to miss. A few days later the jury added $25 million more in formally punitive damages, which seems redundantly oppressive if not, so to speak, orgiastic.

The extravagance of the verdict is a problem unto itself. The evidence presented at trial seems wholly inadequate to yield such a number. And such outsized verdicts raise grave concerns when they come in speech cases. As the Supreme Court observed in New York Times Co. v. Sullivan (1964), substantial damage awards can chill speech just as effectively as a criminal prosecution, casting a “pall of fear and timidity” over free expression. In Sullivan, the Court observed that the libel damage award at issue there was 100 times greater than the penalty imposed under the much-maligned Sedition Act. The verdict in question here, based on true speech, is about 28,000 times greater.

Apart from damages, the finding of liability is itself worrisome. In Snyder v. Phelps (2011), the Supreme Court held that the First Amendment barred invasion of privacy claims brought by a significantly more sympathetic plaintiff than Hulk Hogan. There, the father of a deceased soldier sued the Westboro Baptist Church for picketing and displaying offensive signs near his son’s funeral. The plaintiff advanced a variety of claims, including invasion of privacy. The jury awarded millions of dollars in damages to the plaintiff but the Supreme Court reversed, at various points in its opinion framing the relevant inquiry in two different ways.


In one portion of its opinion, the Court suggests that the test is whether the speech was of “only private concern.” The Court cited a case involving an individual’s credit report, which had been sent to a limited number of subscribers who were bound not to disseminate it. The Court noted that the publication in question there was of interest “solely” to the speaker and a specified audience.

If this is the test then Gawker clearly prevails. Prior to Gawker’s publication of the tape, Hulk Hogan had widely disseminated stories about his sexual exploits and they had become a matter of public discussion. These facts make it difficult (if not impossible) to argue that Hogan’s sexual escapades were “only” or “solely” of interest to him and a small collection of intimates.

In another portion of the opinion, the Court suggests that the test is whether the speech “can be fairly considered as relating to any matter of political, social, or other concern to the community.” The Court stressed that this is a highly contextual inquiry and that the “inappropriate or controversial character” of the speech is “irrelevant.”

 Hogan’s case presents a closer question under this standard but it is important to understand why. Let’s assume that Gawker had published a story describing Hogan’s sexual activities without showing the tape. Under those circumstances, it seems clear that Gawker’s conduct would pass the test. Gawker would simply have conveyed facts that had become a matter of public interest and on which a number of media entities had reported—and continue to report. Gawker would have done what the media have done for years: talk about the noteworthy sex life of a public figure.

What makes this case a closer one is Gawker’s decision to show the tape itself. This is almost certainly what outraged the jury. And it is not an irrelevant consideration—indeed, in Snyder the Supreme Court suggests that the “form” of the speech can matter. But should the distinction between describing and showing make a difference in this particular case? I am skeptical, for two primary reasons.

Last week’s jury verdict awarding Hulk Hogan $115 million had onlookers predicting the death of Gawker Media . . . . — Kaja Sadowski, USA Today, March 21, 2016

First, this distinction carries with it the risk that we will punish speech because it was conveyed in a particularly powerful form. The jury that was outraged over the tape might have greeted with relative indifference a Gawker report describing the same events. The video evokes a stronger, and potentially unreasoned, response. As media law scholar Jane Kirtley noted in a recent New York Times op-ed, the jury may well have thought to itself: “That could be my daughter, or my grandson. Or me.” But, of course, the jury would not want Gawker to report descriptively on those things, either. In other words, we need to ensure that uniquely compelling speech does not receive less protection because of its capacity to prompt us to ask the wrong questions.


Second, where form does seem to make a difference that difference will often lie in substantially greater and more invasive detail. Say, hypothetically, that a presidential candidate who has been described as having small hands wants to dispel any implications about the size of his penis. The candidate publicly offers a vague “guarantee” that there is “no problem” in this respect. Reporting on these events certainly raises no privacy concern. But we would likely feel differently about the broadcast of a purloined security video that showed the candidate in a restroom and provided definitive data.

In contrast, consider the hypothetical author of a memoir that offers detailed descriptions of his or her many sexual encounters. A report on these events would, again, raise no privacy concerns. But, here, we might also conclude that a videotape of the same events did not constitute an invasion of privacy, given the level of specificity that the author already shared with us. An argument can be made that the Hogan case is much closer to this hypothetical than to the prior one.

What’s next? The damage award will likely be reduced and a settlement may emerge. Or, perhaps, an appellate court will reverse. There is, after all, a compelling argument that Hogan cannot object to further publicity about his time in the sexual limelight having, well, “thrust himself” there.

* *  *

A top Gawker Media executive [Heather Dietrick, Gawker Media’s president and general counsel] says the company expects a jury’s multi-million dollar award in a sex video case will be overturned by an appeals court. — ABC News, March 21, 2016

* *  *

Commentaries 

Georgetown Appellate Litigation Clinic Files Brief in 1-A Retaliation Case


Irresistible Surveillance?

Bernard Harcourt’s Exposed: Desire and Disobedience in the Digital Age offers many intriguing insights into how power circulates in contemporary society.  The book’s central contribution, as I see it, is to complicate the standard model of surveillance by introducing the surveilled’s agency into the picture.  Exposed highlights the extent to which ordinary people are complicit in regimes of data-monitoring and data-mining that damage their individual personhood and the democratic system.  Millions upon millions of “digital subjects,” Harcourt explains, have come to embrace forms of exposure that commoditize their own privacy.  Sometimes people do this because they want more convenience when they navigate capitalist culture or government bureaucracies.  Or because they want better book recommendations from Amazon.  Other times, people wish to see and be seen online—increasingly feel they need to be seen online—in order to lead successful social and professional lives.

So complicit are we in the erosion of our collective privacy, Harcourt suggests, that any theory of the “surveillance state” or the “surveillance industrial complex” that fails to account for these decentralized dynamics of exhibition, spectacle, voyeurism, and play will misdiagnose our situation.  Harcourt aligns himself at times with some of the most provocative critics of intelligence agencies like the NSA and companies like Facebook.  Yet the emphasis he places on personal desire and participatory disclosure belies any Manichean notion of rogue institutions preying upon ignorant citizens.  His diagnosis of how we’ve lost our privacy is more complex, ethically and practically, in that it forces attention on the ways in which our current situation is jointly created by bottom-up processes of self-exposure as well as by top-down processes of supervision and control.

Thus, when Harcourt writes in the introduction that “[t]here is no conspiracy here, nothing untoward,” what might seem like a throwaway line is instead an important descriptive and normative position he is staking out about the nature of the surveillance problem.  Exposed calls on critics of digital surveillance to adopt a broader analytic lens and a more nuanced understanding of causation, power, and responsibility.  Harcourt in this way opens up fruitful lines of inquiry while also, I think, opening himself up to the charge of victim-blaming insofar as he minimizes the social and technological forces that limit people’s capacity to change their digital circumstances.

The place of desire in “the expository society,” Harcourt shows, requires rethinking of our metaphors for surveillance, discipline, and loss of privacy.  Exposed unfolds as a series of investigations into the images and tropes we conventionally rely on to crystallize the nature of the threat we face: Big Brother, the Panopticon, the Surveillance State, and so forth.  In each case, Harcourt provides an erudite and sympathetic treatment of the ways in which these metaphors speak to our predicament.  Yet in each case, he finds them ultimately wanting.  For instance, after the Snowden disclosures began to garner headlines, many turned to George Orwell’s novel 1984 to help make sense of the NSA’s activities.  Book sales skyrocketed.  Harcourt, however, finds the Big Brother metaphor to be misleading in critical respects.  As he reminds us, Big Brother sought to wear down the citizens of Oceania, neutralize their passions, fill them with hate.  “Today, by contrast, everything functions by means of ‘likes,’ ‘shares,’ ‘favorites,’ ‘friending,’ and ‘following.’  The drab blue uniform and grim gray walls in 1984 have been replaced by the iPhone 5C in all its radiant colors . . . .”  We are in a new condition, a new paradigm, and we need a new language to negotiate it.

Harcourt then considers a metaphor of his own devising: the “mirrored glass pavilion.”  This metaphor is meant to evoke a sleek, disorienting, commercialized space in which we render ourselves exposed to the gaze of others and also, at the same time, to ourselves.  But Harcourt isn’t quite content to rest with this metaphor either.  He introduces the mirrored glass pavilion, examines it, makes a case for it, and keeps moving—trying out metaphors like “steel mesh” and “data doubles” and (my favorite) “a large oligopolistic octopus that is enveloping the world,” all within the context of the master metaphor of an expository society.  Metaphors, it seems, are indispensable if imperfect tools for unraveling the paradoxes of digital life.

The result is a restless, searching quality to the analysis.  Exposed is constantly introducing new anecdotes, examples, paradigms, and perspectives, in the manner of a guided tour.  Harcourt is clearly deeply unsettled by the digital age we have entered.  Part of the appeal of the book is that he is willing to leave many of his assessments unsettled too, to synthesize a vast range of developments without simplifying or prophesying.

Another aspect of Exposed that enhances its effect is the prose style.  Now, I wouldn’t say that Harcourt’s Foucault-fueled writing has ever suffered from a lack of flair.  But in this work, Harcourt has gone further and become a formal innovator.  He has developed a prose style that uncannily mimics the experience of the expository society, the phenomenology of the digital subject.

Throughout the book, when describing the allure of new technologies that would rob us of our privacy and personhood, the prose shifts into a different register.  The reader is suddenly greeted with quick staccato passages, with acronyms and brand names thrown together in a dizzying succession of catalogs and subordinate clauses.  In these passages, Harcourt models for us the implicit bargain offered by the mirrored glass pavilion—inviting us to suspend critical judgment, to lose ourselves, as we get wrapped up in the sheer visceral excitement, the mad frenzy, of digital consumer culture.

Right from the book’s first sentence, we confront this mimetic style:

Every keystroke, each mouse click, every touch of the screen, card swipe, Google search, Amazon purchase, Instagram, ‘like,’ tweet, scan—in short, everything we do in our new digital age can be recorded, stored, and monitored.  Every routine act on our iPads and tablets, on our laptops, notebooks, and Kindles, office PCs and smart-phones, every transaction with our debit card, gym pass, E-ZPass, bus pass, and loyalty cards can be archived, data-mined, and traced back to us.

Other sentences deploy a similar rhetorical strategy in a more positive key, describing how we now “‘like,’ we ‘share,’ we ‘favorite.’ We ‘follow.’ We ‘connect.’ We get ‘LinkedIn’”—how “[e]verything today is organized around friending, clicking, retweeting, and reposting.”

There is a visceral pleasure to be had from abandoning oneself to the hyper-stimulation, the sensory overload, of passages like these.  Which is precisely the point.  For that pleasure underwrites our own ubiquitous surveillance and the mortification of self.  That pleasure is our undoing.  More than anything else, in Harcourt’s telling, it is the constant gratifications afforded by technologies of surveillance that have “enslaved us, exposed us, and ensnared us in this digital shell as hard as steel.”

*  *  *

I hope these brief comments have conveyed some of what I found so stimulating in this remarkable book.  Always imaginative and often impressionistic, Exposed is hazy on a number of significant matters.  In the hope of facilitating conversation, I will close by noting a few.

First, what are the distributional dimensions of the privacy crisis that we face?  The implicit digital subject of Exposed seems to be a highly educated, affluent type—someone who would write a blog post, wear an Apple Watch, buy books on Amazon.  There may well be millions of people like this; I don’t mean to suggest any narcissism in the book’s critical gaze.  I wonder, though, how the privacy pitfalls chronicled in Exposed relate to more old-fashioned forms of observation and exploitation that continue to afflict marginalized populations and that Harcourt has trenchantly critiqued in other work.

Second, what about all the purported benefits of digital technology, Big Data, and the like?  Some commentators, as Harcourt notes in passing, have begun to argue that panoptic surveillance, at least under the right conditions, can facilitate not only certain kinds of efficiency and security but also values such as democratic engagement and human freedom that Harcourt is keen to promote.  I share Harcourt’s skepticism about these arguments, but if they are wrong then we need to know why they are wrong, and in particular whether they are irredeemably mistaken or whether they might instead point us toward useful regulatory reforms.

And lastly, what would dissent look like in this realm?  The final, forward-looking chapter of Exposed is strikingly short, only four pages long.  Harcourt exhorts the reader to fight back through “digital resistance” and “political disobedience.”  But remember, there is no conspiracy here, nothing untoward.  Rather, there is a massively distributed and partially emergent system of surveillance.  And this system generates enormous perceived rewards, not just for those at the top but for countless individuals throughout society.  It is something of a puzzle, then, what would motivate the sort of self-abnegating resistance that Harcourt calls for—resistance that must be directed, in the first instance, toward our own compulsive habits and consumptive appetites.  How would that sort of resistance develop, in the teeth of our own desires, and how could it surmount collective action barriers?

These are just a few of the urgent questions that Exposed helps bring into focus.

*  *  *

David Pozen is an associate professor at Columbia Law School.


How CalECPA Improves on its Federal Namesake

Last week, Governor Brown signed the landmark California Electronic Communications Privacy Act (CalECPA) into law and updated California privacy law for modern communications. Compared to ECPA, CalECPA requires warrants for more kinds of investigations and places tighter restrictions on those warrants; provides more notice to targets; and furnishes as remedies both court-ordered data deletion and statutory suppression.  Moreover, CalECPA’s approach is comprehensive and uniform, eschewing the often irrational distinctions that have made ECPA one of the most confusing and under-protective privacy statutes in the Internet era.

Extended Scope, Enhanced Protections, and Simplified Provisions

CalECPA regulates investigative methods that ECPA did not anticipate. Under CalECPA, government entities in California must obtain a warrant based on probable cause before they may access electronic communications contents and metadata from service providers or from devices.  ECPA makes no mention of device-stored data, even though law enforcement agents increasingly use StingRays to obtain information directly from cell phones. CalECPA subjects such techniques to its warrant requirement. While the Supreme Court’s recent decision in Riley v. California required that agents either obtain a warrant or rely on an exception to the warrant requirement to search a cell phone incident to arrest, CalECPA requires a warrant for physical access to any device, not just a cell phone, which “stores, generates, or transmits electronic information in electronic form.” CalECPA clearly defines the exceptions to the warrant requirement by specifying what counts as an emergency, who can give consent to the search of a device, and related questions.

ECPA’s 1986-drafted text only arguably covers the compelled disclosure of location data stored by a service provider, and does not clearly require a warrant for such investigations. CalECPA explicitly includes location data in the “electronic communication information” that is subject to the warrant requirement when a government entity accesses it from either a device or a service provider (broadly defined).  ECPA makes no mention of location data gathered in real-time or prospectively, but CalECPA requires a warrant both for those investigations and for stored data investigations. Whenever a government entity compels “the production of or access to” location information, including GPS data, from a service provider or from a device, CalECPA requires a warrant.



The Complete Posner on Posner Series

The Posner on Posner series began on November 24, 2014 and ended with the Afterword on January 5, 2015. Below is a hyperlinked list of all the posts.

 Table of Contents

  1. The Maverick – A Biographical Sketch of Judge Richard Posner: Part I
  2. The Maverick – A Biographical Sketch of Judge Richard Posner: Part II, The Will to Greatness
  3. The Man Behind the Robes — A Q & A with Richard Posner
  4. The Judge & Company – Questions for Judge Posner from Judges, Law Professors & a Journalist
  5. On Legal Education & Legal Scholarship — More questions for Judge Posner
  6. On Free Expression & the First Amendment — More questions for Judge Posner
  7. On Privacy, Free Speech, & Related Matters – Richard Posner vs David Cole & Others
  8. On Judicial Reputation: More questions for Judge Posner
  9. Posner on Same-Sex Marriage – Then & Now
  10. Posner on Case Workloads & Making Judges Work Harder
  11. The Promethean Posner – An Interview with the Judge’s Biographer
  12. Afterword: Posner at 75 – “It’s My Job”

→ Forthcoming: Richard Posner (Oxford University Press, 2016) by William Domnarski.


On Privacy, Free Speech, & Related Matters – Richard Posner vs David Cole & Others

I’m exaggerating a little, but I think privacy is primarily wanted by people because they want to conceal information to fool others. — Richard Posner

Privacy is overrated. — Richard Posner (2013)

Much of what passes for the name of privacy is really just trying to conceal the disreputable parts of your conduct. Privacy is mainly about trying to improve your social and business opportunities by concealing the sorts of bad activities that would cause other people not to want to deal with you. — Richard Posner (2014)

This is the seventh installment in the “Posner on Posner” series of posts on Seventh Circuit Judge Richard Posner. The first installment can be found here, the second here, the third here, the fourth here, the fifth here, and the sixth one here.

Privacy has been on Richard Posner’s mind for more than three-and-a-half decades. His views, as evidenced by the epigraph quotes above, have sparked debate in a variety of quarters, both academic and policy. In some ways those views seem oddly consistent with his persona – on the one hand, he is a very public man as revealed by his many writings, while on the other hand, he is a very private man about whom we know little of his life outside of the law save for a New Yorker piece on him thirteen years ago.

On the scholarly side of the privacy divide, his writings include:

  1. “The Right of Privacy,” 12 Georgia Law Review 393 (1978)
  2. “Privacy, Secrecy, and Reputation,” 28 Buffalo Law Review 1 (1979)
  3. “The Uncertain Protection of Privacy by the Supreme Court,” 1979 Supreme Court Review 173
  4. “The Economics of Privacy,” 71 The American Economic Review 405 (1981)
  5. “Privacy,” Big Think (video clip, n.d.)
  6. “Privacy is Overrated,” New York Daily News, April 28, 2014

For a sampling of Judge Posner’s opinions on privacy, go here (and search “Privacy”).

(Note: Some links will only open in Firefox or Chrome.)

_____________________

Privacy – “What’s the big deal?”

Privacy interests should really have very little weight when you’re talking about national security. The world is in an extremely turbulent state – very dangerous. — Richard Posner (2014)

Recently, Georgetown Law Center held a conference entitled “Cybercrime 2020: The Future of Online Crime and Investigations” (full C-SPAN video here). In the course of that event, Judge Posner joined with others in government, private industry, and in the legal academy to discuss privacy, the Fourth Amendment, and free speech, among other things. A portion of the exchange between Judge Posner and Georgetown law professor David Cole was captured on video.


Scene: The Judge sitting in his office, speaking into a video conference camera — As he rubbed his fingers across the page and looked down, Posner began: “I was thinking, listening to Professor Cole, what exactly is the information that he’s worried about?” Posner paused, as if to set up his next point: “I have a cell phone – iPhone 6 – so if someone drained my cell phone, they would find a picture of my cat [laughter], some phone numbers, some e-mail addresses, some e-mail texts – so what’s the big deal?”

He then glanced up from the text he appeared to be reading and spoke with a grin: “Other people must have really exciting stuff. [laughter] Could they narrate their adulteries or something like that?” [laughter] He then waved his hands in the air before posing a question to the Georgetown Professor.

“What is it that you’re worrying about?” Posner asked as if truly puzzled.

At that point, Cole leaned into his microphone and looked up at the video screen bearing the Judge’s image next to case reports on his left and the American flag on his right.

Cole: “That’s a great question, Judge Posner.”

Professor Cole continued, adding his own humor to the mix: “And I, like you, have only pictures of cats on my phone. [laughter] And I’m not worried about anything from myself, but I’m worried for others.”

On a more substantive note, Cole added: “Your question, which goes back to your original statement, . . . value[s] . . . privacy unless you have something to hide. That is a very, very shortsighted way of thinking about the value [of privacy]. I agree with Michael Dreeben: Privacy is critical to a democracy; it is critical to political freedom; [and] it is critical to intimacy.”

The sex video hypothetical

And then with a sparkle in his spectacled eye, Cole stated: “Your question brings to mind a cartoon that was in the New Yorker, just in the last couple of issues, where a couple is sitting in bed and they have video surveillance cameras over each one of them trained down on the bed [Cole holds his hands above his head to illustrate the peering cameras]. And the wife says to the husband: ‘What are you worried about if you’ve got nothing to hide, you’ve got nothing to fear.’”

Using the cartoon as his conceptual springboard, Cole moved on to his main point: “It seems to me that all of us, whether we are engaged in entirely cat-loving behavior, or whether we are going to psychiatrists, or abortion providers, or rape crisis centers, or Alcoholics Anonymous, or have an affair – all of us have something to hide. Even if you don’t have anything to hide, if you live a life that could be entirely transparent to the rest of the world, I still think the value of that life would be significantly diminished if it had to be transparent.”

Without missing a beat, Cole circled back to his video theme: “Again you could say, ‘if you’ve got nothing to hide, and you’re not engaged in criminal activity, let’s put video cameras in every person’s bedroom. And let’s just record the video, 24/7, in their bedroom. And we won’t look at it until we have reason to look at it. You shouldn’t be concerned because . . .’”

At this point, Posner interrupted: “Look, that’s a silly argument.”

Cole: “But it’s based on a New Yorker cartoon.”

The Judge was a tad miffed; he waved his right hand up and down in a dismissive way: “The sex video, that’s silly!” Waving his index finger to emphasize his point, he added: “What you should be saying, [what] you should be worried about [are] the types of revelation[s] of private conduct [that] discourage people from doing constructive things. You mentioned Alcoholics Anonymous . . .”

Cole: “I find sex to be a constructive thing.”

Obviously frustrated, Posner raised his palms up high in protest: “Let me finish, will you please?”

Cole: “Sure.”

Posner: “Look, that was a good example, right? Because you can have a person who has an alcohol problem, and so he goes to Alcoholics Anonymous, but he doesn’t want this to be known. If he can’t protect that secret,” Posner continued while pointing, “then he’s not going to go to Alcoholics Anonymous. That’s gonna be bad. That’s the sort of thing you should be concerned about rather than with sex videos. . . . [The Alcoholics Anonymous example] is a good example of the kind of privacy that should be protected.”


Privacy & Politics 

Meanwhile, the audience listened and watched on with its attention now fixed on the Georgetown professor.

Cole: “Well, let me give you an example of sex privacy. I think we all have an interest in keeping our sex lives private. That’s why we close doors into our bedroom, etc. I think that’s a legitimate interest, and it’s a legitimate concern. And it’s not because you have something wrong you want to hide, but because intimacy requires privacy, number one. And number two: think about the government’s use of sex information with respect to Dr. Martin Luther King. They investigated him, intruded on his privacy by bugging his hotel rooms to learn [about his] affair, and then sought to use that – and the threat of disclosing that affair – to change his behavior. Why? Because he was an active political dissident fighting for justice.”

“We have a history of that,” he added. “Our country has a history of that; most countries have a history of that; and that’s another reason the government will use information – that doesn’t necessarily concern [it] – to target people who [it is] concerned about . . . – not just because of their alcohol problem [or] not just because of their sexual proclivities – but because they have political views and political ideas that the government doesn’t approve of.”

At this point the moderator invited the Judge to respond.

Posner: “What happened to cell phones? Do you have sex photos on your cell phones?”

Cole: “I imagine if Dr. Martin Luther King was having an affair in 2014, as opposed to the 1960s, his cell phone, his smart phone, would have quite a bit of evidence that would lead the government to that affair. He’d have call logs; he might have texts; he might have e-mails – all of that would be on the phone.”

The discussion then moved on to the other panelists.

Afterwards, writing on the Volokh Conspiracy blog, Professor Orin Kerr, one of the participants in the conference, summed up his views of the exchange this way:

“I score this Cole 1, Posner 0.”

The First Amendment — Enter Glenn Greenwald


What’s ailing the right to be forgotten (and some thoughts on how to fix it)

The European Court of Justice’s recent “right to be forgotten” ruling is going through growing pains.  “A politician, a pedophile and a would-be killer are among the people who have already asked Google to remove links to information about their pasts.”  Add to that list former Merrill Lynch executive Stan O’Neal, who requested that Google hide links to an unflattering BBC News article about him.

All told, Google “has removed tens of thousands of links—possibly more than 100,000—from its European search results,” encompassing removal requests from 91,000 individuals (apparently about 50% of all requests are granted).  The company has been pulled into discussions with EU regulators about its implementation of the rules, with one regulator opining that the current system “undermines the right to be forgotten.”

The list of questions EU officials recently sent Google suggests they are more or less in the dark about the way providers are applying the ECJ’s ruling.  Meanwhile, European companies like forget.me are looking to reap a profit from the uncertainty surrounding the application of these new rules.  The quote at the end of the Times article sums up the current state of affairs:

“No one really knows what the criteria is,” he said, in reference to Google’s response to people’s online requests. “So far, we’re getting a lot of noes. It’s a complete no man’s land.”

What (if anything) went wrong? As I’ll argue* below, a major flaw in the current implementation is that it puts the initial adjudication of right to be forgotten requests in the hands of search engine providers, rather than representatives of the public interest.  This process leads to a lack of transparency and potential conflicts of interest in implementing what may otherwise be sound policy.

The EU could address these problems by reforming the current procedures to limit search engine providers’ discretion in day-to-day right to be forgotten determinations.  Inspiration for such an alternative can be found in other areas of law regulating the conduct of third party service providers, including the procedures for takedown of copyright-infringing content under the DMCA and those governing law enforcement requests for online emails.

I’ll get into more detail about the current implementation of the right to be forgotten and some possible alternatives after the jump.



Need an alternative to the third party doctrine? Look backwards, not forward. (Part I)


In light of the renewed discussion on the future of the third party doctrine on this blog and elsewhere (much of it attributable to Riley), I’d like to focus my next couple of posts on the oft-criticized rule, with the aim of exploring a few questions that will hopefully be interesting* to readers. For the purpose of these posts, I’m assuming readers are familiar with the third party doctrine and the arguments for and against it.

I’ll start with the following question: Let’s assume the Supreme Court decides to scale back the third party doctrine.  Where in the Court’s Fourth Amendment jurisprudence should the Justices look for an alternative approach?  I think this is an interesting and important question in light of the serious debate, both in academia and on the Supreme Court, about the third party doctrine’s effect on privacy in the information age.

One answer, which may represent the conventional wisdom, is that there simply is nothing in the Supreme Court’s existing precedent that supports a departure from the Court’s all or nothing approach to Fourth Amendment rights in Smith and Miller.  According to this answer, the Court’s only choice if it wishes to “reconsider” the third party doctrine is to create new, technology specific rules that address the problems of the day.  (I’ve argued elsewhere that existing Fourth Amendment doctrine doesn’t bind the Court to rigid applications of its existing rules in the face of new technologies.)

A closer look at the Court’s Fourth Amendment jurisprudence suggests another option, however. The Supreme Court has not applied the underlying rationale from its third party doctrine cases to all forms of government intrusion.  Indeed, for almost a century the Supreme Court has been willing to depart from the all or nothing approach in another Fourth Amendment context: government searches of dwellings and homes.  As I’ll discuss below, the Supreme Court has used various tools—including the implied license rule in last year’s Jardines, the standard of “common understandings,” and the scope of consent rules in cohabitant cases—to allow homeowners, cohabitants, tenants, hotel guests, overnight guests, and the like to maintain Fourth Amendment rights against the government even though they have given third parties access to the same space.

In other words, it is both common sense and black letter law that a person can provide third parties access to his home for a particular purpose without losing all Fourth Amendment rights against government intrusion. Letting the landlord or the maid into your home for a limited purpose doesn’t necessarily give the police a license to enter without a warrant—even if the police persuade the landlord or the maid to let them in. Yet the Court has abandoned that type of nuance in the context of informational privacy, holding that sharing information with a third party means forgoing all Fourth Amendment rights against government access to that information (a principle that has eloquently been described as the “secrecy paradigm”). As many have noted, this rule has had a corrosive effect on Fourth Amendment rights in a world where sensitive information is regularly shared with third parties as a matter of course.

Why has the Court applied such a nuanced approach to Fourth Amendment rights when it comes to real property and the home, but not when it comes to informational privacy?  And have changes in technology undermined some of the rationale justifying this divergence? These are questions I’ll explore further in Part II of this post; in the meantime I’d love to hear what readers think about them. I’ll spend the rest of this post providing some additional background on the Court’s approach to privacy in the context of real property searches.

More after the jump.



The data retention judgment, the Irish Facebook case, and the future of EU data transfer regulation

On April 8 the Court of Justice of the European Union (CJEU) announced its judgment in Joined Cases C-293/12 and C-594/12, Digital Rights Ireland. Based on EU fundamental rights law, the Court invalidated the EU Data Retention Directive, which obliged telecommunications service providers and Internet service providers in the EU to retain telecommunications metadata and make it available to European law enforcement authorities under certain circumstances. The case illustrates both the key role that the EU Charter of Fundamental Rights plays in EU data protection law, and the CJEU’s seeming disinterest in the impact of its recent data protection rulings on other fundamental rights. In addition, the recent referral to the CJEU by an Irish court of a case involving data transfers by Facebook under the EU-US Safe Harbor holds the potential to further tighten EU rules for data transfers, and to reduce the possibility of EU-wide harmonization in this area.

In considering the implications of Digital Rights Ireland for the regulation of international data transfers, I would like to focus on a passage occurring towards the end of the judgment, where the Court criticizes the Data Retention Directive as follows (paragraph 68):

“[I]t should be added that that directive does not require the data in question to be retained within the European Union, with the result that it cannot be held that the control, explicitly required by Article 8(3) of the Charter, by an independent authority of compliance with the requirements of protection and security, as referred to in the two previous paragraphs, is fully ensured. Such a control, carried out on the basis of EU law, is an essential component of the protection of individuals with regard to the processing of personal data…”

This statement caught many observers by surprise. The CJEU is famous for the concise and self-referential style of its opinions, and the case revolved around the legality of the Directive in general, not around whether data stored under it could be transferred outside the EU. This issue was also not raised in the submission of the case to the Court, and first surfaced in the advisory opinion issued by one of the Court’s advocates-general prior to the judgment (see paragraph 78 of that Opinion).

In US constitutional law, the question “does the constitution follow the flag?” generally arises in the context of whether the Fourth Amendment to the US Constitution applies to government activity overseas (e.g., when US law enforcement abducts a fugitive abroad and brings him back to the US). In the context discussed here, the question is rather whether EU data protection law applies to personal data as they are transferred outside the EU, i.e., “whether the EU flag follows EU data”. As I explained in my book on the regulation of transborder data flows that was published last year by Oxford University Press, in many cases EU data protection law remains applicable to personal data transferred to other regions. For example, in introducing its proposed reform of EU data protection law, the European Commission stated in 2012 that one of its key purposes is to “ensure a level of protection for data transferred out of the EU similar to that within the EU”.

EU data protection law is based on constitutional provisions protecting fundamental rights (e.g., Article 8 of the EU Charter of Fundamental Rights), and the CJEU has emphasized in cases involving the independence of the data protection authorities (DPAs) in Austria, Germany, and Hungary that control of data processing by an independent DPA is an essential element of the fundamental right to data protection (without ever discussing independent supervision in the context of data processing outside the EU). In light of those previous cases, the logical consequence of the Court’s statement in Digital Rights Ireland would seem to be that fundamental rights law requires oversight of data processing by the DPAs also with regard to the data of EU individuals that are transferred to other regions.

This conclusion raises a number of questions. For example, how can it be reconciled with the fact that the enforcement jurisdiction of the DPAs ends at the borders of their respective EU Member States (see Article 28 of the EU Data Protection Directive 95/46)? If supervision by the EU DPAs extends already by operation of law to the storage of EU data in other regions, then why do certain EU legal mechanisms in addition force the parties to data transfers to explicitly accept the extraterritorial regulatory authority of the DPAs (e.g., Clause 5(e) of the EU standard contractual clauses of 2010)? And how does the Court’s statement fit with its 2003 Lindqvist judgment, where it held that EU data protection law should not be interpreted to apply to the entire Internet (see paragraph 69 of that judgment)? The offhand way in which the Court referred to DPA supervision over data processing outside the EU in the Digital Rights Ireland judgment gives the impression that it was unaware of, or disinterested in, such questions.

On June 18 the Irish High Court referred a case to the CJEU that may develop further its line of thought in the Digital Rights Ireland judgment. The High Court’s judgment in Schrems v. Data Protection Commissioner involved a challenge by Austrian student Max Schrems to the transfer of personal data to the US by Facebook under the Safe Harbor. The High Court announced that it would refer to the CJEU the questions of whether the European Commission’s adequacy decision of 2000 creating the Safe Harbor should be re-evaluated in light of the Charter of Fundamental Rights and widespread access to data by US law enforcement, and of whether the individual DPAs should be allowed to determine whether the Safe Harbor provides adequate protection (see paragraphs 71 and 84). The linkage between the two cases is evidenced by the Irish High Court’s frequent citation of Digital Rights Ireland, and by the CJEU’s conclusion that interference with the right to data protection caused by widespread data retention for law enforcement purposes without notice being given to individuals was “particularly serious” (see paragraph 37 of Digital Rights Ireland and paragraph 44 of Schrems v. Data Protection Commissioner). The High Court also criticized the Safe Harbor and the system of oversight of law enforcement data access in the US as failing to provide oversight “carried out on European soil” (paragraph 62), which seems inspired by paragraph 68 of the Digital Rights Ireland judgment.

The Irish referral to the CJEU also holds implications for the possibility of harmonized EU rules regarding international data transfers. If each DPA is allowed to override Commission adequacy decisions based on its individual view of what the Charter of Fundamental Rights requires, then there would be no point to such decisions in the first place (and the current disagreement over the “one stop shop” in the context of the proposed EU General Data Protection Regulation shows the difficulty of reaching agreement on pan-European rules where fundamental rights are at stake). Also, one wonders if other data transfer mechanisms beyond the Safe Harbor could also be at risk (e.g., standard contractual clauses, binding corporate rules, etc.), given that they also allow data to be turned over to non-EU law enforcement authorities. The proposed EU General Data Protection Regulation could eliminate some of these risks, but its passage is still uncertain, and the interpretation by the Court of the role of the Charter of Fundamental Rights would still be relevant under it. Whatever the CJEU eventually decides, it seems inevitable that the case will result in a tightening of EU rules on international data transfers.

The referral by the Irish High Court also raises the question (which the High Court did not address) of how other important fundamental rights, such as freedom of expression and the right to communicate internationally (meaning, in essence, the freedom to communicate on the Internet), should be balanced with the right to data protection. In its recent jurisprudence, the CJEU seems to regard data protection as a “super right” that has preference over other ones; thus, in its recent judgment in the case C-131/12 Google Spain v. AEPD and Mario Costeja Gonzalez involving the “right to be forgotten”, the Court never even refers to Article 11 of the Charter of Fundamental Rights that protects freedom of expression and the right to “receive and impart information and ideas without interference by public authority and regardless of frontiers”. In its zeal to protect personal data transferred outside the EU, it is important that the CJEU not forget that, as it has stated in the past, data protection is not an absolute right, and must be considered in relation to its function in society (see, for example, Joined Cases C-92/09 and C-93/09 Volker und Markus Schecke, paragraph 48), and that there must be some territorial limit to EU data protection law, if it is not to become a system of universal application that applies to the entire world (as the Court held in Lindqvist). Thus, there is an urgent need for an authoritative and dispassionate analysis of the territorial limits to EU data protection law, and of how a balance can be struck between data protection and other fundamental rights, guidance which unfortunately the CJEU seems unwilling to provide.