Category: Surveillance

UCLA Law Review Vol. 64, Discourse

Citizens Coerced: A Legislative Fix for Workplace Political Intimidation Post-Citizens United Alexander Hertel-Fernandez & Paul Secunda 2
Lessons From Social Science for Kennedy’s Doctrinal Inquiry in Fisher v. University of Texas II Liliana M. Garces 18
Why Race Matters in Physics Class Rachel D. Godsil 40
The Indignities of Color Blindness Elise C. Boddie 64
The Misuse of Asian Americans in the Affirmative Action Debate Nancy Leong 90
How Workable Are Class-Based and Race-Neutral Alternatives at Leading American Universities? William C. Kidder 100
Mismatch and Science Desistance: Failed Arguments Against Affirmative Action Richard Lempert 136
Privileged or Mismatched: The Lose-Lose Position of African Americans in the Affirmative Action Debate Devon W. Carbado, Kate M. Turetsky, Valerie Purdie-Vaughns 174
The Right to Record Images of Police in Public Places: Should Intent, Viewpoint, or Journalistic Status Determine First Amendment Protection? Clay Calvert 230
A Worthy Object of Passion Seana Valentine Shiffrin 254
Foreword – Imagining the Legal Landscape: Technology and the Law in 2030 Jennifer L. Mnookin & Richard M. Re i
Imagining Perfect Surveillance
Richard M. Re 264
Selective Procreation in Public and Private Law Dov Fox 294
Giving Up On Cybersecurity Kristen E. Eichensehr 320
DNA in the Criminal Justice System: A Congressional Research Service Report* (*From the Future) Erin Murphy 340
Utopia?: A Technologically Determined World of Frictionless Transactions, Optimized Production, and Maximal Happiness Brett Frischmann and Evan Selinger 372
The CRISPR Revolution: What Editing Human DNA Reveals About the Patent System’s DNA Robin Feldman 392
Virtual Violence Jaclyn Seelagy 412
Glass Half Empty Jane R. Bambauer 434
Social Control of Technological Risks: The Dilemma of Knowledge and Control in Practice, and Ways to Surmount It Edward A. Parson 464
Two Fables Christopher Kelty 488
Policing Police Robots Elizabeth E. Joh 516
Environmental Law, Big Data, and the Torrent of Singularities William Boyd 544

FAN (First Amendment News, Special Series) Newseum Institute to Host Event on Cell Phone Privacy vs National Security Controversy


Starting today and continuing through mid-June, I will post a special series of occasional blogs related to the Apple iPhone national security controversy and the ongoing debate surrounding it, even after the FBI gained access to the phone used by the terrorist gunman in the December shooting in San Bernardino, California.

Gene Policinski

This special series is done in conjunction with the Newseum Institute and a major program the Institute will host on June 15, 2016 in Washington, D.C.

I am pleased to be working with Gene Policinski (the chief operating officer of the Newseum Institute) and Nan Mooney (a D.C. lawyer and former law clerk to Chief Judge James Baker of the U.S. Court of Appeals for the Armed Forces) in organizing the event.

The June 15th event will be a moot court with seven Supreme Court Justices and two counsel for each side. The focus will be on the First Amendment issues raised in the case. (See below for links to the relevant legal documents.)

→ Save the Date: Wednesday, June 15, 2016 @ 2:00 p.m., Newseum, Washington, D.C. (more info forthcoming).

The Apple-FBI clash was the first significant skirmish — and probably not much more than that — of the Digital Age conflicts we’re going to see in this century around First Amendment freedoms, privacy, data aggregation and use, and even the extent of religious liberty. As much as the eventual outcome, we need to get the tone right, from the start — freedom over simple fear. –Gene Policinski

Newseum Institute Moot Court Event

It remains a priority for the government to ensure that law enforcement can obtain crucial digital information to protect national security and public safety, either with cooperation from relevant parties, or through the court system when cooperation fails. –Melanie Newman (spokeswoman for the Justice Department, March 28, 2016)

As of this date, the following people have kindly agreed to participate as Justices for a seven-member Court:

The following two lawyers have kindly agreed to serve as the counsel (2 of 4) who will argue the matter:

→ Two additional Counsel to be selected.  

Nan Mooney and I will say more about both the controversy and the upcoming event in the weeks ahead in a series of special editions of FAN. Meanwhile, below is some relevant information, which will be updated regularly.

Apple vs FBI Director James Comey

President Obama’s Statement

Congressional Hearing

Documents


Last Court Hearing: 22 March 2016, before Judge Sheri Pym

Podcast

Video

News Stories & Op-Eds


  1. Pierre Thomas & Mike Levine, “How the FBI Cracked the iPhone Encryption and Averted a Legal Showdown With Apple,” ABC News, March 29, 2016
  2. Bruce Schneier, “Your iPhone just got less secure. Blame the FBI,” Washington Post, March 29, 2016
  3. Katie Benner & Eric Lichtblau, “U.S. Says It Has Unlocked Phone Without Help From Apple,” New York Times, March 8, 2016
  4. John Markoff, Katie Benner & Brian Chen, “Apple Encryption Engineers, if Ordered to Unlock iPhone, Might Resist,” New York Times, March 17, 2016
  5. Jesse Jackson, “Apple Is on the Side of Civil Rights,” Time, March 17, 2016
  6. Katie Benner & Eric Lichtblau, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” New York Times, March 15, 2016
  7. Kim Zetter, “Apple and Justice Dept. Trade Barbs in iPhone Privacy Case,” Wired, March 15, 2016
  8. Alina Selyukh, “Apple On FBI iPhone Request: ‘The Founders Would Be Appalled,‘” NPR, March 15, 2016
  9. Howard Mintz, “Apple takes last shot at FBI’s case in iPhone battle,” San Jose Mercury News, March 15, 2016
  10. Russell Brandom & Colin Lecher, “Apple says the Justice Department is using the law as an ‘all-powerful magic wand‘,” The Verge, March 15, 2016
  11. Adam Segal & Alex Grigsby, “3 ways to break the Apple-FBI encryption deadlock,” Washington Post, March 14, 2016
  12. Seung Lee, “Former White House Official Says NSA Could Have Cracked Apple-FBI iPhone Already,” Newsweek, March 14, 2016
  13. Tim Bajarin, “The FBI’s Fight With Apple Could Backfire,” PC, March 14, 2016
  14. Alina Selyukh, “U.S. Attorneys Respond To Apple In Court, Call Privacy Concerns ‘A Diversion’,” NPR, March 10, 2016
  15. Dan Levine, “San Bernardino victims to oppose Apple on iPhone encryption,” Reuters, Feb. 22, 2016
  16. “Apple, The FBI And iPhone Encryption: A Look At What’s At Stake,” NPR, Feb. 17, 2016

The “We” That Haunts Our Digital Age

“Seen from this angle, the emphasis on what we must do as ethical selves, each and every one of us—us digital subjects, with our desires and our disobedience—may be precisely what is necessary for us to begin to think of ourselves as we. Yes, as that we that has been haunting this book since page one.”

                                                                           — Exposed: Desire and Disobedience in the Digital Age, p. 283.

In her brilliant post, Mary Ann Franks highlights the unequal exposure at the heart of our digital age and puts her finger on the most important sentence of Exposed: there is indeed a “we” that haunts this book, that haunts our digital age in fact, and it is precisely that “we” that we must keep at the very heart of our digital debates.

As Franks highlights and as David Pozen and Olivier Sylvain earlier suggested, the digital world is by no means an undifferentiated space. In Exposed, I underscore those differences. In “The Mortification of the Self,” I underline how our digital world cuts deeply along lines of class and gender. In “The Steel Mesh,” I emphasize how our digital exposure is deeply differentiated by race and ethnicity. In “The Collapse of State, Economy, and Society,” I detail the labor, wealth, and disability effects. The NYPD social media unit does not target just anyone; it targets minority suspects, especially “crew members…. They listen to the lyrical taunts of local rap artists, some affiliated with crews, and watch YouTube for clues to past trouble and future conflicts.” (Exposed, 243) The cameramen behind the CCTVs do not target just anyone, but women sunbathing, and the “police officers radio each other to say ‘oh there’s a MILF over here, come over here.’” (230) Many of the women respondents in studies recount being “seen on camera and identified as not having had a top on in a park, even though they had a bikini top on—with the police saying repeatedly, ‘We just saw you on camera.’” (230) The surveillants also target those in “hoodies, tracksuits, or trainers,” signs of the more popular classes. And of course, the NSA targets Muslim radicalizers. (247)

The steel mesh that surrounds us is by no means color-blind, and neither are the new forms of GPS monitoring. “In 2008, one out of nine young adult black men between the ages of twenty and thirty-four—or approximately 11 percent of that population—was incarcerated in prison or jail in the United States. As of 2011, more than 2 million African American men were either behind bars or under correctional supervision (that is, had been arrested and processed by the criminal justice system and were on probation, on parole, or behind bars). That too represents about 11 percent of the total population of black men—one out of nine.” (235) And there are increasing gender disparities in the carceral sphere: “The Immigration and Customs Enforcement (ICE) field office in New York dramatically increased the number of women monitored by GPS-enabled ankle bracelets, up roughly 4,000 percent in 2014 alone, from 18 to 719.” (238)

Indeed, our digital world is becoming, for many, a steel mesh. “We watch and are watched, we knowingly strap surveillance devices on our bodies— and then some of us are arrested, some of us are disconnected, some of us are extracted.” (253) Yes, some of us, but of course, not all, and we tend to know who. As I suggested in my last post on the “Damn Daniel!” phenomenon, the digital space elides all kinds of race and class distinctions, but we need to resist that and bring it to the surface.

Despite all the unequal exposure, we need to speak as a “we,” not as that “they” that Franks concludes with. Not only because even those of us who are being surveilled and punished are at times exposing ourselves and also watching others, but because it is only as a “we” that we will be able to address the excesses of our expository society, each and every one of us—us digital subjects, with our desires and our disobedience.

This has been a thrilling symposium and I thank my interlocutors—Lisa Austin, Ann Bartow, Mary Ann Franks, Solangel Maldonado, Frank Pasquale, David Pozen, and Daniel Solove—immensely. I have learned a lot.


Unequal Exposure

Towards the end of the breathless and impassioned tour through privacy, surveillance, carcerality, and desire that is Exposed, Bernard Harcourt writes that “the emphasis on what we must do as ethical selves, each and every one of us – us digital subjects – may be precisely what is necessary for us to begin to think of ourselves as we. Yes, as that we that has been haunting this book since page one” (283). The call for unity and solidarity is seductive: if “we” are all exposed and vulnerable, then “we” can all resist and demand change. But that “we” – that reassuring abstraction of humanity and human experience – is not in fact what haunts this book. That “we” – unquestioned, undifferentiated, unmarked – is taken for granted and treated as the universal subject in this book. What truly haunts this book is everything that this “we” obscures and represses. Harcourt’s “we” is remarkably undifferentiated. Nearly every point Harcourt makes about how “we” experience digital subjectivity, surveillance, and exposure would and should be contested by women, people of color, the poor, sexual minorities (and those who belong to more than one of the categories in this non-exhaustive list). It is unfair, of course, to expect any one book or any one author to capture the full complexity of human experience on any topic. One writes about what one knows, and nuance must sometimes be sacrificed for the sake of broad theory. But there is a difference between falling short of conveying the diversity of human experience and barely acknowledging the existence of differentiation. If one of Harcourt’s goals is to lead us to “think of ourselves as we,” it is vital to recognize that  “we” in the digital age are not equally represented, equally consenting or resisting, or equally exposed.

Let’s begin with Harcourt’s characterization of the digital age as a study in shallow positivity: “We do not sing hate, we sing praise. We ‘like,’ we ‘share,’ we ‘favorite.’ We ‘follow.’ We ‘connect.’ ‘We get LinkedIn.’ Ever more options to join and like and appreciate. Everything today is organized around friending, clicking, retweeting, and reposting. … We are appalled by mean comments – which are censored if they are too offensive”(41). This is a picture of the digital world that will be  unrecognizable to many people. There is no mention of online mobs, targeted harassment campaigns, career-destroying defamation, rape and death threats, doxxing, revenge porn, sex trafficking, child porn, online communities dedicated to promoting sexual violence against women, or white supremacist sites. No mention, in short, of the intense, destructive, unrelenting hatred that drives so much of the activity of our connected world. Harcourt’s vision of our digital existence as a sunny safe space where occasional “mean comments” are quickly swept from view is nothing short of extraordinary.

Next, consider Harcourt’s repeated insistence that there are no real distinctions between exposer and exposed, the watcher and the watched: “There is no clean division between those who expose and those who surveil; surveillance of others has become commonplace today, with nude pictures of celebrities circulating as ‘trading fodder’ on the more popular anonymous online message boards, users stalking other users, and videos constantly being posted about other people’s mistakes, accidents, rants, foibles, and prejudices. We tell stories about ourselves and others. We expose ourselves. We watch others” (129). There are, in fact, important divisions between exposers and the exposed. With regard to sexual exposure, it is overwhelmingly the case that women are the subjects and not the agents of exposure. The nude photos to which Harcourt refers weren’t of just any celebrities; they were, with few exceptions, female celebrities. The hacker in that case, as in nearly every other case of nude photo hacking, is male, as is nearly every revenge porn site owner and the majority of revenge porn consumers. The “revenge porn” phenomenon itself, more accurately described as “nonconsensual pornography,” is overwhelmingly driven by men exposing women, not the other way around. Many of Harcourt’s own examples of surveillance point to the gender imbalance at work in sexual exposure. The LOVEINT scandal, the CCTV cameras pointed into girls’ toilets and changing rooms in UK schools (229), and Edward Snowden’s revelations of how NSA employees share naked pictures (230) primarily involve men doing the looking and women and girls being looked at. The consequences of sexual exposure are also not gender-neutral: while men and boys may suffer embarrassment and shame, girls and women suffer these and much more, including being expelled from school, fired from jobs, tormented by unwanted sexual propositions, and threatened with rape.

There are also important distinctions to be made between those who voluntarily expose themselves and those who are exposed against their will. In the passage above, Harcourt puts nude photos in the same list as videos of people’s “rants, foibles, and prejudices.” The footnote to that sentence provides two specific examples: Jennifer Lawrence’s hacked photos and video of Michael Richards (Seinfeld’s Kramer) launching into a racist tirade as he performed at a comedy club (311). That is a disturbing false equivalence. The theft of private information is very different from a public, voluntary display of racist hatred. In addition to the fact that naked photos are in no way comparable to casual references to lynching and the repeated use of racial slurs, it should matter that Jennifer Lawrence was exposed against her will and Michael Richards exposed himself.

It’s not the only time in the book that Harcourt plays a bit fast and loose with the concepts of consent and voluntariness. In many places he criticizes “us” for freely contributing to our own destruction: “There is hardly any need for illicit or surreptitious searches, and there is little need to compel, to pressure, to strong-arm, or to intimidate, because so many of us are giving all our most intimate information and whereabouts so willingly and passionately – so voluntarily” (17). And yet Harcourt also notes that in many cases, people do not know that they are being surveilled or do not feel that they have any practical means of resistance. “The truth is,” Harcourt tells us with regard to the first, “expository power functions best when those who are seen are not entirely conscious of it, or do not always remember. The marketing works best when the targets do not know that they are being watched” (124). On the second point, Harcourt observes that “when we flinch at the disclosure, most of us nevertheless proceed, feeling that we have no choice, not knowing how not to give our information, whom we would talk to, how to get the task done without the exposure. We feel we have no other option but to disclose” (181-2). But surely if people are unaware of a practice or feel they cannot resist it, they can hardly be considered to have voluntarily consented to it.

Also, if people often do not know that they are under surveillance, this undermines one of the more compelling concerns of the book, namely, that surveillance inhibits expression. It is difficult to see how surveillance could have an inhibiting effect if the subjects are not conscious of the fact that they are being watched. Surreptitious surveillance certainly creates its own harms, but if subjects are truly unaware that they are being watched – as opposed to not knowing exactly when or where surveillance is taking place but knowing that it is taking place somewhere somehow, which no doubt does create a chilling effect – then self-censorship is not likely to be one of them.

Harcourt suggests a different kind of harm when he tells us that “[i]nformation is more accessible when the subject forgets that she is being stalked” (124). That is, we are rendered more transparent to the watchers when we falsely believe they are not watching us. That seems right. But what exactly is the harm inflicted by this transparency? Harcourt warns that we are becoming “marketized subjects – or rather subject-objects who are nothing more than watched, tracked, followed, profiled at will, and who in turn do nothing more than watch and observe others” (26). While concerns about Big Data are certainly legitimate (and have been voiced by many scholars, lawyers, policymakers, and activists), Harcourt never paints a clear picture of what he thinks the actual harm of data brokers and targeted Target advertisements really is. In one of the few personal and specific examples he offers of the harms of surveillance, Harcourt describes the experience of being photographed by a security guard before a speaking engagement. Harcourt is clearly unsettled by the experience: “I could not resist. I did not resist. I could not challenge the security protocol. I was embarrassed to challenge it, so I gave in without any resistance. But it still bothers me today. Why? Because I had no control over the dissemination of my own identity, of my face. Because I felt like I had no power to challenge, to assert myself” (222). While one sympathizes with Harcourt’s sense of disempowerment, it is hard to know what to think of it in relation to the sea of other surveillance stories: women forced to flee their homes because of death threats, parents living in fear because the names of their children and the schools they attend have been published online, or teenaged girls committing suicide because the photo of their rape is being circulated on the Internet as a form of entertainment.

Harcourt uses the term “stalk” at least eight times in this book, and none of these references are to actual stalking, the kind that involves being followed by a particular individual who knows where you live and work and means you harm, the kind that one in six women in the U.S. will experience in her lifetime, the kind that is encouraged and facilitated by an ever-expanding industry of software, gadgets, and apps that openly market themselves to angry men as tools of control over the women who have slipped their grasp. What a privilege it is to be able to treat stalking not as a fact of daily existence, but as a metaphor.

Harcourt’s criticism of what he considers to be the Supreme Court’s lack of concern for privacy adds a fascinating gloss to all of this. Harcourt takes particular aim at Justice Scalia, asserting that even when Scalia seems to be protecting privacy, he is actually disparaging it: “Even in Kyllo v. United States…, where the Court finds that the use of heat-seeking technology constitutes a search because it infringes on the intimacies of the home, Justice Scalia mocks the humanist conception of privacy and autonomy.” The proof of this assertion supposedly comes from Scalia’s observation that the technology used in that case “might disclose, for example, at what hour each night the lady of the house takes her daily sauna and bath – a detail that many would consider ‘intimate.’” Harcourt assumes that Scalia’s reference to the “lady of the house” is an ironic expression of contempt. But Scalia is not being ironic. Elsewhere in the opinion, he emphatically states that “[i]n the home… all details are intimate details,” and many of his other opinions reinforce this view. Scalia and many members of the Court are very concerned about privacy precisely when it involves the lady of the house, or the homeowner subjected to the uninvited drug-sniffing dog on the porch (Florida v. Jardines, 2013), or the federal official subjected to the indignity of a drug test (Treasury Employees v. Von Raab, 1989 (dissent)). These same members of the Court, however, are remarkably unconcerned about privacy when it involves a wrongfully arrested man subjected to a humiliating “squat and cough” cavity search (Florence v. Burlington, 2012), a driver searched after being racially profiled (Whren v. US, 1996), or a pregnant woman tricked into a drug test while seeking prenatal care (Ferguson v. Charleston, 2001 (dissent)). In other words, the problem with the Supreme Court’s views on privacy and surveillance is not that the Justices do not care; it is that they tend to care only when surveillance affects interests they share or people they resemble.

The world is full of people who do not have the luxury of worrying about a growing addiction to Candy Crush or whether Target knows they need diapers before they do. They are too busy worrying that their ex-husband will hunt them down and kill them, or that they will be stopped and subjected to a humiliating pat down for the fourth time that day, or that the most private and intimate details of their life will be put on public display by strangers looking to make a buck. These people are not driven by a desire to expose themselves. Rather, they are being driven into hiding, into obscurity, into an inhibited and chilled existence, by people who are trying to expose them. If “we” want to challenge surveillance and fight for privacy, “they” must be included.


The Fragility of Desire

In his excellent new book Exposed, Harcourt offers a seductive analysis of the role of desire in what he calls the “expository society” of the digital age. We are not characters in Orwell’s 1984, or prisoners of Bentham’s Panopticon, but rather enthusiastic participants in a “mirrored glass pavilion” that is addictive and mesmerizing. Harcourt paints a detailed picture of this pavilion and also shows us the seamy side of our addiction to it. Recovery from this addiction, he argues, requires acts of disobedience, but therein lies the great dilemma and paradox of our age: revolution requires desire, not duty, yet our desires are what have ensnared us.

I think that this is both a welcome contribution and a misleading diagnosis.

There have been many critiques of consent-based privacy regimes as enabling surveillance rather than protecting privacy. The underlying tenor of many of these critiques is that consent fails as a regulatory tool because it is too difficult to make it truly informed. Harcourt’s emphasis on desire shows why there is a deeper problem than this: our participation in the platforms that surveil us is rooted in something deeper than misinformed choice. And this makes the “what to do?” question all the more difficult to answer. Even those of us who see a stronger role for law than Harcourt outlines in this book (I agree with Ann Bartow’s comments on this) should pause here. Canada, for example, has strong private sector data protection laws with oversight from excellent provincial and federal privacy commissioners. And yet these laws are heavily consent-based. Such laws are able to shift practices to a stronger emphasis on things like opt-in consent, but Harcourt leaves us with a disquieting sense that this might just be an example of a Pyrrhic victory, legitimizing surveillance through our attempts to regulate it because we still have not grappled with the more basic problem of the seduction of the mirrored glass pavilion.

The problem with Harcourt’s position is that, in exposing this aspect of the digital age in order to complicate our standard surveillance tropes, he risks ignoring other sources of complexity that are also important for both diagnosing the problem and outlining a path forward.

Desire is not always the reason that people participate in new technologies. As Ann Bartow and Olivier Sylvain point out, people do not always have a choice about their participation in the technologies that track them. The digital age is not an amusement park we can choose to visit or to boycott; it is deeply integrated into our daily practices and needs, including the ways in which we work, bank, and access government services.

But even when we do actively choose to use these tools, it is not clear that desire captures the why of all such choices. If we willingly enter Harcourt’s mirrored glass pavilion, it is sometimes because of some of its very useful properties — the space- and time-bending nature of information technology. For example, Google Calendar is incredibly convenient because multiple people can access shared calendars from multiple devices in multiple locations at different times, making the coordination of schedules remarkably easy. This is not digital lust, but digital convenience.

These space- and time-bending properties of information technology are important for understanding the contours of the public/private nexus of surveillance that so characterizes our age. Harcourt does an excellent job at pointing out some of the salient features of this nexus, describing a “tentacular oligarchy” where private and public institutions are bound together in state-like “knots of power,” with individuals passing back and forth between these institutions. But what is strange in Harcourt’s account is that this tentacular oligarchy still appears to be bounded by the political borders of the US. It is within those borders that the state and the private sector have collapsed together.

What this account misses is the fact that information technology has helped to unleash a global private sector that is not bounded by state borders. In this emerging global private sector large multinational corporations often operate as “metanationals” or stateless entities. The commercial logic of information is that it should cross political borders with ease and be stored wherever it makes the most economic sense.

Consider some of the rhetoric surrounding the e-commerce chapter of the recent TPP agreement. The Office of the US Trade Representative indicates that one of its objectives is to keep the Internet “free and open,” which it has pursued through rules that favour cross-border data flows and prevent data localization. It is easy to see how this idea of “free” might be confused with political freedom, for an activist in an oppressive regime is better able to exercise freedom of speech when that speech can cross political borders, or when the details of her communications can be stored beyond the reach of her state. A similar rationale has been offered by some in the current Apple encryption debate — encryption protects American business people communicating within China, and we can see why that is important.

But this idea of freedom is the freedom of a participant in a global private sector with weak state control; freedom from the state control of oppressive regimes also involves freedom from the state protection of democratic regimes.

If metanationals pursue a state-free agenda, the state pursues an agenda of rights-protectionism. By rights protectionism I mean the claim that states do, and should, protect the constitutional rights of their own citizens and residents but not others. Consider, for example, a Canadian citizen who resides in Canada and uses US cloud computing. That person could be communicating entirely with other Canadians in other Canadian cities and yet have all of their data stored in the US-based cloud. If the US authorities wanted access to that data, the US constitution would not apply to regulate that access in a rights-protecting manner because the Canadian is a non-US person.

Many see this result as flowing from the logic of the Verdugo-Urquidez case. Yet that case concerned a search that occurred in a foreign territory (Mexico), rather than within the US, where the law of that territory continued to apply. The Canadian constitution does not apply to acts of officials within the US. The data at issue falls into a constitutional black hole where no constitution applies (and maybe even an international human rights black hole, according to some US interpretations of extraterritorial obligations). States can then collect information within this black hole free of the usual liberal-democratic constraints and share it with other allies, a situation Snowden documented within the EU and likened to a “European bazaar” of surveillance.

Rights protectionism is not rights protection when information freely crosses political boundaries and state power piggybacks on top of this crossing and exploits it.

This is not a tentacular oligarchy operating within the boundaries of one state, but a series of global alliances – between allied states and between states and metanationals who exert state-like power — exploiting the weaknesses of state-bound law.

We are not in this situation simply because of a penchant for selfies. But to understand the full picture we do need to look beyond “ourselves” and get the global picture in view. We need to understand the ways in which our legal models fail to address these new realities and even help to mask and legitimize the problems of the digital age through tools and rhetoric that are no longer suitable.

Lisa Austin is an Associate Professor at the University of Toronto Faculty of Law.

A Social Theory of Surveillance

Bernard Harcourt’s Exposed is a deeply insightful analysis of data collection, analysis, and use by powerful commercial and governmental actors. It offers a social theory of both surveillance and self-exposure. Harcourt transcends methodological individualism by explaining how troubling social outcomes can be generated by personal choices that each seem rational at the time they are made. He also helps us understand why ever more of daily life is organized around the demands of what Shoshana Zuboff calls “surveillance capitalism”: intimate monitoring of our daily lives to maximize our productivity as consumers and workers.

The Chief Data Scientist of a Silicon Valley firm told Zuboff, “The goal of everything we do is to change people’s actual behavior at scale. When people use our app, we can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad. We can test how actionable our cues are for them and how profitable for us.” Harcourt reflects deeply on what it means for firms and governments to “change behavior at scale,” identifying “the phenomenological steps of the structuration of the self in the age of Google and NSA data-mining.”

Harcourt also draws a striking, convincing analogy between Erving Goffman’s concept of the “total institution” and the ever-denser networks of sensors and training (both in the form of punishments and lures) that powerful institutions use to ensure behavior occurs within ranges of normality. He observes that some groups are far more likely to be exposed to pain or inconvenience from the surveillance apparatus, while others enjoy its blandishments in relative peace. But almost no one can escape its effects altogether.

In the space of a post here, I cannot explicate Harcourt’s approach in detail. But I hope to give our readers a sense of its power to illuminate our predicament by focusing on one recent, concrete dispute: Apple’s refusal to develop a tool to assist the FBI’s effort to reveal the data in an encrypted iPhone. The history Harcourt’s book recounts helps us understand why the case has attracted so much attention—and how it may be raising false hopes.


Irresistible Surveillance?

Bernard Harcourt’s Exposed: Desire and Disobedience in the Digital Age offers many intriguing insights into how power circulates in contemporary society.  The book’s central contribution, as I see it, is to complicate the standard model of surveillance by introducing the surveilled’s agency into the picture.  Exposed highlights the extent to which ordinary people are complicit in regimes of data-monitoring and data-mining that damage their individual personhood and the democratic system.  Millions upon millions of “digital subjects,” Harcourt explains, have come to embrace forms of exposure that commoditize their own privacy.  Sometimes people do this because they want more convenience when they navigate capitalist culture or government bureaucracies.  Or because they want better book recommendations from Amazon.  Other times, people wish to see and be seen online—increasingly feel they need to be seen online—in order to lead successful social and professional lives.

So complicit are we in the erosion of our collective privacy, Harcourt suggests, that any theory of the “surveillance state” or the “surveillance industrial complex” that fails to account for these decentralized dynamics of exhibition, spectacle, voyeurism, and play will misdiagnose our situation.  Harcourt aligns himself at times with some of the most provocative critics of intelligence agencies like the NSA and companies like Facebook.  Yet the emphasis he places on personal desire and participatory disclosure belies any Manichean notion of rogue institutions preying upon ignorant citizens.  His diagnosis of how we’ve lost our privacy is more complex, ethically and practically, in that it forces attention on the ways in which our current situation is jointly created by bottom-up processes of self-exposure as well as by top-down processes of supervision and control.

Thus, when Harcourt writes in the introduction that “[t]here is no conspiracy here, nothing untoward,” what might seem like a throwaway line is instead an important descriptive and normative position he is staking out about the nature of the surveillance problem.  Exposed calls on critics of digital surveillance to adopt a broader analytic lens and a more nuanced understanding of causation, power, and responsibility.  Harcourt in this way opens up fruitful lines of inquiry while also, I think, opening himself up to the charge of victim-blaming insofar as he minimizes the social and technological forces that limit people’s capacity to change their digital circumstances.

The place of desire in “the expository society,” Harcourt shows, requires rethinking of our metaphors for surveillance, discipline, and loss of privacy.  Exposed unfolds as a series of investigations into the images and tropes we conventionally rely on to crystallize the nature of the threat we face: Big Brother, the Panopticon, the Surveillance State, and so forth.  In each case, Harcourt provides an erudite and sympathetic treatment of the ways in which these metaphors speak to our predicament.  Yet in each case, he finds them ultimately wanting.  For instance, after the Snowden disclosures began to garner headlines, many turned to George Orwell’s novel 1984 to help make sense of the NSA’s activities.  Book sales skyrocketed.  Harcourt, however, finds the Big Brother metaphor to be misleading in critical respects.  As he reminds us, Big Brother sought to wear down the citizens of Oceania, neutralize their passions, fill them with hate.  “Today, by contrast, everything functions by means of ‘likes,’ ‘shares,’ ‘favorites,’ ‘friending,’ and ‘following.’  The drab blue uniform and grim gray walls in 1984 have been replaced by the iPhone 5C in all its radiant colors . . . .”  We are in a new condition, a new paradigm, and we need a new language to negotiate it.

Harcourt then considers a metaphor of his own devising: the “mirrored glass pavilion.”  This metaphor is meant to evoke a sleek, disorienting, commercialized space in which we render ourselves exposed to the gaze of others and also, at the same time, to ourselves.  But Harcourt isn’t quite content to rest with this metaphor either.  He introduces the mirrored glass pavilion, examines it, makes a case for it, and keeps moving—trying out metaphors like “steel mesh” and “data doubles” and (my favorite) “a large oligopolistic octopus that is enveloping the world,” all within the context of the master metaphor of an expository society.  Metaphors, it seems, are indispensable if imperfect tools for unraveling the paradoxes of digital life.

The result is a restless, searching quality to the analysis.  Exposed is constantly introducing new anecdotes, examples, paradigms, and perspectives, in the manner of a guided tour.  Harcourt is clearly deeply unsettled by the digital age we have entered.  Part of the appeal of the book is that he is willing to leave many of his assessments unsettled too, to synthesize a vast range of developments without simplifying or prophesying.

Another aspect of Exposed that enhances its effect is the prose style.  Now, I wouldn’t say that Harcourt’s Foucault-fueled writing has ever suffered from a lack of flair.  But in this work, Harcourt has gone further and become a formal innovator.  He has developed a prose style that uncannily mimics the experience of the expository society, the phenomenology of the digital subject.

Throughout the book, when describing the allure of new technologies that would rob us of our privacy and personhood, the prose shifts into a different register.  The reader is suddenly greeted with quick staccato passages, with acronyms and brand names thrown together in a dizzying succession of catalogs and subordinate clauses.  In these passages, Harcourt models for us the implicit bargain offered by the mirrored glass pavilion—inviting us to suspend critical judgment, to lose ourselves, as we get wrapped up in the sheer visceral excitement, the mad frenzy, of digital consumer culture.

Right from the book’s first sentence, we confront this mimetic style:

Every keystroke, each mouse click, every touch of the screen, card swipe, Google search, Amazon purchase, Instagram, ‘like,’ tweet, scan—in short, everything we do in our new digital age can be recorded, stored, and monitored.  Every routine act on our iPads and tablets, on our laptops, notebooks, and Kindles, office PCs and smart-phones, every transaction with our debit card, gym pass, E-ZPass, bus pass, and loyalty cards can be archived, data-mined, and traced back to us.

Other sentences deploy a similar rhetorical strategy in a more positive key, describing how we now “‘like,’ we ‘share,’ we ‘favorite.’ We ‘follow.’ We ‘connect.’ We get ‘LinkedIn’”—how “[e]verything today is organized around friending, clicking, retweeting, and reposting.”

There is a visceral pleasure to be had from abandoning oneself to the hyper-stimulation, the sensory overload, of passages like these.  Which is precisely the point.  For that pleasure underwrites our own ubiquitous surveillance and the mortification of self.  That pleasure is our undoing.  More than anything else, in Harcourt’s telling, it is the constant gratifications afforded by technologies of surveillance that have “enslaved us, exposed us, and ensnared us in this digital shell as hard as steel.”

*  *  *

I hope these brief comments have conveyed some of what I found so stimulating in this remarkable book.  Always imaginative and often impressionistic, Exposed is hazy on a number of significant matters.  In the hope of facilitating conversation, I will close by noting a few.

First, what are the distributional dimensions of the privacy crisis that we face?  The implicit digital subject of Exposed seems to be a highly educated, affluent type—someone who would write a blog post, wear an Apple Watch, buy books on Amazon.  There may well be millions of people like this; I don’t mean to suggest any narcissism in the book’s critical gaze.  I wonder, though, how the privacy pitfalls chronicled in Exposed relate to more old-fashioned forms of observation and exploitation that continue to afflict marginalized populations and that Harcourt has trenchantly critiqued in other work.

Second, what about all the purported benefits of digital technology, Big Data, and the like?  Some commentators, as Harcourt notes in passing, have begun to argue that panoptic surveillance, at least under the right conditions, can facilitate not only certain kinds of efficiency and security but also values such as democratic engagement and human freedom that Harcourt is keen to promote.  I share Harcourt’s skepticism about these arguments, but if they are wrong then we need to know why they are wrong, and in particular whether they are irredeemably mistaken or whether they might instead point us toward useful regulatory reforms.

And lastly, what would dissent look like in this realm?  The final, forward-looking chapter of Exposed is strikingly short, only four pages long.  Harcourt exhorts the reader to fight back through “digital resistance” and “political disobedience.”  But remember, there is no conspiracy here, nothing untoward.  Rather, there is a massively distributed and partially emergent system of surveillance.  And this system generates enormous perceived rewards, not just for those at the top but for countless individuals throughout society.  It is something of a puzzle, then, what would motivate the sort of self-abnegating resistance that Harcourt calls for—resistance that must be directed, in the first instance, toward our own compulsive habits and consumptive appetites.  How would that sort of resistance develop, in the teeth of our own desires, and how could it surmount collective action barriers?

These are just a few of the urgent questions that Exposed helps bring into focus.

*  *  *

David Pozen is an associate professor at Columbia Law School.


Surveillance and Our Addiction to Exposure

Bernard Harcourt’s Exposed: Desire and Disobedience in the Digital Age (Harvard University Press 2015) is an indictment of our contemporary age of surveillance and exposure — what Harcourt calls “the expository society.” Harcourt passionately deconstructs modern technology-infused society and explains its dark implications with an almost poetic eloquence.

Harcourt begins by critiquing the metaphor of George Orwell’s 1984 to describe the ills of our world today.  In my own previous work, I critiqued this metaphor, arguing that Kafka’s The Trial was a more apt metaphor to capture the powerlessness and vulnerability that people experience as government and businesses construct and use “digital dossiers” about their lives.  Harcourt critiques Orwell in a different manner, arguing that Orwell’s dystopian vision is inapt because it is too drab and gray:

No, we do not live in a drab Orwellian world.  We live in a beautiful, colorful, stimulating, digital world that is online, plugged in, wired, and Wi-Fi enabled.  A rich, bright, vibrant world full of passion and jouissance–and by means of which we reveal ourselves and make ourselves virtually transparent to surveillance.  In the end, Orwell’s novel is indeed prescient in many ways, but jarringly off on this one key point.  (pp. 52-53)

[Photo: Orwell’s Vision]

[Photo: Life Today]

Harcourt notes that the “technologies that end up facilitating surveillance are the very technologies we crave.”  We desire them, but “we have become, slowly but surely, enslaved to them.” (p. 52).

Harcourt’s book reminds me of Neil Postman’s Amusing Ourselves to Death, originally published in 1985, about thirty years ago.  Postman also critiqued Orwell’s metaphor and argued that Aldous Huxley’s Brave New World was a more apt metaphor to capture the problematic effects new media technologies were having on society.


Symposium on Exposed: Desire and Disobedience in the Digital Age

Frank Pasquale and I are delighted to introduce Professor Bernard Harcourt and the participants of our online symposium on his provocative new book Exposed: Desire and Disobedience in the Digital Age (Harvard University Press 2015).  Here is the description of the book from HUP’s webpage:

Social media compile data on users, retailers mine information on consumers, Internet giants create dossiers of who we know and what we do, and intelligence agencies collect all this plus billions of communications daily. Exploiting our boundless desire to access everything all the time, digital technology is breaking down whatever boundaries still exist between the state, the market, and the private realm. Exposed offers a powerful critique of our new virtual transparence, revealing just how unfree we are becoming and how little we seem to care.

Bernard Harcourt guides us through our new digital landscape, one that makes it so easy for others to monitor, profile, and shape our every desire. We are building what he calls the expository society—a platform for unprecedented levels of exhibition, watching, and influence that is reconfiguring our political relations and reshaping our notions of what it means to be an individual.

We are not scandalized by this. To the contrary: we crave exposure and knowingly surrender our privacy and anonymity in order to tap into social networks and consumer convenience—or we give in ambivalently, despite our reservations. But we have arrived at a moment of reckoning. If we do not wish to be trapped in a steel mesh of wireless digits, we have a responsibility to do whatever we can to resist. Disobedience to a regime that relies on massive data mining can take many forms, from aggressively encrypting personal information to leaking government secrets, but all will require conviction and courage.

We are thrilled to be joined by an amazing group of scholars to discuss this groundbreaking work, including Concurring Opinions co-founder Daniel Solove, Frank Pasquale (the co-organizer of this symposium), Lisa Austin, Ann Bartow, Mary Anne Franks, David Pozen, Olivier Sylvain, and, of course, Bernard Harcourt.  They will be posting throughout the week, so check in daily, and, as always, we encourage you to join the discussion.

The Emerging Law of Algorithms, Robots, and Predictive Analytics

In 1897, Holmes famously pronounced, “For the rational study of the law the blackletter man may be the man of the present, but the man of the future is the man of statistics and the master of economics.” He could scarcely have envisioned at the time the rise of cost-benefit analysis, and the comparative devaluation of legal process and non-economic values, in the administrative state. Nor could he have foreseen the surveillance-driven tools of today’s predictive policing and homeland security apparatus. Nevertheless, I think Holmes’s empiricism and pragmatism still animate dominant legal responses to new technologies. Three conferences this spring show the importance of “statistics and economics” in future tools of social order, and the fundamental public values that must constrain those tools.

Tyranny of the Algorithm? Predictive Analytics and Human Rights

As the conference call states:

Advances in information and communications technology and the “datafication” of broadening fields of human endeavor are generating unparalleled quantities and kinds of data about individual and group behavior, much of which is now being deployed to assess risk by governments worldwide. For example, law enforcement personnel are expected to prevent terrorism through data-informed policing aimed at curbing extremism before it expresses itself as violence. And police are deployed to predicted “hot spots” based on data related to past crime. Judges are turning to data-driven metrics to help them assess the risk that an individual will act violently and should be detained before trial. 


Where some analysts celebrate these developments as advancing “evidence-based” policing and objective decision-making, others decry the discriminatory impact of reliance on data sets tainted by disproportionate policing in communities of color. Still others insist on a bright line between policing for community safety in countries with democratic traditions and credible institutions, and policing for social control in authoritarian settings. The 2016 annual conference will . . . consider the human rights implications of the varied uses of predictive analytics by state actors. As a core part of this endeavor, the conference will examine—and seek to advance—the capacity of human rights practitioners to access, evaluate, and challenge risk assessments made through predictive analytics by governments worldwide. 

This focus on the violence targeted and legitimated by algorithmic tools is a welcome chance to discuss the future of law enforcement. As Dan McQuillan has argued, these “crime-fighting” tools both extend extant technologies of ranking, sorting, and evaluating, and raise fundamental challenges to the rule of law:

According to Agamben, the signature of a state of exception is ‘force-of’; actions that have the force of law even when not of the law. Software is being used to predict which people on parole or probation are most likely to commit murder or other crimes. The algorithm developed by university researchers uses a dataset of 60,000 crimes and some dozens of variables about the individuals to help determine how much supervision the parolees should have. While having discriminatory potential, this algorithm is being invoked within a legal context.

[T]he steep rise in the rate of drone attacks during the Obama administration has been ascribed to the algorithmic identification of ‘risky subjects’ via the disposition matrix. According to interviews with US national security officials the disposition matrix contains the names of terrorism suspects arrayed against other factors derived from data in ‘a single, continually evolving database in which biographies, locations, known associates and affiliated organizations are all catalogued.’ Seen through the lens of states of exception, we cannot assume that the impact of algorithmic force-of will be constrained because we do not live in a dictatorship. . . . What we need to be alert for, according to Agamben, is not a confusion of legislative and executive powers but separation of law and force of law. . . . [P]redictive algorithms increasingly manifest as a force-of which cannot be restrained by invoking privacy or data protection.

The ultimate logic of the algorithmic state of exception may be a homeland of “smart cities,” and force projection against an external world divided into “kill boxes.” 


We Robot 2016: Conference on Legal and Policy Issues Relating to Robotics

As the “kill box” example above suggests, software is not just an important tool for humans planning interventions. It is also animating features of our environment, ranging from drones to vending machines. Ryan Calo has argued that the increasing role of robotics in our lives merits “systematic changes to law, institutions, and the legal academy,” and has proposed a Federal Robotics Commission. (I hope it gets further than proposals for a Federal Search Commission have so far!)


Calo, Michael Froomkin, and other luminaries of robotics law will be at We Robot 2016 this April at the University of Miami. Panels like “Will #BlackLivesMatter to RoboCop?” and “How to Engage the Public on the Ethics and Governance of Lethal Autonomous Weapons” raise fascinating, difficult issues for the future management of violence, power, and force.


Unlocking the Black Box: The Promise and Limits of Algorithmic Accountability in the Professions


Finally, I want to highlight a conference I am co-organizing with Valerie Belair-Gagnon and Caitlin Petre at the Yale ISP. As Jack Balkin observed in his response to Calo’s “Robotics and the Lessons of Cyberlaw,” technology concerns not only “the relationship of persons to things but rather the social relationships between people that are mediated by things.” Social relationships are also mediated by professionals: doctors and nurses in the medical field, journalists in the media, attorneys in disputes and transactions.


For many techno-utopians, the professions are quaint, an organizational form to be flattened by the rapid advance of software. But if there is anything the examples above (and my book) illustrate, it is the repeated, even disastrous failures of many computational systems to respect basic norms of due process, anti-discrimination, transparency, and accountability. These systems need professional guidance as much as professionals need these systems. We will explore how professionals–both within and outside the technology sector–can contribute to a community of inquiry devoted to accountability as a principle of research, investigation, and action. 


Some may claim that software-driven business and government practices are too complex to regulate. Others will question the value of the professions in responding to this technological change. I hope that the three conferences discussed above will help assuage those concerns, continuing the dialogue started at NYU in 2013 about “accountable algorithms,” and building new communities of inquiry. 


And one final reflection on Holmes: the repetition of “man” in his quote above should not go unremarked. Nicole Dewandre has observed the following regarding modern concerns about life online: 

To some extent, the fears of men in a hyperconnected era reflect all-too-familiar experiences of women. Being objects of surveillance and control, exhausting laboring without rewards and being lost through the holes of the meritocracy net, being constrained in a specular posture of other’s deeds: all these stances have been the fate of women’s lives for centuries, if not millennia. What men fear from the State or from “Big (br)Other”, they have experienced with men. So, welcome to the world of women….

Dewandre’s voice complements that of US scholars (like Danielle Citron and Mary Anne Franks) on the systematic disadvantages to women posed by opaque or distant technological infrastructure. I think one of the many valuable goals of the conferences above will be to promote truly inclusive technologies, permeable to input from all of society, not just top investors and managers.

X-Posted: Balkinization.