A Social Theory of Surveillance
Bernard Harcourt’s Exposed is a deeply insightful analysis of data collection, analysis, and use by powerful commercial and governmental actors. It offers a social theory of both surveillance and self-exposure. Harcourt transcends methodological individualism by explaining how troubling social outcomes can be generated by personal choices that each seem rational at the time they are made. He also helps us understand why ever more of daily life is organized around the demands of what Shoshana Zuboff calls “surveillance capitalism”: the intimate monitoring of our daily lives to maximize our productivity as consumers and workers.
The Chief Data Scientist of a Silicon Valley firm told Zuboff, “The goal of everything we do is to change people’s actual behavior at scale. When people use our app, we can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad. We can test how actionable our cues are for them and how profitable for us.” Harcourt reflects deeply on what it means for firms and governments to “change behavior at scale,” identifying “the phenomenological steps of the structuration of the self in the age of Google and NSA data-mining.”
Harcourt also draws a striking, convincing analogy between Erving Goffman’s concept of the “total institution” and the ever-denser networks of sensors and training (in the form of both punishments and lures) that powerful institutions use to ensure that behavior stays within ranges of normality. He observes that some groups are far more likely to be exposed to pain or inconvenience from the surveillance apparatus, while others enjoy its blandishments in relative peace. But almost no one can escape its effects altogether.
In the space of a post here, I cannot explicate Harcourt’s approach in detail. But I hope to give our readers a sense of its power to illuminate our predicament by focusing on one recent, concrete dispute: Apple’s refusal to develop a tool to assist the FBI’s effort to recover the data on an encrypted iPhone. The history Harcourt’s book recounts helps us understand why the case has attracted so much attention—and how it may be raising false hopes.
Our Own Devices
As Exposed recounts, the Snowden revelations of 2013 were a critical moment in the history of surveillance. While isolated journalists and academics had been warning about widespread private sector assistance in illegal spying activities before 2013, the concerns were a fringe issue. Snowden, a private contractor, changed the public dialogue with his explosive breach of NSA security, exposing in-house documentation of both conscious cooperation from tech and communications giants and easy co-optation of their networks. The deep intertwining of state and market, already obvious among carriers, became hard to ignore among Google, Apple, Facebook, Amazon, and even tech sector also-rans.
To regain user trust, Google and Apple in particular took various steps to harden their networks and devices against snooping. As Apple asserts in a brief, the iPhone “includes a setting that—if activated—automatically deletes encrypted data after ten consecutive incorrect attempts to enter the passcode.” The crux of Apple’s conflict with the FBI is whether the Bureau can compel Apple, pursuant to the All Writs Act, to write software disabling this self-destruct function and other security features.
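The logic of the setting Apple describes is simple to state: a counter of consecutive failures, reset on success, with the data rendered unrecoverable at the tenth miss. The Python sketch below is purely illustrative—the class, names, and in-memory “key” are my own inventions, and nothing here reflects Apple’s actual implementation, which protects data with hardware-backed encryption rather than a string in memory:

```python
MAX_ATTEMPTS = 10  # the "ten consecutive incorrect attempts" from Apple's brief

class LockedDevice:
    """Hypothetical sketch of a passcode lockout policy. Not Apple's design."""

    def __init__(self, passcode, secret_key):
        self._passcode = passcode
        self._key = secret_key      # stand-in for the key protecting user data
        self._failures = 0          # consecutive failed attempts

    @property
    def wiped(self):
        return self._key is None

    def try_unlock(self, guess):
        """Return the key on success; destroy it after ten straight failures."""
        if self.wiped:
            raise RuntimeError("device wiped: key destroyed")
        if guess == self._passcode:
            self._failures = 0      # a success resets the counter
            return self._key
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._key = None        # irreversible: the data is now unrecoverable
        return None

device = LockedDevice("1234", secret_key="k3y")
for _ in range(10):                 # ten consecutive bad guesses
    device.try_unlock("0000")
print(device.wiped)                 # True: brute-forcing the passcode is now useless
```

The software the FBI sought would, in this toy model, amount to removing the `self._key = None` line, so that an unlimited number of guesses could be tried.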
From a purely legal perspective, whether the FBI can prevail here is a very close question. When it comes to policy, though, there is an emerging consensus: for the sake of privacy and cybersecurity in general, don’t force Apple to write this software. The slippery slope to generalized, judicially mandated decryption, readily co-opted by hackers or foreign governments, is just too steep. I have seen this argument from the former head of the CIA and NSA (Michael Hayden), a commissioner of the FTC (Julie Brill), numerous academics, and a united front of privacy activists. I get why it’s appealing, particularly if the primary evils you are concerned with fighting are US law enforcement agencies’ growing disregard for Fourth Amendment values and data breaches by shadowy hackers.
On the other hand, President Obama warns that “You cannot take an absolutist view on this. If your argument is strong encryption no matter what, and we can and should create black boxes, that . . . [is] fetishizing our phones above every other value.” David Golumbia questions Apple’s slippery slope salvos. Nathan Newman has argued that such strong encryption could be used in a number of troubling contexts:
A tax evasion trial of top officials at the Swiss bank UBS highlighted the ways encryption [can] stymie tax investigations, with witnesses detailing how private bankers hide sensitive client data under “Solitaire” game tabs on secret drives on encrypted laptops or code computers with emergency passwords to vaporize data on illegal activity.
Just last May, when Quebec tax authorities showed up at Uber[’s] Canadian offices, engineers in Uber’s San Francisco offices tried to remotely encrypt their data in Canada. Tax investigators were seeking to find out if the company was evading local sales tax rules and found that the devices were locking up as they were seeking to review the company’s data after a court order. . . . Unbreakable encryption is . . . a technological kludge that will abet more vicious problems of criminal, terrorist and corporate lawbreaking than any public good it might contribute.
Privacy activists are right to be concerned that the FBI has misused its authority in the past, and could easily do so in the future. Yet I am sure they would feel frustrated if a large firm managed to encrypt all its privacy-violating activities with technology like Apple’s and thereby evade sanctions. Consider, for instance, a corporate wellness program that used Apple Watches as its platform for data gathering and dissemination. There is much room for mischief here, including discriminatory treatment of sick staff, made all the easier if data transfers are rendered invisible to all but those who make them.
In this post, I do not want to adjudicate the merits of Apple’s and the FBI’s positions. But I do want to use some of the critical tools offered by Exposed to complicate the debate.
The Encryption Arms Race
Former NSA and CIA head Michael Hayden surprised many when he weighed in against the FBI. He has stated that “America is simply more secure with unbreakable end-to-end encryption.” But note that by “unbreakable,” he does not mean “unbreakable by anyone.” He wants the NSA to request additional appropriations to hire cryptographers to hack Apple phones. Others say the NSA could hack the phone with its present capabilities.
Thus Hayden’s position seems to boil down to one of institutional competence. Since he sees the NSA as the “main body” in the cyberdomain, he wants it to continue in an arms race of cryptographic capabilities with leading firms like Apple. He scoffs at the civil libertarians who fought the Clipper Chip in the 1990s, explaining that “we got around that” with bulk collection capabilities. He seems confident that the NSA can gain an edge over whatever encryption tools firms come up with. And anyone aware of the promiscuous data sharing among US agencies documented in Exposed can foresee the next step: following the recommendations of the 9/11 Commission report, the FBI will continue to “break down silos” between its data collection and that of other agencies. Indeed, we now know NSA “data will be shared with other intelligence agencies like FBI without first applying any screens for privacy.”
Simply adding some bureaucratic (and technological) impediments to the FBI’s access to such data does not seem to do justice to the concerns raised by the San Bernardino case. This “solution” reminds me of Harcourt’s description of some post-Snowden reforms proposed by the “President’s Review Group” in their report, Liberty and Security in a Changing World. The experts proposed that the government could directly reimburse carriers for retaining data, presumably on the assumption that this arrangement would be superior to bulk metadata collection by the government itself. Harcourt drily observes that this “would surely be a win-win solution for the surveillance-industrial complex” (290). But it’s of questionable value in guaranteeing just and stable entitlements to privacy.
The encryption/decryption arms race happily envisioned by Hayden also strikes me as less than useful. As Phil Rogaway warned in a recent essay, “The Moral Character of Cryptographic Work,” we should be free to ask:
[What if such] computer science is not benefiting man? Technological pessimists like Jacques Ellul, Herbert Marcuse, and Lewis Mumford certainly didn’t think that it was. They saw modern technology as an interlocking, out-of-control system that, instead of fulfilling human needs, engendered pointless wants and deadlier weapons. . . .
From a US-centric perspective, a never-ending war-gaming of various encryption tactics between experts in Cupertino and Fort Meade may generate a community of practice better able to withstand, say, an onslaught of Chinese cyberattacks (darkly warned about by tech-funded think tanks opining on the future of war). But it is hard for me to see the inevitable global responses to such developments as anything better than a new arms race, with ever higher stakes.
A world where the FBI must go to the NSA for data like that on the San Bernardino iPhone might be better than one where the agency can, via the All Writs Act, compel firms to undo their own encryption. Or a ruling adverse to the FBI might accelerate interagency collaborations toward a more robust “Information Sharing Environment.” An Apple win in #ApplevsFBI won’t magically create a wall of encryption around all our communications. Nor will it necessarily keep information from the FBI itself. Nor is it even clear, as a normative or legal matter, that this information should be kept from the FBI.
The landscape of modern surveillance is such a complex socio-technical-legal realm that every legal victory generates technical and social pushback, every technical advance provokes legal responses, and so on. The important point here is to follow Harcourt’s lead and try to see not only the vastness of the modern state surveillance apparatus, but also its ongoing, symbiotic relationship with the so-called private sector.
The Seductive Appeal of CEO Heroes and Government Villains
Why, then, has the case become such a cause célèbre? Because contemporary political confrontations increasingly coalesce around hashtags, legal battles, and climactic court decisions. The clash of parties before a judge is far more entertaining and easy to follow than the byzantine bureaucratic politics that determine information flows among our myriad intelligence, law enforcement, and homeland security agencies. Given its repeated surveillance of First Amendment-protected activity, the FBI is hard to root for. Meanwhile, citizens may cheer Apple’s legal position here as enthusiastically as they embrace its devices. Exposed describes how passionate the devotion can become:
We strap on the [Apple Watch] monitoring device voluntarily, proudly. We show it off. Surely there is no longer any need for the state to force an ankle bracelet on us anymore when we so lustfully clasp this pulsing, slick, hard object on our own wrist. So joyfully, happily, willingly, lustfully, we wear it on our own body—the citizen’s second body lusciously wrapping itself around the first. The smart watch, in effect, has replaced the ankle bracelet.
Taking a side in the #ApplevsFBI controversy can ease the cognitive dissonance that attends clear recognition of the increasingly feudal social relations generated by both tech giants and the deep state. Trust the state to keep us free of terror, or trust the firm to keep your data secure—and then go back to tweeting, ‘gram-ing, snapping, and commenting as usual.
A careful reading of Harcourt brings the dissonance back. Every solution brings new problems. Consider the lessons he draws from ankle bracelets, a “cheap on crime” solution to mass incarceration. Yes, if widely implemented as a perfected form of house arrest, they may keep tens of thousands out of jail. But there is an all-too-easy slide from using ankle bracelets to cheapen the cost of monitoring and controlling those who might have once been imprisoned, to using them (or similar technology) as all-purpose modes of identification, monitoring, and control of entire populations, regardless of adjudicated criminality. Harcourt is honest about how seamlessly technological dreams and nightmares mix: how the reduction of imprisonment may only come about via digitalization that effectively renders any part of our Deleuzian “society of control” a potentially imprisoning (or informing) impediment to action.
In #ApplevsFBI, a similarly complex dilemma is playing out. Suspicious of law enforcers, and wary of ordinary methods of holding them accountable, more Americans want their communications protected technologically. The “cryptographic commons” called for by Rogaway has failed to materialize, so they place their faith in a massive corporation. But who else is doing the same? What are the accountability mechanisms to keep Apple itself from betraying users? Are there forms of encryption that would render any such accountability mechanisms moot? As Jürgen Geuter asks, “How do we – as globally networked individuals living in digitally connected and mutually overlaying societies – define the relationship of transnational corporations and the rules and laws we created?”
There are agencies in the US government that support encryption technology, and agencies that attack it—and some do both. Large firms like Apple now see commercial advantage in fighting demands for decryption in the US—and in readily acceding to similar demands for access in China. So long as surveillance capitalism’s “profits derive primarily, if not entirely, from … markets for future behavior,” we can expect both firms and governments to play such double games: monitoring ever more of our lives while claiming to protect our privacy; bashing encryption in some contexts while hardening it in others. Eric Schmidt will chair the DOD’s Defense Innovation Advisory Board, even as he chairs the holding company of a firm “outraged” by NSA spying, and the board of a foundation claiming to offer unbiased expertise on that very controversy.
This is a murky and dispiriting landscape. But it will not come as a surprise to a careful reader of Exposed. The book’s attention to legal and historical detail reminds me of Andrew Abbott’s characterization of the classic sociological theorists: “Marx, Durkheim, Weber [et al]. . . were up to their necks in empirical data: documents, surveys, statistics, and histories.” Harcourt’s work is a powerful reminder that the problem of privacy and data in the modern age is a bigger one than that of a “surveillance state,” or “Panopticons,” or “big brother.” Exposed offers a social theory that helps us recognize patterns of exchange between “big brother” and various “little brothers”; to spot the PR-savvy tech firms whose personnel rotate in and out of the surveillance state. Neither Tim Cook nor James Comey can save us. Nor can we save ourselves, without the insight and nuance a book like Exposed provides.