Symposium on Configuring the Networked Self: Cohen’s Methodological Contributions

Julie Cohen’s extraordinarily illuminating book Configuring the Networked Self makes fundamental contributions to the field of law and technology. In this post, I’d like to focus on methodology and theory (a central concern of Chapters 1 to 4). In another post, I hope to turn to the question of realizing Cohen’s vision of human flourishing (a topic Chapters 9 and 10 address most directly).

Discussions of rights and utility dominate the intellectual property and privacy literatures.* Cohen argues that their appeal can be more rhetorical than substantive. As she has stated:

[T]he purported advantage of rights theories and economic theories is neither precisely that they are normative nor precisely that they are scientific, but that they do normative work in a scientific way. Their normative heft derives from a small number of formal principles and purports to concern questions that are a step or two removed from the particular question of policy to be decided. . . . These theories manifest a quasi-scientific neutrality as to copyright law that consists precisely in the high degree of abstraction with which they facilitate thinking about processes of cultural transmission.

Cohen notes “copyright scholars’ aversion to the complexities of cultural theory, which persistently violates those principles.” But she feels they should embrace it, given that it offers “account[s] of the nature and development of knowledge that [are] both far more robust and far more nuanced than anything that liberal political philosophy has to offer. . . . [particularly in understanding] how existing knowledge systems have evolved, and how they are encoded and enforced.”

A term like “knowledge system” may itself seem very abstract and formal. But Cohen’s work insists on a capacious view of network-enabled forms of knowing. Rather than naturalizing and accepting as given the limits of copyright and privacy law on the dissemination of knowledge, she can subsume them into a much broader framework of understanding where “knowing” is going. That framework includes cultural practices, norms, economics, and bureaucratic processes, as well as law.

We’ve seen that kind of ambition before, in Lawrence Lessig’s Code. But Cohen is not willing to accept its pathbreaking “modalities” approach to the shaping and control of human action. As stated in Chapter 7:

The four-part Code framework [of Lessig’s Code] cannot take us where we need to go. An account of regulation emerging from the Newtonian interaction of code, law, market, and norms [i.e., culture] is far too simple regarding both instrumentalities and effects. The architectures of control now coalescing around issues of copyright and security signal systemic realignments in the ordering of vast sectors of activity both inside and outside markets, in response to asserted needs that are both economic and societal.

What is happening beyond the Code framework? Aside from the theoretical rationales Cohen gives, two historical developments motivate a move beyond Lessig’s pre-millennial framework:

1. Cybersecurity concerns make flows of digital information ever more critical to national defense. These concerns were surfacing in the 1990s, but took on far more urgency and importance after 9/11. Intelligence agencies and the newly formed Department of Homeland Security focused operations on monitoring threats to domestic order. The government massively upgraded its surveillance capabilities in the search for terrorists, then turned them on “all threats, all hazards.” DHS collaborated with local law enforcement officials and private critical infrastructure providers. Federal agencies gathered information in collaboration with state and local law enforcement officials in what Congress has deemed the “Information Sharing Environment” (ISE).

2. Network power has generated online entities whose “citizenries” are larger than those of all but the largest countries. Just as the ISE is developing for federal authorities, global finance is perfecting its own modes of monitoring companies and countries, and web giants are in a “data arms race” to perfect digital dossiers on users.

As Jaron Lanier has argued, both Silicon Valley and Wall Street have built enormously profitable business models as private spy agencies capable of leveraging information advantages. Danielle Citron has described three compelling examples of how the resulting technologies of surveillance mediate and limit individuals’ opportunities.

Dual Use Technologies

The same technologies deployed by DHS, Silicon Valley, and Wall Street can also empower internet users. Life online runs the gamut from frivolity to high public purpose. As Ethan Zuckerman observes, these high and low aims can be mutually reinforcing. A video like Collateral Murder can be spliced into MIA’s Vicki Leekz mixtape. A Twitter community formed around cricket may turn to political activism, and vice versa. As images, music, and words get recopied, repurposed, and remixed, symbolic orders emerge undisciplined by the usual triple authority of church, state, and home.

As legal scholars, we’re conditioned to jump to the normative questions immediately, asking “is this a good thing?” It’s tempting to flee to free speech fundamentalism (“promiscuous publication and zero privacy, über alles!”) or control fetishism (“lock down and propertize!”) in order to respond decisively to fast-paced events. Cohen insists that before we take any normative stance toward the blooming, buzzing confusion of internet life, we had better understand it. Cultural theory is above all specific—to time, place, and people grappling with situated struggles.

While contemporary epigones of Kant and Mill battle it out over whether Verizon or Comcast’s property and free speech rights should trump those of their customers, Cohen’s approach demands we ask more questions. How invasive is the deep packet inspection that an ISP like Comcast wants to do? Why is it performing this surveillance? Who gets the data? What type of activity will be chilled by this intervention? Can users opt out, or is the fused public and private power here for all intents and purposes monopolistic? As she notes, “Some information policy problems cannot be solved simply by prescribing greater ‘openness’ or more ‘neutrality’”:

[R]ights of access to information and information networks do not necessarily correlate with rights to privacy; indeed, they more typically function in the opposite way. As network users become habituated to trading information for information and other services, access to goods and services takes place in an environment characterized by increasing amounts of both transparency and exposure. . . . [H]uman flourishing in the networked information society requires additional structural safeguards.

The lives of situated subjects are increasingly shaped by decisions made and implemented using networked information technologies. Those decisions present some possibilities and foreclose others. Most people have very little understanding of the ways that such decisions are made or of the options that are not presented. In many cases, this facial inaccessibility is reinforced by regimes of secrecy that limit even technically trained outsiders to “black box” testing. We would not tolerate comparable restrictions on access to the basic laws of physics, chemistry, or biology, which govern the operation of the physical environment. The algorithms and protocols that sort and categorize situated subjects, shape information flows, and authorize or deny access to network resources are the basic operational laws of the emerging networked information society; to exercise meaningful control over their surroundings, people need access to a baseline level of information about what those algorithms and protocols do.

Trying to theorize rights and utility claims in the absence of such information may be an exercise in futility. We can’t grasp the landscape without a map.

Some might question Cohen’s scientific metaphor here. Rather than accept the network as a feature of the world we should try to understand, they would assimilate it to the social processes that Friedrich Hayek mystifies** and Alan Greenspan calls “irredeemably opaque.” The boosters of “Big Data” predict an “End of Science,” where we pragmatically seek correlations rather than try to deeply understand underlying mechanisms and causes. But Cohen rightly insists that we can and should document and explain the historical evolution of our cyberinfrastructure. Neither sloppy Hayekianism nor an ontologically abstinent philosophy of science can undermine that crucial claim. As Cohen notes in Chapter 1,

One cannot make sense of developments in either surveillance or network architecture more generally without interrogating the ways that information protocols and networked devices are reshaping our spaces and practices, encoding new path-dependencies and new habits of behavior. The credo that “code is law” recognizes that Internet technologies encode an especially powerful and peculiarly invisible form of behavioral discipline, but it does not acknowledge that these technologies also form the material substrate within which complex social patterns take root. Throughout the book, I will draw on the various literatures in [Science and Technology Studies] to explore the emergence of networked information architectures and associated social and institutional practices. One of the foundational texts in STS is Langdon Winner’s meditation on whether particular artifacts can be said to have a politics that has more definite consequences for the organization of society.

By deploying both STS and cultural studies modes of analysis, Cohen avoids the abstract legalism readily subverted by those who would keep the net opaque. By “abandon[ing] simplified theoretical constructs like ‘freedom of expression’ and ‘freedom of choice,’ and instead focus[ing] on the ordinary routines and rhythms of everyday practice,” she advances the anti-formalist agenda of her pathbreaking article Lochner in Cyberspace. As she develops Nussbaum’s capabilities theory for life online, she helps us understand which of these “ordinary routines and rhythms of everyday practice” deserve law’s solicitude, and which might be safely subject to the discipline of state and corporate power. This book helps us reimagine how to configure a network that promotes real human flourishing, rather than the increasingly commercialized and surveilled net we are now subject to.

*Things were a bit more open in the IP field in the 1980s. By the 90s and 00s, “normal science” in information policy centered on rights and utility claims. But, as Cohen observes, “there is evidence of a recent turn toward explicit adoption of the capabilities approach. Leading works include Yochai Benkler’s treatment of the linkages between information policy, information markets, and human freedom; Margaret Chon’s work on intellectual property and development; and Madhavi Sunder’s exploration of the intersections between intellectual property, the Internet protocol, and identity politics.” I hope that surveillance studies and critical internet studies can similarly influence privacy law.

**I can expand on this point in comments. I rely on Geoffrey Hodgson, ‘Hayek, Evolution, and Spontaneous Order’, in P. Mirowski (ed.), Natural Images in Economic Thought: Markets Read in Tooth and Claw, Cambridge University Press, Cambridge, 1994, pp. 408-47.
