Surveillance, Capture, and the Endless Replay

Global opposition to surveillance may be coalescing around the NSA revelations. But the domestic fusion centers ought to be as big a story here in the US, because they exemplify politicized law enforcement. Consider, for instance, this recent story on the “threat” of “Buy Nothing Day”:

Fusion Centers and their personnel even conflate their anti-terrorism mission with a need for intelligence gathering on a possible consumer boycott during the holiday season. There are multiple documents from across the country referencing concerns about negative impacts on retail sales.

The Executive Director of the Intelligence Fusion Division (who also directs the Joint Terrorism Task Force) for the D.C. Metropolitan Police Department circulated a 30-page report, created by the trade association the International Council of Shopping Centers (ICSC), tracking the Occupy Movement in towns and cities across the country.

Yes, police were briefed on the grave threat of fake shoppers bringing lots of products to the till and then pretending they’d forgotten their wallets. Perhaps the long game here is to detain members of the Church of Stop Shopping to force them to make Elves on the Shelf for $1 an hour.

More seriously: no one should be surprised by the classification of anti-consumerist activists as a threat, given what Danielle Keats Citron & I documented, and what the ACLU continues to report on. But we do need more surprising, more arresting, characterizations of this surveillance. Fortunately, social theory provides numerous models and metaphors to counter the ideology of “nothing to hide.”

One critical thinker here is William Bogard, who authored the penetrating book The Simulation of Surveillance: Hypercontrol in Telematic Societies. The simultaneous neologism and archaism of “telematic” suggests a startling premise of the book: that surveillance is meant just as much to control the future as it is to record the past. We are surrounded by systems of prediction and control. The supervision here is not simply a way of stopping particularly bad acts, but of shaping behavior toward certain ends. The better the surveillance gets, the better the “men behind the camera” can plan, behavioristically, matrices of penalties and rewards to reinforce acceptable behavior and deter terror, crime, anti-social behavior, suspicious activities, lack of productivity, laziness—because, really, it’s all part of a continuum, right?

You may think: “wait — how is the ‘Church of Stop Shopping’ a national security threat?” There’s an economic answer: namely, that any defense advantage the US has over other countries is an epiphenomenon of taxing a vast and growing national economy. Anti-consumerism undermines economic growth and, indirectly, military might. But Kate Crawford captures the larger, cultural dynamic:

If we take [the] twinned anxieties — those of the surveillers and the surveilled — and push them to their natural extension, we reach an epistemological end point: on one hand, the fear that there can never be enough data, and on the other, the fear that one is standing out in the data. These fears reinforce each other in a feedback loop, becoming stronger with each turn of the ratchet. As people seek more ways to blend in — be it through normcore dressing or hardcore encryption — more intrusive data collection techniques are developed. And yet, this is in many ways the expected conclusion of big data’s neopositivist worldview. As historians of science Lorraine Daston and Peter Galison once wrote, all epistemology begins in fear — fear that the world cannot be threaded by reason, fear that memory fades, fear that authority will not be enough.

How intrusive will the data collection get? Try technology that “promises to catch in the act anyone who tries to fake a given emotion or feeling.” It’s just a repurposing of voice-parsing algos to faces. Capitalist competition means that marketers can’t ignore this edge. Neither can the Secret Service, as it desperately seeks a “sarcasm detector” to isolate true threats.

All this surveillance can be used to very good ends. But it should be obvious that its malignant forms are endangering creativity, dissent, and complex thinking. Stray too far from the binary of Democratic & Republican politics, and you risk the watchlist. Protest shopping on Black Friday, and you risk the watchlist. Take a different route to work on a given day, and maybe that’ll flag you (“what is she trying to avoid?”). Read the wrong blogs or tweets, and a program like Squeaky Dolphin is keeping a record.

Law enforcement surveillance is not just a camera, but an engine, driving society in a certain direction. It is not a mirror of our nature, but a modulating source of selves. What defense analysts characterize as dissent risk (or banks see as “Vox Populi Risk”) can easily expand to include the very foundations of self-governance. We cannot let it continue to scrutinize dissent, deviance, or disagreement wholly disconnected from lawbreaking. The longer the fusion center apparatus sweeps ordinary citizens into its dragnet, the more we risk freezing into place a future that rigidly reenacts the past, as dividuals find that replicating the captured patterns of past behavior is the only safe way to avoid future suspicion, stigma, and detention.

Frank Pasquale

Frank is Professor of Law at the University of Maryland. His research agenda focuses on challenges posed to information law by rapidly changing technology, particularly in the health care, internet, and finance industries.

