Can a Market for Privacy Succeed?

Siva Vaidhyanathan has a fascinating essay on the nature of privacy that challenges the individualistic ethos of many policy recommendations in the field. He describes our eroding control over reputation-affecting information as a “nonopticon”:

What we have at work in America today is the opposite of a Panopticon: what has been called a “Nonopticon” (for lack of a better word). The Nonopticon describes a state of being watched without knowing it, or at least the extent of it. . . . We don’t know all the ways we are being recorded or profiled. We are not supposed to understand that we are the product of marketers as much as we are the market. And we are not supposed to consider the extent to which the state tracks our behavior and considers us all suspects in crimes yet to be imagined, let alone committed.

Vaidhyanathan suggests that we can only attempt to modify the “nonopticon” collectively, not individually. This goes against the grain of much progressive privacy policy. Scared of slow-footed or Procrustean government regulation, many advocates have hoped that a combination of market pressures, tailored contracts, and technology could let individuals set their own “privacy preferences.” For example, a rival search engine might “compete on privacy” with Google by giving searchers a “right to delete” their digital dossiers. Many have expected that rational consumers will bargain in the marketplace for the privacy policies that best suit them.

I have long thought that such an approach all but guaranteed a ratcheting down of privacy protections. What better way to call attention to oneself than to be the one person bargaining for more privacy? Randy Picker’s blog has also brought to my attention the concept of “unraveling,” described as follows:

If I don’t smoke and if I went to buy life insurance tomorrow, I would want to disclose that fact to the insurance company. Insurance is priced based on a pool of risks, and as a nonsmoker, I want to be placed in a different pool than the smokers are in. But when I reveal that I am not a smoker, I set in motion a chain of inferences which should, on average, have the consequence of revealing that smokers are smokers, even if they never say anything. This is a standard result in information economics—we call it unraveling—and creates what we might think of as a privacy externality: when I reveal information about me, it has the consequence of revealing something about you.

Of course, as Picker notes, given the negative externalities of smoking, it may make perfect sense to encourage this sort of “unraveling.” But what happens when the unraveling (or, literarily, Unbinding) becomes more general? As Vaidhyanathan notes, self-help measures here may do little more than set off an arms race:
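The unraveling dynamic Picker describes can be made concrete with a toy simulation (the risk scores and the pricing rule below are invented for illustration, not drawn from Picker's post). Each agent discloses whenever their true risk is better than the average risk an insurer would impute to the remaining silent pool; the pool then worsens, prompting the next-best type to disclose, and so on until silence itself identifies the worst type:

```python
# Toy sketch of information-economics "unraveling" (illustrative
# assumptions: integer risk scores, pool priced at the mean of
# non-disclosers, disclosure is free and verifiable).
def unravel(risk_scores):
    """Return the indices of agents who end up disclosing."""
    disclosed = set()
    while True:
        hidden = [r for i, r in enumerate(risk_scores) if i not in disclosed]
        if not hidden:
            break
        pool_price = sum(hidden) / len(hidden)  # risk imputed to silence
        movers = {i for i, r in enumerate(risk_scores)
                  if i not in disclosed and r < pool_price}
        if not movers:  # no one left who gains by disclosing
            break
        disclosed |= movers
    return disclosed

# Five risk types, 1 (best) to 5 (worst): everyone but the worst
# type discloses, so the lone non-discloser is revealed by silence.
revealed = unravel([1, 2, 3, 4, 5])
```

Running this, disclosure cascades until only the highest-risk agent remains silent, which is exactly the “privacy externality”: each voluntary disclosure strips cover from those who stay quiet.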

Every incentive in a market economy pushes companies to collect more and better data on us. Every incentive in a state bureaucracy encourages extensive surveillance. Only widespread political action can put a stop to it. Small changes, like better privacy policies by companies like Google, are not going to make much difference in the long run, [Privacy scholar James A.] Rule argues. The challenge is too large and the risks too great. . . . “Self help” merely ratchets up the arms race of surveillance. Rule demands that we actively change the policies and actions of the state for the greater good.

I want to focus a bit more on the “self-help” point, and the types of sacrifices made to assure individual privacy preferences. I have been troubled by Google’s privacy practices for some time, but I am a heavy user of Gmail and cloud computing generally. It’s simply a matter of efficiency–if I were to use Lotus Notes (a default business email system), I’d be spending at least 10 more minutes a day just managing email. The Gmail interface is just that much better–especially in terms of searchability of messages.

Ten minutes may not sound like much time, but by the end of the week it’s over an hour of frustrating waiting avoided. So what happens when we find the efficiency and cost-savings of privacy-eroding innovations too great to ignore? Given the scarcity of time that nearly every professional faces nowadays, we are effectively required to adopt the privacy-eroding innovation.

Now perhaps the Gmail example will not convince those technically savvy enough to get all their communications dumped into an equally efficient email system. Still, consider the type of suspicions that might result if you were applying to a new job and said “By the way, in addition to requiring 2 weeks of vacation a year, I need to keep my email confidential.” The bargaining model is utterly inapt there. . . . just as it would have been for women to “bargain” for nondiscrimination policies, or mineworkers to bargain, one by one, for safety equipment.

As I wrote in another post, privacy might better be considered an “irreducibly social good” than some quantum of enjoyment individuals trade off for money. As Sunstein and Frank suggested in their work on CBA and relative position, given the importance of positional goods in today’s society, people who trade off safety/privacy/etc. will likely “outcompete” peers who won’t do so. They will have more money, and can thereby purchase better homes, send their children to better schools, afford better health insurance, and generally enjoy a higher standard of living than those who take the monetary and time-wasting sacrifices entailed by demanding greater privacy.

Though Sunstein and Frank were inspired by health and safety regulations, their work’s upshot applies equally well to privacy:

When a regulation requires all [individuals to purchase] additional [privacy], each . . . gives up the same amount of other goods, so no [one] experiences a decline in relative living standards. The upshot is that an individual will value an across-the-board increase in [privacy] much more highly than an increase in [privacy] that he alone purchases.
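The across-the-board logic in that passage can be sketched with a toy model (the numbers, and the utility function combining a positional-income term with a privacy benefit, are my own invented assumptions, not Sunstein and Frank's): when one person alone pays for privacy, she falls behind her peers; when everyone pays, relative standing is preserved and the privacy benefit comes through undiminished:

```python
# Toy illustration of the relative-position argument (all values
# are assumptions for this sketch). Utility = positional income
# (own income minus peers' average) + value of privacy held.
PRIVACY_VALUE = 3   # assumed benefit of keeping one's privacy
PRIVACY_COST = 5    # assumed income sacrificed to purchase it

def utility(own_income, incomes, has_privacy):
    avg_others = (sum(incomes) - own_income) / (len(incomes) - 1)
    positional = own_income - avg_others
    return positional + (PRIVACY_VALUE if has_privacy else 0)

incomes = [100] * 4  # four identical professionals

# One person buys privacy alone: she falls behind her peers.
solo = incomes[:]
solo[0] -= PRIVACY_COST
u_solo = utility(solo[0], solo, True)

# An across-the-board rule: everyone pays, relative standing intact.
mandate = [i - PRIVACY_COST for i in incomes]
u_mandate = utility(mandate[0], mandate, True)
```

In this sketch the lone purchaser's positional loss swamps the privacy benefit, while under the mandate no one's relative living standard declines, so the same increment of privacy is worth strictly more when secured collectively.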

A collective commitment to privacy may be far more valuable than a private, transactional approach that all but guarantees a “race to the bottom.” The big question, of course, is whether such rules would effectively “cripple” innovative companies like Google and prevent the development of interfaces like Gmail. Here, I can only suggest that we watch what Europe does. For example, its privacy regulators led the way in requiring certain concessions of Google, and those do not appear to have crippled the company. My main point is just that to the extent we want privacy, it may be something that can only be achieved collectively–by its very nature it may well be a good that is impossible for the market to provide, because the very act of bargaining for it in some ways lessens its value (by suggesting, however errantly, that the bargainer has “something to hide”).

Photo Credit: Glutnix.
