Opting Out Isn’t Socially Neutral Anymore

Various news outlets are reporting that Google “fans” in Germany have been egging the roughly 3% of houses whose inhabitants have chosen to opt out of Google’s Street View mapping feature.

Why?

Apparently the vandals left notes saying “Google is cool.”  So maybe this is just a “how dare you question anything Google?” protest.

But more likely it is something more. Jeff Jarvis on Buzz Machine recently labeled opting out of Street View the equivalent of “digital desecration,” saying that such “embarrassing” “assaults” on the public “saddened and angered” him. It seems plausible that these vandals agree; “their” information has been taken from them by others asserting privacy interests, and they’re penalizing those who opt out for denying them what they view as theirs.

As Kashmir Hill put it on Forbes, “[i]t’s ironic that those who wanted more privacy through blurring their homes wound up getting less of it.” Ironic, maybe. Surprising, not really. I’ve been making the argument here all month that opting out is not a privacy solution in many circumstances, because the act of opting out is itself visible to (and itself conveys information to) others. This is based on my forthcoming paper Unraveling Privacy; I’ve given examples from Mexico’s experiment with biometric retina recognition and from the quantified self / sensor movement. Egging is just a cruder version; in this example, it’s not that those who opted out are the “worst” members of a given pool (as in true unraveling scenarios) and are therefore discriminated against, but simply that others are pushing back against the right to opt out itself because it impedes unfettered access to all the information they want.

What’s next? If you won’t stream real-time data about your health (do you have the flu? other communicable diseases?) into your vicinity to warn others to walk on the other side of the street, will people heckle you? If you won’t display your criminal record prominently in digital form so that others can “see” (using their digital devices) whether you’re a sex offender or felon of some sort, will they assume you’re a criminal (unraveling) or harass you for your “privacy” (like the German eggers)? As I argue in Unraveling Privacy, the politics of privacy are getting more complicated; as some people increasingly share information about themselves, they will make attributions about those who do not — and potentially retaliate against them as well.

9 Responses

  1. Jeff Jarvis says:

    No, but imagine another hypothetical now: What if the accumulated data about activities a day before the outbreak of a condition (such as mine, atrial fibrillation) could lead to a better sense of causes and treatments? In that case, if someone with the condition doesn’t share his or her activities, is that more or less generous than the person who does? I’m not saying anyone should be forced to share, not at all. But I am saying that the social benefit of sharing is part of the equation.

  2. Scott Peppet says:

    I agree completely, Jeff. There’s no question that new technologies are making possible information-collection that was previously impossible or overly costly, and some of this is socially worthwhile. Real-time traffic mapping is a good example — if all cars broadcast their rate of speed to a network, the network can figure out where traffic exists and people can re-route. Your example is another good one — if lots of individuals capture and share information about their environment or persons (e.g., I’m having an allergic reaction / hay fever in GPS location XYZ today), that can save lives and create all sorts of conveniences.

    The egging incident is mostly humorous, but it does show that there are informal, social consequences to keeping information to oneself. People can penalize you in various ways. I don’t think it’s at all far-fetched to think that there will come a time — probably soon — when failure to broadcast health- or crime-related information is seen as anti-social. Whereas today the norm may be that it is the individual’s “right” to keep such information quiet, if not secret, once (a) sharing the information is low cost and technically easy and (b) a critical mass of people begin to do so, then (c) it will be difficult to be the holdout who doesn’t.

  3. Jeff Jarvis says:

    Well, I also think there’s a PR question. Some Germans I know were embarrassed culturally by their pixelating neighbors. I was embarrassed by the mania over the TSA this weekend. So overreaction breeds overreaction until some stasis is, we hope, found. That’s a very different calculation from the question of the generosity and sociability of sharing. Your traffic example is excellent; less loaded than health (though less valuable than health as well).

  4. A.J. Sutter says:

    I’m not sure that the conceptualization of “social benefit” in this thread so far is the only one possible. The approach taken seems to be very utilitarian, in that (i) privacy is seen as something benefiting an individual only, and (ii) the good of the many outweighs the good of an individual.

    Why not deem there to be a social benefit to a society that respects privacy of its citizens? The mutual respect accorded by citizens for each other’s privacy can be seen as a collective benefit, not an individual one. This non-utilitarian concept of reciprocity actually has a longer history than utilitarianism.

  5. Margaret Bartley says:

    People who tout the benefits of publicly-available data are assuming the entities that control and review that data are acting in our (the public’s) best interest.

    I would respectfully submit that this is a leap of faith for the very naive or uninformed.

    We have seen time and again that information is NOT evenly shared with the public. In the example above, if we had real-time monitoring of public travel, I expect it would not be long before technology was developed to broadcast skewed results so that a secret rapid route was kept open for the cognoscenti.

    We’ve seen with HIPAA (the US medical privacy act) that it does not keep insurance companies, drug companies, police agencies, hospitals and their lawyers, or intelligence agencies from acquiring whatever data they want, whether on a specific person or on a class of people. But it does keep investigative reporters and citizen activists from being able to detect medical problems, toxic waste dump problems, adverse drug reactions, and so on, because these citizens and investigators are not insiders.

    Information is power, and as Carroll Quigley pointed out, the amount of tyranny in a government is proportional to the discrepancy in power between the government (and corporations) and the people.

    The founders of the US did not trust Washington, Jefferson, and Franklin. How much less should we trust Bush, Obama, or Pelosi?

  6. James Smith says:

    A.J. Sutter and Margaret Bartley make great points. It’s not selfish to want to draw the line between your personal life and the aspects of your life that corporations are allowed to profit from. Really, it’s the people who harass, intimidate, and insult these people that are the problem. It’s selfish to DEMAND that others conform to your beliefs. You can’t justify intimidation as a means of eliminating others’ freedom of choice by claiming you are acting in the “public” (personal) interest.

    I bet that if photographs were taken on an opt-in basis, none of these pro-Google people would opt in just to do good for society, though they might for something else in return. And if it were opt-in, you definitely wouldn’t see people terrorized over their decision about whether to be part of a service, even if things wound up reversed and those opting in turned out to be the minority.

    Maybe under some hypothetical situation a perfectly efficient and perfectly benevolent government could use photographs of people’s yards behind their fences, or passwords lifted from your grandmother’s unencrypted wireless router, to promote peace and end poverty, but that doesn’t mean we should let eager, novelty-obsessed individuals attack others so we can do away with walls and passwords. Even if one person among all the corrupt decided to do something good, the others would be motivated to stop anyone who stood in the way of their personal gain at the expense of others.

  7. atanok says:

    Egging houses?

    How silly. They should have gotten high-quality photos from every practical angle and uploaded them to Panoramio with geographic information.

  8. prometheefeu says:

    I must say that the blurred houses were a poor example. There is functionally no expectation of privacy in the outward appearance of your house; it is already public information, which Google was simply aggregating. Your health is quite different, as nobody can know it until it is revealed by you or your doctor.

  9. Zachary Marco says:

    How true. As a law school graduate looking for a job, I was debating erasing my LinkedIn profile today because I feel it does not represent me as I want it to. Yet I wonder if not having one at all might cause potential employers more trepidation than an inaccurate one.