DRM for Privacy: Part 1

Online privacy has been getting quite a bit of attention of late.  But the problem seems as intractable as ever.  In a pair of posts, I will explore one aspect of the online privacy debate and, drawing from a controversial corner of copyright law, suggest a modest fix.  This first post discusses the problem of consumer tracking and the lack of any good solutions.  You may want to skip this post if you are familiar with the online privacy ecosystem (and uninterested in correcting my oversimplifications and mistakes).  The next post discusses how an often-criticized provision of the Digital Millennium Copyright Act—the anti-circumvention clause of Section 1201—might hold lessons for consumer privacy.  This provision prohibits tampering with so-called digital rights management.  The law has its problems as a mechanism to enforce copyright.  As applied to consumers’ efforts to protect their privacy, however, a few of Section 1201’s bugs metamorphose into features.

Online advertising has at least two advantages over offline.1  The first is that it is easier for advertisers to measure their ad campaigns.  They do so by monitoring how often consumers click on the ads they see, as well as the percentage of consumers who, having clicked on an ad, ultimately buy a product or sign up for a service (together, their “click stream”).  The second advantage of online ads is that they can be tailored to the individual.  There are a number of processes by which an online advertiser might target an ad to a web user.  For instance, it could target the ad based on context, that is, on the search the consumer has just run or the content of the page the consumer happens to be viewing.

One of the more lucrative forms of targeting is known as behavioral advertising or online preference marketing (OPM).  An advertising network engaged in OPM uses one of several techniques—placing one or more tracking files called cookies on the consumer’s computer, for instance—to follow people as they travel from website to website.  Based on where the consumer goes, the advertising network creates a profile—or set of guesses—about the individual and uses it to target ads.  Thus, for instance, if you visit Diapers.com and Etsy.com, the network may guess that you are a liberal parent and serve you ads for Whole Foods on an entirely different website.2
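The profile-building step described above can be sketched in a few lines.  This is a purely illustrative toy, assuming made-up site names, interest segments, and inference rules; no real network’s logic is this simple, but the shape is the same: a single cookie ID keys a profile that grows as the consumer moves across sites.

```python
# Toy sketch of behavioral profiling. Sites, segments, and rules are
# hypothetical; the cookie ID is what lets the network link visits
# made on entirely different websites back to one profile.

SEGMENT_RULES = {
    "diapers.example": {"parent"},
    "crafts.example":  {"crafts-shopper"},
    "finance.example": {"investor"},
}

PROFILES: dict[str, set] = {}  # profiles keyed by the network's cookie ID


def record_visit(cookie_id: str, site: str) -> set:
    """Update and return the profile associated with this cookie ID."""
    profile = PROFILES.setdefault(cookie_id, set())
    profile |= SEGMENT_RULES.get(site, set())
    return profile


record_visit("cookie-123", "diapers.example")
segments = record_visit("cookie-123", "crafts.example")
# The network might now serve this user a parenting-related ad on a
# third, unrelated site that carries its ads.
```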

For years, few consumers understood that this sort of targeting was taking place.  There was also very little regulatory pressure on online advertisers at either the state or federal level.  This began to change around 2000, and again six years later, when waves of scrutiny over the practice came from some officials and journalists.

For a variety of reasons, however, including a fear of disrupting the vibrant and lucrative Internet ecosystem, the movement did not get much traction.  Various fixes were proposed—among them, the Do Not Track idea you see in the headlines today—but they collapsed under the weight of confusion, lack of expertise among regulators, and counterargument.  The online advertising industry came away with a self-regulatory regime consisting of essentially two affirmative obligations: the commitment to describe their practices in a privacy policy that consumers would encounter on the websites they visit and an opportunity for consumers to opt out of tracking.  Some consumers did so; others used technologies—settings or features offered by their browser, browser plug-ins, or other software—to try to protect their privacy.  Many did neither.

Enter the Wall Street Journal’s What They Know series.  This and other sustained coverage by news outlets, as well as the work of various researchers and non-profits—not to mention the increased relevance of the Internet in the lives of Americans—finally made privacy a dinner table conversation.  Unlike five and ten years ago, the most recent criticisms of the privacy practices of various Internet companies have been highly influential.  The result has been serious attention on the Hill, which held hearings and is looking into various legislative actions; a highly-motivated Federal Trade Commission; new interest from the Department of Commerce and Federal Communications Commission; and an ongoing series of class action lawsuits.

When the floodlights were turned on and the magnifying glass came out, advertisers were essentially caught with their hands in the cookie jar.  It was ugly, in my estimation.  You saw reports showing, for instance, that websites aimed at children not only had multiple tracking elements, but that children were among the most tracked of any Internet population despite a specific kids’ privacy law.  There was also evidence that some online advertisers were combining web-surfing behavior with social networking information or even offline buying behavior.

Perhaps the worst set of revelations was that the advertising industry had evolved its tracking techniques to defeat efforts by the consumer not to be tracked.  Thus, for instance, research out of Carnegie Mellon and UC Berkeley showed that advertisers were “respawning” cookies consumers had deleted.  An affiliate of the Center where I work revealed that one advertising company was using a technique called history sniffing to figure out what websites consumers had visited without the need for cookies.3  Somewhere along the way it came out that opting out of tracking did not necessarily stop the collection of information, only the service of tailored ads, and that some companies simply removed an opt-out after a few days (leading to an FTC enforcement action).

And yet, with all of the attention and evidence, there remains no clear solution to the problem of privacy in online advertising.  Consumers and online advertisers are locked in the same lopsided arms race.  Consumers are not meaningfully informed of company practices and, accordingly, cannot be said truly to consent to tracking.  For the remainder of this post, I will briefly describe some of the proposed or existing mechanisms to help protect consumers, along with some of their chief limitations.

Privacy Policies

The Idea: California law requires that operators of commercial websites that collect personally identifiable information display a link to a privacy policy.  The policy must describe the type of information collected, how it is used, with whom it is shared, and whether steps are taken to safeguard it.4  The hope is that informed consumers are in a position to protect themselves and police the market by carefully selecting among available services.

The Problem: Next to no one reads privacy policies, and fewer still understand them.  Or they are written at such a level of generality (e.g., “we collect your information to deliver you better ads”) that they lack meaning.  I think information strategies can work as a regulatory mechanism in privacy and elsewhere and argue so at length in a forthcoming law review article.  But I agree with critics that privacy policies in their current form do very little to assuage privacy concerns, evidence consent, or curtail the behavior of online advertisers.

Privacy Seals

The Idea: Neutral third parties will monitor the practices of online advertisers and other websites and certify that they are privacy friendly, so that consumers do not have to read privacy policies or take other steps to investigate.

The Problem: How do we know the privacy seal providers are doing their job?  A 2009 study of companies certified by the leading seal provider found that the companies were, if anything, less likely to be trustworthy.  The FTC recently settled a complaint against a privacy and security seal provider alleged not to do the testing it claimed.  One is reminded of David Hume’s argument that it is impossible to verify what our senses tell us, because the verification of any one sense requires the use of another.

Do Not Track

The Idea: Browsers will permit consumers to transmit a clear signal that they do not want to be tracked.  Online advertising companies will respond by stopping certain (or all) tracking of that consumer.  A new platform developed at Stanford, FourthParty, helps the technologically savvy determine what companies are actually doing in response to the request to stop tracking.

The Problem: Compliance with DNT is not mandatory.  A company can ignore the signal.  Or it can interpret the signal narrowly, opting the consumer out of ad targeting but not of tracking itself.  A second problem—one I suspect is holding up legislative requirements around DNT—has to do with scope.  What does the consumer opt out of, exactly, when she opts out of tracking?  Does she opt out of tracking for the purpose of determining how many people have visited the website, or for the purpose of ensuring that companies are not using up their competitors’ ad spend through click fraud?  These questions need to be addressed, either through definitional work or technological alternatives.5
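Mechanically, the DNT signal is just an HTTP request header (`DNT: 1`).  A minimal sketch of what honoring it server-side could look like follows; the header name is real, but the policy decision inside the function is exactly the contested part described above, and a non-compliant company can simply never perform the check at all.

```python
# Minimal sketch of a server-side DNT check. The "DNT" header is the
# real Do Not Track request header; everything a server does (or does
# not do) with it is a policy choice, which is the problem the post
# identifies.

def should_track(headers: dict) -> bool:
    """Return True if this request may be tracked under a DNT-honoring policy.

    A non-compliant server can ignore the header entirely, and an
    "ad-targeting only" interpretation would keep collecting data while
    merely suppressing tailored ads -- neither is distinguishable from
    compliance from the consumer's side of the wire.
    """
    return headers.get("DNT") != "1"


should_track({"DNT": "1"})   # a compliant server stops tracking here
should_track({})             # no signal sent: tracking proceeds
```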

In short, many people today are aware of tracking for advertising and object to it.  They take steps to protect their privacy online.  But advertisers are able to ignore and end-run these preferences, and some do so.  The solutions on the table do not work in their present form because they are impracticable, hard to police, overbroad, or lack an adequate enforcement mechanism.  Obviously I have oversimplified aspects of this complex issue due to space constraints, but I hope to have relayed a basic description of online advertising and its discontents.  In my next post, I will explore how an unpopular provision of a copyright law might be a decent model to protect consumers.

Thanks to Chris Soghoian, Jonathan Mayer, and especially Ashkan Soltani for their help on some of the examples and details.

__________________________

[1] Offline advertisers are beginning to catch up.

[2] Techniques get much more specific. You might visit Weather.com and be informed that there is a high pollen count, for instance, only to be served an ad for Benadryl while reading the New York Times online. (These are just examples.)

[3] You know how when you have clicked on a link it changes color? The history sniffing technique shows your browser a big list of websites, including potentially several sensitive ones, and then asks the browser what color the link should be. In this way, the company can figure out if you have been to the website without necessarily setting a cookie.
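The trick described in this footnote can be simulated in a few lines.  This is a toy model, not the real browser-side attack: the genuine version renders a long list of hidden links and reads each one’s computed style from JavaScript, while here a set stands in for the browser’s history.  (Browsers have since mitigated the attack by lying to scripts about visited-link styling.)

```python
# Toy simulation of history sniffing. The real attack asks the browser
# what color it drew each link; here computed_color() stands in for
# that query, and the `history` set stands in for the browser history.
# All URLs are hypothetical.

VISITED_COLOR, UNVISITED_COLOR = "purple", "blue"


def computed_color(url: str, history: set) -> str:
    """Stand-in for reading a link's computed (visited/unvisited) color."""
    return VISITED_COLOR if url in history else UNVISITED_COLOR


def sniff(candidate_urls: list, history: set) -> list:
    """Infer which candidate sites were visited -- no cookie required."""
    return [u for u in candidate_urls
            if computed_color(u, history) == VISITED_COLOR]


browser_history = {"clinic.example", "news.example"}
sniff(["clinic.example", "bank.example"], browser_history)
# Only the visited site comes back: the page has learned a piece of
# the user's history without setting or reading any cookie.
```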

[4] No web company wants to exclude California from its user base, and so the California Online Privacy Protection Act of 2003 is, for practical purposes, the law of the land. Even were it not, the Federal Trade Commission might intercede under Section 5 of the FTC Act if a web company did not describe any of its practices in a privacy policy.  The Commission has not done so to date but has certainly brought actions for deception when a company said one thing in a policy only to do another.

[5] One variation on DNT comes from Microsoft. It is called Tracking Protection Lists and, not only does it send out a signal that the consumer does not want to be tracked, it actually blocks certain domains from tracking. Its central problem is that creating a TPL is labor intensive, and few organizations have endeavored to do so (though a few have). And when they do, you run into the same problem as with privacy seals: how do you know that the TPL is not just a list of favored companies?


2 Responses

  1. PrometheeFeu says:

    Well, there is another option: consumers, while they might prefer to be tracked less, are relatively accepting of the current level of tracking, as can be seen from their continued adoption of online services that track them, and so no action by the state is necessary.

    Full disclosure: I work for a company that does do some tracking.

  2. AreYouKiddingMe says:

    Is PrometheeFeu kidding? Consumers also “accept” the current level of spam. They “accept” the current level of fraud. Consumers “accept” a lot of things that they can’t control or, in the case of tracking, don’t know anything about. I invite PrometheeFeu to post details about tracking on his company’s website in a visible manner and see what happens.