Innovative Architectures of Privacy
As Daniel J. Weitzner recently told the New York Times, our current notice-and-choice model of privacy may soon be dead, and good riddance. Since the 1990s, we have relied upon websites’ privacy policies to inform individuals about whether their information will be collected, used, and shared. Consumers usually don’t read these policies and, if they did, they likely would not understand them. That leaves us with much room to do better.
In “Redrawing the Route to Online Privacy,” the New York Times discusses how law and technology might help get us out of this mess. The article highlights several intriguing technical innovations.

A group at Carnegie Mellon University has designed software that nudges consumers about the privacy implications of sharing certain information. As CMU’s Lorrie Faith Cranor explains, social network site users often share their birth dates, hoping to receive online greetings from friends, yet doing so runs the risk of marketing profiling, identification, and identity theft. Software could inform consumers of these risks before they share their birth dates.

M. Ryan Calo, a fellow at Stanford Law School’s Center for Internet and Society who has done exciting work on the privacy implications of robots, is exploring voice and animation technology that emulates humans to provide “visceral notice.” Before someone puts information in a personal health record like Google Health, a virtual nurse could explain the privacy implications of sharing it. Calo explains that people naturally react more strongly, in a more visceral way, to anthropomorphic cues.

The think tank Future of Privacy Forum, led by Jules Polonetsky and Chris Wolf, is testing the effectiveness of new icons and key phrases to give web surfers more transparency and choice about behavioral advertising practices.

Princeton’s Ed Felten (whose important computer science research has rightly preoccupied government and industry) is working on re-engineering the web browser for greater privacy. Felten would alter the browser’s design so that information about on-screen viewing sessions is kept separate and not routinely passed along, preventing a person’s browsing behavior from being tracked.
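The nudging idea lends itself to a simple illustration: before a profile field is shared, the software consults a table of known risks and surfaces a warning. The sketch below is only meant to show the shape of such a check; the field names and warning text are hypothetical, not taken from the actual CMU software.

```python
from typing import Optional

# Hypothetical risk notes keyed by profile field. A real system would
# draw these from research on how each data type can be misused.
FIELD_RISKS = {
    "birth_date": (
        "Sharing your birth date may expose you to marketing profiling, "
        "identification, and identity theft."
    ),
    "home_address": (
        "Sharing your home address may expose you to physical and "
        "financial risks."
    ),
}


def nudge(field: str) -> Optional[str]:
    """Return a warning to display before the user shares `field`,
    or None if no risk note is registered for that field."""
    return FIELD_RISKS.get(field)
```

A social network could call `nudge("birth_date")` when the user fills in that field and show the returned warning before the information is posted.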
As these efforts make clear, code is crucial to the protection of consumer privacy. But to what extent, if at all, should we invoke law to regulate websites’ information practices? Congress and the Federal Trade Commission are mulling rules that would limit a site’s use of information collected online. As the New York Times notes, government might ban the use of recorded trails of a person’s web browsing in employment or health insurance decisions. Limits on data collection and retention practices would be worth considering too. Law could require the deletion of certain information after a set period, in the manner suggested by Viktor Mayer-Schönberger’s work. All worth pondering.
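A deletion requirement of the kind Mayer-Schönberger suggests could be enforced with a simple retention check: tag each record with the time it was collected, and purge anything older than the legal limit. The sketch below assumes a hypothetical one-year retention period; it is an illustration of the mechanism, not any proposed rule.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention limit; an actual rule would set this by law.
RETENTION = timedelta(days=365)


def expired(collected_at: datetime, now: datetime,
            retention: timedelta = RETENTION) -> bool:
    """True if a record collected at `collected_at` is past its
    retention window and must be deleted."""
    return now - collected_at >= retention


def purge(records, now):
    """Keep only records still within the retention window.
    Each record is a (data, collected_at) pair."""
    return [(data, t) for data, t in records if not expired(t, now)]
```

Run periodically, such a purge would implement "forgetting" as a routine property of the system rather than an after-the-fact cleanup.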