Category: Privacy (Consumer Privacy)


FAN 200 (First Amendment News): Margot E. Kaminski, “The First Amendment and Data Privacy: Between Reed and a Hard Place”

Margot E. Kaminski is an associate professor of law at the University of Colorado Law School. She specializes in the law of new technologies, focusing on information governance, privacy, and freedom of expression. Her forthcoming work on transparency and accountability in the EU’s General Data Protection Regulation (GDPR) stems from her recent Fulbright-Schuman Innovation Grant in the Netherlands and Italy.

________________________________

Professor Margot Kaminski

The Supreme Court’s recent Fourth Amendment cases show a strong awareness that privacy can implicate First Amendment values. In June 2018 in Carpenter v. United States, a case addressing warrantless location tracking through cell phone records, the majority noted that a lack of privacy can reveal (and presumably chill) “familial, political, professional, religious, and sexual associations.” In Riley v. California, a 2014 Fourth Amendment case addressing cell-phone searches, the majority recognized that while “[m]ost people cannot lug around every piece of mail they have received for the past several months, every picture they have taken, or every book or article they have read,” a cell phone can store all of these things. With these comments, the Court observed that free expression often relies on privacy, and implied that absent privacy protections, people may conform in their choice of reading material, their political affiliations, and ultimately, their speech. In other words, privacy protections often also protect First Amendment rights.

But at the same time, the Court’s recent First Amendment decisions have created additional obstacles for those who seek to draft an American data privacy law.

The United States famously does not have omnibus federal privacy protection. Instead, U.S. privacy law is a patchwork of sectoral protections (like protections for video records, consumer protection at the FTC, state privacy torts, and state AG enforcement). Legislators reading Carpenter may conclude that a number of Justices in that case (including Justice Samuel Alito, who explicitly calls for privacy lawmaking in his dissent) understand the need for omnibus data privacy law. But even as the Court in Carpenter seems to point to the need for privacy legislation, its First Amendment decisions in Reed v. Town of Gilbert and NIFLA v. Becerra threaten to tie legislators’ hands.

Reed treats content-based regulation with suspicion; Becerra does the same with disclosure requirements. In Reed, which addressed a town’s rules for the placement of signs, the Court held that “regulation of speech is content based if a law applies to particular speech because of the topic discussed or the idea or message expressed.” All content-based regulation is subjected to strict scrutiny. Thus a regulatory scheme that treated “Political Signs” differently from “Temporary Directional Signs” was content-based and subject to strict scrutiny; because it failed strict scrutiny, it was unconstitutional.

Becerra, decided this year, limits legislators’ ability to require truthful disclosures. The Court held that California’s disclosure requirements for crisis pregnancy centers—centers that often pretend to provide abortion services but in practice discourage women from getting abortions—were likely unconstitutional. While claiming to be narrow and fact-bound, the majority in Becerra applied Reed’s broad understanding of content-based regulation to disclosure laws. The majority of the Court in Becerra explained that California’s disclosure law was “content-based regulation of speech” because “[b]y compelling individuals to speak a particular message, such notices ‘alte[r] the content of [their] speech.’”

Why, in a discussion of data privacy, do I focus on Reed and Becerra and not on an earlier line of cases that directly address privacy laws? Because to an extent many Americans do not realize, data privacy protections are actually about increasing speech, not decreasing it. And at least as enacted elsewhere in the world, the efficacy of data privacy regimes as good policy often depends on being able to calibrate the law differently for different actors or scenarios. The first implicates Becerra on disclosures; the second implicates Reed and content-based analysis.

The Fair Information Practices, which were originally formulated in the United States, are the basis for data privacy laws around the world and are largely built around a concept that should be complementary to the First Amendment: transparency. Take the EU’s General Data Protection Regulation (GDPR) as an example. Individuals are supposed to be notified when companies obtain their information. They have a right to access their data, and to find out to whom it has been disclosed. They have a right to find out where data has come from. Companies have to explain the purpose of data processing, and how profiling and automated decision-making work. All of these transparency rights and obligations attempt to correct, or at least expose, very real power imbalances between individuals and the companies that profit from their data. The GDPR is a disclosure law, as much as it is a right to stop other people from speaking about you.

Today’s paradoxical privacy problem, then, is that even as data privacy regimes rely in large part on increasing, not decreasing, speech by requiring disclosures to users, the Court’s recent First Amendment cases now shut down disclosure as a regulatory tool. Under Becerra’s reasoning, any disclosure requirement could potentially be characterized as content-based (or, per Justice Stephen Breyer, “[v]irtually every disclosure law requires individuals ‘to speak a particular message’”). The GDPR’s requirement that companies disclose the source of their data? Content-based compelled speech. The GDPR’s requirement that companies reveal to individuals the information held about them? A “particular message,” and thus content-based compelled speech.

The majority in Becerra attempts to cabin the impact of its opinion both (1) by pointing to the possibility of regulating speech incidental to regulated conduct (as it alleges was done by the majority in Planned Parenthood v. Casey, a case addressing compelled disclosures by doctors to patients seeking abortions), and (2) by carving out existing disclosure laws (“we do not question the legality of health and safety warnings long considered permissible, or purely factual and uncontroversial disclosures about commercial products”). The problem is that data privacy does not fit squarely within either of these potential exceptions. It regulates information flow, not conduct, or at least conduct that’s nearly inextricable from information flow (though I’ve argued elsewhere that some forms of privacy violations are actually framed in First Amendment law as conduct-like). And because the U.S. lacks omnibus data protection law, privacy doesn’t readily fall into the Court’s attempt to exempt existing consumer protection law. By virtue of its very newness, data privacy may be more heavily scrutinized than other accepted areas of consumer protection.

Justice Stephen Breyer (credit: The Nation)

As Justice Breyer notes in his dissent, “in suggesting that heightened scrutiny applies to much economic and social legislation,” Becerra jeopardizes legislators’ judgments in areas long left to legislative discretion. Reed compounds this problem. Some kinds of information, and some behaviors, create greater privacy harms than others. For example, the GDPR, like many American privacy laws, puts in place added protections for “special categories” of data—or what we would call “sensitive information.” Is this content-based discrimination? Does it apply “to particular speech because of the topic discussed?” If so, this would potentially implicate even our current sectoral approach to privacy, not to mention hundreds of state privacy laws specific to particular behaviors or information types. The GDPR also, in many places, distinguishes between categories of companies. Take, for example, the GDPR’s derogation for small and medium-sized enterprises, which are subject to less onerous record-keeping provisions, presumably because smaller companies pose less of a risk of inflicting privacy harms. A government may also want to create an exception to, or less onerous version of, privacy law for smaller companies as a matter of innovation or competition policy, to encourage the growth of startups. Under Reed—and its predecessor, Sorrell v. IMS Health—identifying particular topics or speakers, or categories of information flow, could give rise to a challenge of regulation as content-based or even viewpoint-based. On paper at least, as Justice Elena Kagan noted in her concurrence, Reed’s broad take on content-based regulation “cast[s] a constitutional pall on reasonable regulations” and puts in place judicial second-guessing of matters that legislatures are likely institutionally better situated to assess.

One potential loophole, or at least limiting principle, to explore is Justice Samuel Alito’s strangely confident conviction in his concurrence, joined by both Justice Sonia Sotomayor and Justice Anthony Kennedy, that “[r]ules imposing time restrictions on signs advertising a one-time event” would not be considered content-based. This suggests that it may be possible for legislators to continue to name things in information-related legislation, when the restriction is the kind of restriction (e.g., time, place, and manner) that the First Amendment allows. But how to line-draw between a law that imposes temporal restrictions on “signs advertising a one-time event” and a law that restricts, in other ways, “Temporary Directional Signs” is frankly beyond me.

Thus legislators who want to write—or, in the case of California, have recently written and passed—data privacy law may find themselves stuck between Reed and a hard place. To some extent, this can be understood as one example of what some have described as the Lochnerization of the First Amendment: its use for deregulatory purposes. But in the context of privacy, things are perhaps uniquely complicated. Speech values fall squarely on both sides. By regulating speech to protect privacy, you both restrict and protect speech. As the Court noted in Bartnicki v. Vopper, “the fear of public disclosure of private conversations might well have a chilling effect on private speech. . . . In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively.” And as the Court has increasingly recognized in its Fourth Amendment jurisprudence, personal information beyond communicative content—such as location data, or reading material or pictures stored on a cell phone—can implicate First Amendment concerns as well, by revealing your associations, your political affiliations, your opinions, your innermost thoughts.

In some ways, Carpenter and other cases move the United States closer to Europe on privacy. There is increasing convergence on what counts as sensitive information: the GDPR includes location data in its definition of “personal data,” and the Court in both United States v. Jones and Carpenter recognized an expectation of privacy in publicly disclosed location information. The Court in Carpenter continued a recent theme in Fourth Amendment jurisprudence of referring to what might be understood as First Amendment harms; the GDPR, too, addresses speech-related privacy. Even more significantly, Carpenter begins to undermine a central premise of U.S. privacy law: that you don’t have an expectation of privacy in information you have shared. This suggests that privacy protections might travel with private information, and pop up later in information flows—in other words, that a data privacy model may now be more palatable in the United States. And a disclosure-based privacy law targeting third parties (data brokers) is exactly what California recently passed.

But the First Amendment, once again, may be the context that ultimately defines, through constraints, American privacy law. Determining how to navigate the roadblocks of the Court’s recent First Amendment jurisprudence may—even more than legislative inertia—be the central problem U.S. data privacy now faces.


The Fragility of Desire

In his excellent new book Exposed, Harcourt offers a seductive analysis of the role of desire in what he calls the “expository society” of the digital age. We are not characters in Orwell’s 1984, or prisoners of Bentham’s Panopticon, but rather enthusiastic participants in a “mirrored glass pavilion” that is addictive and mesmerizing. Harcourt paints a detailed picture of this pavilion and also shows us the seamy side of our addiction to it. Recovery from this addiction, he argues, requires acts of disobedience, but therein lies the great dilemma and paradox of our age: revolution requires desire, not duty, yet our desires are what have ensnared us.

I think that this is both a welcome contribution and a misleading diagnosis.

There have been many critiques of consent-based privacy regimes as enabling surveillance rather than protecting privacy. The underlying tenor of many of these critiques is that consent fails as a regulatory tool because it is too difficult to make it truly informed. Harcourt’s emphasis on desire shows why there is a deeper problem than this: our participation in the platforms that surveil us is rooted in something deeper than misinformed choice. And this makes the “what to do?” question all the more difficult to answer. Even those of us who see a stronger role for law than Harcourt outlines in this book (I agree with Ann Bartow’s comments on this) should pause here. Canada, for example, has strong private-sector data protection laws with oversight from excellent provincial and federal privacy commissioners. And yet these laws are heavily consent-based. Such laws are able to shift practices toward a stronger emphasis on things like opt-in consent, but Harcourt leaves us with a disquieting sense that this might be just an example of a Pyrrhic victory, legitimizing surveillance through our attempts to regulate it because we still have not grappled with the more basic problem of the seduction of the mirrored glass pavilion.

The problem with Harcourt’s position is that, in exposing this aspect of the digital age in order to complicate our standard surveillance tropes, he risks ignoring other sources of complexity that are also important for both diagnosing the problem and outlining a path forward.

Desire is not always the reason that people participate in new technologies. As Ann Bartow and Olivier Sylvain point out, people do not always have a choice about their participation in the technologies that track us. The digital age is not an amusement park we can choose to go to or to boycott, but deeply integrated into our daily practices and needs, including the ways in which we work, bank, and access government services.

But even when we do actively choose to use these tools, it is not clear that desire captures the why of all such choices. If we willingly enter Harcourt’s mirrored glass pavilion, it is sometimes because of some of its very useful properties — the space- and time-bending nature of information technology. For example, Google Calendar is incredibly convenient because multiple people can access shared calendars from multiple devices in multiple locations at different times, making the coordination of calendars incredibly easy. This is not digital lust, but digital convenience.

These space- and time-bending properties of information technology are important for understanding the contours of the public/private nexus of surveillance that so characterizes our age. Harcourt does an excellent job at pointing out some of the salient features of this nexus, describing a “tentacular oligarchy” where private and public institutions are bound together in state-like “knots of power,” with individuals passing back and forth between these institutions. But what is strange in Harcourt’s account is that this tentacular oligarchy still appears to be bounded by the political borders of the US. It is within those borders that the state and the private sector have collapsed together.

What this account misses is the fact that information technology has helped to unleash a global private sector that is not bounded by state borders. In this emerging global private sector large multinational corporations often operate as “metanationals” or stateless entities. The commercial logic of information is that it should cross political borders with ease and be stored wherever it makes the most economic sense.

Consider some of the rhetoric surrounding the e-commerce chapter of the recent TPP agreement. The Office of the US Trade Representative indicates that one of its objectives is to keep the Internet “free and open,” which it has pursued through rules that favour cross-border data flows and prevent data localization. It is easy to see how this idea of “free” might be confused with political freedom, for an activist in an oppressive regime is better able to exercise freedom of speech when that speech can cross political borders, or when the details of their communications can be stored in a location beyond the reach of their state. A similar rationale has been offered by some in the current Apple encryption debate — encryption protects American businesspeople communicating within China, and we can see why that is important.

But this idea of freedom is the freedom of a participant in a global private sector with weak state control; freedom from the state control of oppressive regimes also involves freedom from the state protection of democratic regimes.

If metanationals pursue a state-free agenda, the state pursues an agenda of rights-protectionism. By rights protectionism I mean the claim that states do, and should, protect the constitutional rights of their own citizens and residents but not others. Consider, for example, a Canadian citizen who resides in Canada and uses US cloud computing. That person could be communicating entirely with other Canadians in other Canadian cities and yet have all of their data stored in the US-based cloud. If the US authorities wanted access to that data, the US constitution would not apply to regulate that access in a rights-protecting manner because the Canadian is a non-US person.

Many see this result as flowing from the logic of the Verdugo-Urquidez case. Yet that case concerned a search that occurred in a foreign territory (Mexico), rather than within the US, where the law of that territory continued to apply. The Canadian constitution does not apply to acts of officials within the US. The data at issue falls into a constitutional black hole where no constitution applies (and maybe even an international human rights black hole, according to some US interpretations of extraterritorial obligations). States can then collect information within this black hole free of the usual liberal-democratic constraints and share it with other allies, a situation Snowden documented within the EU and likened to a “European bazaar” of surveillance.

Rights protectionism is not rights protection when information freely crosses political boundaries and state power piggybacks on top of this crossing and exploits it.

This is not a tentacular oligarchy operating within the boundaries of one state, but a series of global alliances – between allied states and between states and metanationals who exert state-like power — exploiting the weaknesses of state-bound law.

We are not in this situation simply because of a penchant for selfies. But to understand the full picture we do need to look beyond “ourselves” and get the global picture in view. We need to understand the ways in which our legal models fail to address these new realities and even help to mask and legitimize the problems of the digital age through tools and rhetoric that are no longer suitable.

Lisa Austin is an Associate Professor at the University of Toronto Faculty of Law.


Surveillance and Our Addiction to Exposure

Bernard Harcourt’s Exposed: Desire and Disobedience in the Digital Age (Harvard University Press 2015) is an indictment of our contemporary age of surveillance and exposure — what Harcourt calls “the expository society.” Harcourt passionately deconstructs modern technology-infused society and explains its dark implications with an almost poetic eloquence.

Harcourt begins by critiquing the use of George Orwell’s 1984 as a metaphor for the ills of our world today. In my own previous work, I critiqued this metaphor as well, arguing that Kafka’s The Trial better captures the powerlessness and vulnerability that people experience as government and businesses construct and use “digital dossiers” about their lives. Harcourt critiques Orwell in a different manner, arguing that Orwell’s dystopian vision is inapt because it is too drab and gray:

No, we do not live in a drab Orwellian world.  We live in a beautiful, colorful, stimulating, digital world that is online, plugged in, wired, and Wi-Fi enabled.  A rich, bright, vibrant world full of passion and jouissance–and by means of which we reveal ourselves and make ourselves virtually transparent to surveillance.  In the end, Orwell’s novel is indeed prescient in many ways, but jarringly off on this one key point.  (pp. 52-53)

[Image: Orwell’s Vision]

[Image: Life Today]

Harcourt notes that the “technologies that end up facilitating surveillance are the very technologies we crave.” We desire them, but “we have become, slowly but surely, enslaved to them.” (p. 52)

Harcourt’s book reminds me of Neil Postman’s Amusing Ourselves to Death, originally published about 30 years ago — back in 1985.  Postman also critiqued Orwell’s metaphor and argued that Aldous Huxley’s Brave New World was a more apt metaphor to capture the problematic effects new media technologies were having on society.



The 5 Things Every Privacy Lawyer Needs to Know about the FTC: An Interview with Chris Hoofnagle

The Federal Trade Commission (FTC) has become the leading federal agency to regulate privacy and data security. The scope of its power is vast – it covers the majority of commercial activity – and it has been policing these issues for decades. An FTC civil investigative demand (CID) will send shivers down the spine of even the largest of companies, and FTC settlements typically impose 20-year periods of outside assessments.

To many, the FTC remains opaque and somewhat enigmatic. The reason, ironically, might not be because there is too little information about the FTC but because there is so much. The FTC has been around for 100 years!

In a landmark new book, Professor Chris Hoofnagle of Berkeley Law School synthesizes an enormous volume of information about the FTC and sheds tremendous light on the FTC’s privacy activities. His book is called Federal Trade Commission Privacy Law and Policy (Cambridge University Press, Feb. 2016).

This is a book that all privacy and cybersecurity lawyers should have on their shelves. The book is the most comprehensive scholarly discussion of the FTC’s activities in these areas, and it also delves deep into the FTC’s history and its activities in other areas to provide much-needed context for understanding how the agency functions and reasons in privacy and security cases.



5 Great Novels About Privacy and Security

I am a lover of literature (I teach a class in law and literature), and I also love privacy and security, so I thought I’d list some of my favorite novels about privacy and security.

I’m also trying to compile a more comprehensive list of literary works about privacy and security, and I welcome your suggestions.

Without further ado, my list:

Franz Kafka, The Trial

Kafka’s The Trial begins with a man being arrested but not told why. In typical Kafka fashion, the novel begins badly for the protagonist . . . and then it gets worse! A clandestine court system has compiled a dossier about him and officials are making decisions about him, but he is left in the dark. This is akin to how Big Data can operate today. The Trial captures the sense of helplessness, frustration, and powerlessness when large institutions with inscrutable purposes use personal data and deny people the right to participate. I wrote more extensively about how Kafka is an apt metaphor for privacy in our times in a book called The Digital Person about 10 years ago.



The Black Box Society: Interviews

My book, The Black Box Society, is finally out! In addition to the interview Lawrence Joseph conducted in the fall, I’ve been fortunate to complete some radio and magazine interviews on the book. They include:

New Books in Law

Stanford Center for Internet & Society: Hearsay Culture

Canadian Broadcasting Corporation: The Spark

Texas Public Radio: The Source

WNYC: Brian Lehrer Show

Fleishman-Hillard’s True

I hope to be back to posting soon, on some of the constitutional and politico-economic themes in the book.


Should the FTC Be Regulating Privacy and Data Security?

This post was co-authored with Professor Woodrow Hartzog.

This past Tuesday the Federal Trade Commission (FTC) filed a complaint against AT&T for allegedly throttling the Internet speeds of customers who had paid for unlimited data plans. This complaint was surprising to many, who thought the Federal Communications Commission (FCC) was the agency that handled such telecommunications issues. Is the FTC supposed to be involved here?

This is a question that has recently been posed in the privacy and data security arenas, where the FTC has been involved since the late 1990s. Today, the FTC is the most active federal agency enforcing privacy and data security, and it has the broadest reach. Its fingers seem to be everywhere, in all industries, even those regulated by other agencies, such as in the AT&T case. Is the FTC going too far? Is it even the FTC’s role to police privacy and data security?

The Fount of FTC Authority

The FTC’s source of authority for privacy and data security comes from some specific statutes that give the FTC regulatory power. Examples include the Children’s Online Privacy Protection Act (COPPA), under which the FTC regulates websites collecting data online about children under 13, and the Gramm-Leach-Bliley Act (GLBA), which governs financial institutions.

But the biggest source of the FTC’s authority comes from Section 5 of the FTC Act, where the FTC can regulate “unfair or deceptive acts or practices in or affecting commerce.” This is how the FTC has achieved its dominant position.

Enter the Drama

Until recently, the FTC built its privacy and security platform with little pushback. All of the complaints brought by the FTC for unfair data security practices quickly settled. Recently, however, two companies have put on their armor, drawn their swords, and raised the battle cry. Wyndham Hotels and LabMD have challenged the FTC’s authority to regulate data security. These are more than just case-specific challenges that the FTC got the facts wrong or that the FTC is wrong about certain data security practices. Instead, these challenges go to whether the FTC should be regulating data security under Section 5 in the first place. And the logic of these challenges could potentially extend to privacy as well.

The first dispute involving Wyndham Hotels has already resulted in a district court opinion affirming the FTC’s data protection jurisprudence. The second dispute over FTC regulatory authority involving LabMD is awaiting trial.

In the LabMD case, LabMD is contending that the U.S. Department of Health and Human Services (HHS) — not the FTC — has the authority to regulate data security practices affecting patient data regulated by HIPAA.

With Wyndham, and especially LabMD, the drama surrounding the FTC’s activities in data protection has gone from 2 to 11. The LabMD case has involved the probable shuttering of a business, a controversial commissioner recusal, a defamation lawsuit, a House Oversight Committee investigation into the FTC’s actions, and an entire book written by LabMD’s CEO chronicling his view of the conflict. And the case hasn’t even been tried yet!

The FTC Becomes a Centenarian

And so, it couldn’t be more appropriate that this year, the FTC celebrates its 100th birthday.

To commemorate the event, the George Washington Law Review is hosting a symposium titled “The FTC at 100: Centennial Commemorations and Proposals for Progress,” which will be held on Saturday, November 8, 2014, in Washington, DC.

The lineup for this event is really terrific, including U.S. Supreme Court Justice Stephen Breyer, FTC Chairwoman Edith Ramirez, FTC Commissioner Joshua Wright, FTC Commissioner Maureen Ohlhausen, as well as many former FTC officials.


Some of the participating professors include Richard Pierce, William Kovacic, David Vladeck, Howard Beales, Timothy Muris, and Tim Wu, just to name a few.

At the event, we will be presenting our forthcoming article:

The Scope and Potential of FTC Data Protection
83 George Washington Law Review (forthcoming 2015)

So Is the FTC Overreaching?

Short answer: No. In our paper, The Scope and Potential of FTC Data Protection, we argue that the FTC not only has the authority to regulate data protection to the extent it has been doing, but it also has the authority to expand its reach much more. Here are some of our key points:

* The FTC has a lot of power. Congress gave the FTC very broad and general regulatory authority by design to allow for a more nimble and evolutionary approach to the regulation of consumer protection.

* Overlap in agency authority is inevitable. The FTC’s regulation of data protection will inevitably overlap with other agencies and state law given the very broad jurisdiction in Section 5, which spans nearly all industries. If the FTC’s Section 5 power were to stop at any overlapping regulatory domain, the result would be a confusing, contentious, and unworkable regulatory system with boundaries constantly in dispute.

* The FTC’s use of a “reasonable” standard for data security is quite reasonable. Critics of the FTC have attacked its data security jurisprudence as being too vague and open-ended, arguing that the FTC should instead create a specific list of requirements. However, there is a benefit to mandating reasonable data security instead of a specific, itemized checklist. When determining what is reasonable, the FTC has often looked to industry standards. Such an approach allows for greater flexibility in the face of technological change than a set of rigid rules.

* The FTC performs an essential role in US data protection. The FTC’s current scope of data protection authority is essential to the United States data protection regime and should be fully embraced. The FTC’s regulation of data protection gives the U.S. system of privacy law needed legitimacy and heft. Without the FTC’s data protection enforcement authority, the E.U. Safe Harbor agreement and other arrangements that govern the international exchange of personal information would be in jeopardy. The FTC can also harmonize discordant privacy-related laws and obviate the need for new laws.

* Contrary to the critics, the FTC has used its powers very conservatively. Thus far, the FTC has been quite modest in its enforcement, focusing on the most egregious offenders and enforcing the most widespread industry norms. The FTC should push the development of the norms a little more (though not in an extreme or aggressive way).

* The FTC can and should expand its enforcement, and there are areas in need of improvement. The FTC now sits atop an impressive body of jurisprudence. We applaud its efforts and believe it can and should do even more. But as it grows into this role of being the data protection authority for the United States, some gaps in its power need to be addressed and it can improve its processes and transparency.

The FTC currently serves as the primary regulator of privacy and data security in the United States. It reached this position in part because Congress never enacted comprehensive privacy regulation, and some kind of regulator was greatly needed to fill the void. The FTC has done a lot so far, and we believe it can and should do more.

If you want more detail, please see our paper, The Scope and Potential of FTC Data Protection. And with all the drama about the FTC these days, please contact us if you want to option the movie rights.

Cross-posted on LinkedIn

Reining in the Data Brokers

I’ve been alarmed by data brokers’ ever-expanding troves of personal information for some time. My book outlines the problem, explaining how misuse of data undermines equal opportunity. I think extant legal approaches, which focus on notice and consent, put too much of a burden on consumers. This NYT opinion piece sketches an alternative approach:

[D]ata miners, brokers and resellers have now taken creepy classification to a whole new level. They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed.

***

Privacy protections in other areas of the law can and should be extended to cover consumer data. The Health Insurance Portability and Accountability Act, or Hipaa, obliges doctors and hospitals to give patients access to their records. The Fair Credit Reporting Act gives loan and job applicants, among others, a right to access, correct and annotate files maintained by credit reporting agencies.

It is time to modernize these laws by applying them to all companies that peddle sensitive personal information. If the laws cover only a narrow range of entities, they may as well be dead letters. For example, protections in Hipaa don’t govern the “health profiles” that are compiled and traded by data brokers, which can learn a great deal about our health even without access to medical records.

There’s more online, but given the space constraints, I couldn’t go into all the details that the book discusses. I hope everyone enjoys the opinion piece, and that it whets appetites for the book!


Advice on How to Enter the Privacy Profession

Over at LinkedIn, I have a long post with advice on how law students can enter the privacy profession. I hope that this post can serve as a useful guide to students who want to pursue careers in privacy.

The privacy law field is growing dramatically, and demand for privacy lawyers is high. I think that many in the academy who don’t follow privacy law, cyberlaw, or law and technology might not realize what’s going on: the field is booming.

The International Association of Privacy Professionals (IAPP), the field’s primary association, has been growing by about 30% each year. It now has more than 17,000 members. And this is only a subset of privacy professionals, as many privacy officials in healthcare aren’t members of the IAPP and instead belong to the American Health Information Management Association (AHIMA) or the Health Care Compliance Association (HCCA).

There remains a bottleneck at the entry point to the field, but it can be overcome. Once in the club, opportunities are plentiful and advancement can come quickly. I’ve been trying to push for solutions to make entry into the field easier, and this is an ongoing project of mine.

If you have students who are interested in entering the privacy law profession, please share my post with them. I hope it will help.

Interview on The Black Box Society

Balkinization just published an interview on my forthcoming book, The Black Box Society. Law profs may be interested in our dialogue on methodology—particularly, what the unique role of the legal scholar is amid increasing academic specialization. I’ve tried to surface several strands of inspiration for the book.