FAN 200 (First Amendment News) Margot E. Kaminski, “The First Amendment and Data Privacy: Between Reed and a Hard Place”

Margot E. Kaminski is an associate professor of law at the University of Colorado Law School. She specializes in the law of new technologies, focusing on information governance, privacy, and freedom of expression. Her forthcoming work on transparency and accountability in the EU’s General Data Protection Regulation (GDPR) stems from her recent Fulbright-Schuman Innovation Grant in the Netherlands and Italy.


Professor Margot Kaminski

The Supreme Court’s recent Fourth Amendment cases show a strong awareness that privacy can implicate First Amendment values. In June 2018 in Carpenter v. United States, a case addressing warrantless location tracking through cell phone records, the majority noted that a lack of privacy can reveal (and presumably chill) “familial, political, professional, religious, and sexual associations.” In Riley v. California, a 2014 Fourth Amendment case addressing cell-phone searches, the majority recognized that while “[m]ost people cannot lug around every piece of mail they have received for the past several months, every picture they have taken, or every book or article they have read,” a cell phone can store all of these things. With these comments, the Court observed that free expression often relies on privacy, and implied that absent privacy protections, people may conform in their choice of reading material, their political affiliations, and ultimately, their speech. In other words, privacy protections often also protect First Amendment rights.

But at the same time, the Court’s recent First Amendment decisions have created additional obstacles for those who seek to draft an American data privacy law.

The United States famously does not have omnibus federal privacy protection. Instead, U.S. privacy law is a patchwork of sectoral protections (like protections for video records, consumer protection at the FTC, state privacy torts, and state AG enforcement). Legislators reading Carpenter may conclude that a number of Justices in that case (including Justice Samuel Alito, who explicitly calls for privacy lawmaking in his dissent) understand the need for omnibus data privacy law. But even as the Court in Carpenter seems to point to the need for privacy legislation, its First Amendment decisions in Reed v. Town of Gilbert and NIFLA v. Becerra threaten to tie legislators’ hands.

Reed treats content-based regulation with suspicion; Becerra does the same with disclosure requirements. In Reed, which addressed a town’s rules for the placement of signs, the Court held that “regulation of speech is content based if a law applies to particular speech because of the topic discussed or the idea or message expressed.” All content-based regulation is subjected to strict scrutiny. Thus, a regulatory scheme that treated “Political Signs” differently from “Temporary Directional Signs” was content-based, subject to strict scrutiny, and, because it failed strict scrutiny, unconstitutional.

Becerra, decided this year, limits legislators’ ability to require truthful disclosures. The Court preliminarily enjoined California’s disclosure requirements for crisis pregnancy centers—centers that often pretend to provide abortion services but in practice discourage women from getting abortions. While claiming to be narrow and fact-bound, the majority in Becerra applied Reed’s broad understanding of content-based regulation to disclosure laws. The majority of the Court in Becerra explained that California’s disclosure law was “content-based regulation of speech” because “[b]y compelling individuals to speak a particular message, such notices ‘alte[r] the content of [their] speech.’”

Why, in a discussion of data privacy, do I focus on Reed and Becerra and not on an earlier line of cases that directly address privacy laws? Because to an extent many Americans do not realize, data privacy protections are actually about increasing speech, not decreasing it. And at least as enacted elsewhere in the world, the efficacy of data privacy regimes as good policy often depends on being able to calibrate the law differently for different actors or scenarios. The first implicates Becerra on disclosures; the second implicates Reed and content-based analysis.

The Fair Information Practices, which were originally formulated in the United States, are the basis for data privacy laws around the world and are largely built around a concept that should be complementary to the First Amendment: transparency. Take the EU’s General Data Protection Regulation (GDPR) as an example. Individuals are supposed to be notified when companies obtain their information. They have a right to access their data, and to find out to whom it has been disclosed. They have a right to find out where data has come from. Companies have to explain the purpose of data processing, and how profiling and automated decision-making work. All of these transparency rights and obligations attempt to correct, or at least expose, very real power imbalances between individuals and the companies that profit from their data. The GDPR is a disclosure law as much as it is a right to stop other people from speaking about you.

Today’s paradoxical privacy problem, then, is that even as data privacy regimes rely in large part on increasing, not decreasing, speech by requiring disclosures to users, the Court’s recent First Amendment cases now shut down disclosure as a regulatory tool. Under Becerra’s reasoning, any disclosure requirement could potentially be characterized as content-based (or, per Justice Stephen Breyer, “[v]irtually every disclosure law requires individuals ‘to speak a particular message’”). The GDPR’s requirement that companies disclose the source of their data? Content-based compelled speech. The GDPR’s requirement that companies reveal to individuals the information held about them? A “particular message,” and thus content-based compelled speech.

The majority in Becerra attempts to cabin the impact of its opinion both (1) by pointing to the possibility of regulating speech incidental to regulated conduct (as it alleges was done by the majority in Planned Parenthood v. Casey, a case addressing compelled disclosures by doctors to patients seeking abortions), and (2) by carving out existing disclosure laws (“we do not question the legality of health and safety warnings long considered permissible, or purely factual and uncontroversial disclosures about commercial products”). The problem is that data privacy does not fit squarely within either of these potential exceptions. It regulates information flow, not conduct, or at least conduct that’s nearly inextricable from information flow (though I’ve argued elsewhere that some forms of privacy violations are actually framed in First Amendment law as conduct-like). And because the U.S. lacks omnibus data protection law, privacy doesn’t readily fall into the Court’s attempt to exempt existing consumer protection law. By virtue of its very newness, data privacy may be more heavily scrutinized than other accepted areas of consumer protection.


As Justice Breyer notes in his dissent, “in suggesting that heightened scrutiny applies to much economic and social legislation,” Becerra jeopardizes legislators’ judgments in areas long left to legislative discretion. Reed compounds this problem. Some kinds of information, and some behaviors, create greater privacy harms than others. For example, the GDPR, like many American privacy laws, puts in place added protections for “special categories” of data—or what we would call “sensitive information.” Is this content-based discrimination? Does it apply “to particular speech because of the topic discussed?” If so, this would potentially implicate even our current sectoral approach to privacy, not to mention hundreds of behavior- or information-type-specific state privacy laws. The GDPR also, in many places, distinguishes between categories of companies. Take, for example, the GDPR’s derogation for small and medium-sized enterprises, which are subject to less onerous record-keeping provisions, presumably because smaller companies pose less of a risk of inflicting privacy harms. A government may also want to create an exception to, or less onerous version of, privacy law for smaller companies as a matter of innovation or competition policy, to encourage the growth of startups. Under Reed—and its predecessor, Sorrell v. IMS Health—identifying particular topics or speakers, or categories of information flow, could give rise to a challenge of regulation as content-based or even viewpoint-based. On paper at least, as Justice Elena Kagan noted in her concurrence, Reed’s broad take on content-based regulation “cast[s] a constitutional pall on reasonable regulations” and puts in place judicial second-guessing of matters that legislatures are likely institutionally better situated to assess.

One potential loophole, or at least limiting principle, to explore is Justice Samuel Alito’s strangely confident conviction in his concurrence, joined by both Justice Sonia Sotomayor and Justice Anthony Kennedy, that “[r]ules imposing time restrictions on signs advertising a one-time event” would not be considered content-based. This suggests that it may be possible for legislators to continue to name things in information-related legislation, when the restriction is the kind of restriction (e.g., time, place, and manner) that the First Amendment allows. But how to line-draw between a law that imposes temporal restrictions on “signs advertising a one-time event” and a law that restricts, in other ways, “Temporary Directional Signs” is frankly beyond me.

Thus legislators wanting to write data privacy law (or, in the case of California, having recently written and passed one) may find themselves stuck between Reed and a hard place. To some extent, this can be understood as one example of what some have described as the Lochnerization of the First Amendment: its use for deregulatory purposes. But in the context of privacy, things are perhaps uniquely complicated. Speech values fall squarely on both sides. By regulating speech to protect privacy, you both restrict and protect speech. As the Court noted in Bartnicki v. Vopper, “the fear of public disclosure of private conversations might well have a chilling effect on private speech. . . . In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively.” And as the Court has increasingly recognized in its Fourth Amendment jurisprudence, personal information beyond communicative content—such as location data, or reading material or pictures stored on a cell phone—can implicate First Amendment concerns as well, by revealing your associations, your political affiliations, your opinions, your innermost thoughts.

In some ways, Carpenter and other cases move the United States closer to Europe on privacy. There is increasing convergence on what counts as sensitive information: the GDPR includes location data in its definition of “personal data,” and the Court in both United States v. Jones and Carpenter recognized an expectation of privacy in publicly disclosed location information. The Court in Carpenter continued a recent theme in Fourth Amendment jurisprudence of referring to what might be understood as First Amendment harms; the GDPR, too, addresses speech-related privacy. Even more significantly, Carpenter begins to undermine a central premise of U.S. privacy law: that you don’t have an expectation of privacy in information you have shared. This suggests that privacy protections might travel with private information, and pop up later in information flows—in other words, that a data privacy model may now be more palatable in the United States. And a disclosure-based privacy law targeting third parties (data brokers) is exactly what California recently passed.

But the First Amendment, once again, may be the context that ultimately defines, through constraints, American privacy law. Determining how to navigate the roadblocks of the Court’s recent First Amendment jurisprudence may—even more than legislative inertia—be the central problem U.S. data privacy now faces.
