
The Emerging Law of Algorithms, Robots, and Predictive Analytics

In 1897, Holmes famously pronounced, “For the rational study of the law the blackletter man may be the man of the present, but the man of the future is the man of statistics and the master of economics.” He could scarcely envision at the time the rise of cost-benefit analysis, and comparative devaluation of legal process and non-economic values, in the administrative state. Nor could he have foreseen the surveillance-driven tools of today’s predictive policing and homeland security apparatus. Nevertheless, I think Holmes’s empiricism and pragmatism still animate dominant legal responses to new technologies. Three conferences this Spring show the importance of “statistics and economics” in future tools of social order, and the fundamental public values that must constrain those tools.

Tyranny of the Algorithm? Predictive Analytics and Human Rights

As the conference's call for papers states:

Advances in information and communications technology and the “datafication” of broadening fields of human endeavor are generating unparalleled quantities and kinds of data about individual and group behavior, much of which is now being deployed to assess risk by governments worldwide. For example, law enforcement personnel are expected to prevent terrorism through data-informed policing aimed at curbing extremism before it expresses itself as violence. And police are deployed to predicted “hot spots” based on data related to past crime. Judges are turning to data-driven metrics to help them assess the risk that an individual will act violently and should be detained before trial. 


Where some analysts celebrate these developments as advancing “evidence-based” policing and objective decision-making, others decry the discriminatory impact of reliance on data sets tainted by disproportionate policing in communities of color. Still others insist on a bright line between policing for community safety in countries with democratic traditions and credible institutions, and policing for social control in authoritarian settings. The 2016 annual conference will . . . consider the human rights implications of the varied uses of predictive analytics by state actors. As a core part of this endeavor, the conference will examine—and seek to advance—the capacity of human rights practitioners to access, evaluate, and challenge risk assessments made through predictive analytics by governments worldwide. 

This focus on the violence targeted and legitimated by algorithmic tools is a welcome chance to discuss the future of law enforcement. As Dan McQuillan has argued, these “crime-fighting” tools are both logical extensions of extant technologies of ranking, sorting, and evaluating, and fundamental challenges to the rule of law: 

According to Agamben, the signature of a state of exception is ‘force-of’; actions that have the force of law even when not of the law. Software is being used to predict which people on parole or probation are most likely to commit murder or other crimes. The algorithm developed by university researchers uses a dataset of 60,000 crimes and some dozens of variables about the individuals to help determine how much supervision the parolees should have. While having discriminatory potential, this algorithm is being invoked within a legal context. 

[T]he steep rise in the rate of drone attacks during the Obama administration has been ascribed to the algorithmic identification of ‘risky subjects’ via the disposition matrix. According to interviews with US national security officials the disposition matrix contains the names of terrorism suspects arrayed against other factors derived from data in ‘a single, continually evolving database in which biographies, locations, known associates and affiliated organizations are all catalogued.’ Seen through the lens of states of exception, we cannot assume that the impact of algorithmic force-of will be constrained because we do not live in a dictatorship. . . .What we need to be alert for, according to Agamben, is not a confusion of legislative and executive powers but separation of law and force of law. . . [P]redictive algorithms increasingly manifest as a force-of which cannot be restrained by invoking privacy or data protection. 

The ultimate logic of the algorithmic state of exception may be a homeland of “smart cities,” and force projection against an external world divided into “kill boxes.” 


We Robot 2016: Conference on Legal and Policy Issues Relating to Robotics

As the “kill box” example above suggests, software is not just an important tool for humans planning interventions. It is also animating features of our environment, ranging from drones to vending machines. Ryan Calo has argued that the increasing role of robotics in our lives merits “systematic changes to law, institutions, and the legal academy,” and has proposed a Federal Robotics Commission. (I hope it gets further than proposals for a Federal Search Commission have so far!)


Calo, Michael Froomkin, and other luminaries of robotics law will be at We Robot 2016 this April at the University of Miami. Panels like “Will #BlackLivesMatter to RoboCop?” and “How to Engage the Public on the Ethics and Governance of Lethal Autonomous Weapons” raise fascinating, difficult issues for the future management of violence, power, and force.


Unlocking the Black Box: The Promise and Limits of Algorithmic Accountability in the Professions


Finally, I want to highlight a conference I am co-organizing with Valerie Belair-Gagnon and Caitlin Petre at the Yale ISP. As Jack Balkin observed in his response to Calo’s “Robotics and the Lessons of Cyberlaw,” technology concerns not only “the relationship of persons to things but rather the social relationships between people that are mediated by things.” Social relationships are also mediated by professionals: doctors and nurses in the medical field, journalists in the media, attorneys in disputes and transactions.


For many techno-utopians, the professions are quaint, an organizational form to be flattened by the rapid advance of software. But if there is anything the examples above (and my book) illustrate, it is the repeated, even disastrous failures of many computational systems to respect basic norms of due process, anti-discrimination, transparency, and accountability. These systems need professional guidance as much as professionals need these systems. We will explore how professionals–both within and outside the technology sector–can contribute to a community of inquiry devoted to accountability as a principle of research, investigation, and action. 


Some may claim that software-driven business and government practices are too complex to regulate. Others will question the value of the professions in responding to this technological change. I hope that the three conferences discussed above will help assuage those concerns, continuing the dialogue started at NYU in 2013 about “accountable algorithms,” and building new communities of inquiry. 


And one final reflection on Holmes: the repetition of “man” in his quote above should not go unremarked. Nicole Dewandre has observed the following regarding modern concerns about life online: 

To some extent, the fears of men in a hyperconnected era reflect all-too-familiar experiences of women. Being objects of surveillance and control, exhausting laboring without rewards and being lost through the holes of the meritocracy net, being constrained in a specular posture of other’s deeds: all these stances have been the fate of women’s lives for centuries, if not millennia. What men fear from the State or from “Big (br)Other”, they have experienced with men. So, welcome to world of women….

Dewandre’s voice complements that of US scholars (like Danielle Citron and Mary Ann Franks) on systematic disadvantages to women posed by opaque or distant technological infrastructure. I think one of the many valuable goals of the conferences above will be to promote truly inclusive technologies, permeable to input from all of society, not just top investors and managers.

X-Posted: Balkinization.


Holiday Cheer – Creations for Good

A sister notices that her sister's blood-sugar monitor has a weak alarm that does not do a good job of waking anyone at night, just when the alert is most critical. She decides maybe she can do something about it, and she does. Who is this mystery girl? Our own Danielle Citron shared with me (and let me share more) that her daughter, JJ, has been designing a new monitor to help diabetics (her sister among them).

JJ applied to a program that helps high schoolers with STEM projects and was paired with folks at Northrop Grumman, where she spent a day a month developing her idea. Along the way, JJ had to figure out what alarm noise worked best to wake someone up, write code to link the monitor and bracelet devices, and then wire them together. As her school reports:

This year, Citron will continue to test and refine the design, creating the bracelet with the help of a 3D printer. When she’s finished, the bracelet will change color to let the user know immediately if their blood sugar is getting too high or too low. The detailed information from the monitor will also be linked to a smartphone app.

3D printing! Color coding! And JJ seems poised to go into computer science.

Although I am friends with Dani and have met JJ, the real point for me is that a teenager saw a problem and felt she had the room to try and fix it. Then she worked on it. Her success is lovely, but the fact of the chance is downright excellent and puts me in a great holiday mood. Of course, with Danielle as her mom, JJ may have to look forward to law professors wondering about patents, privacy, and data ownership, but those are what a good friend of mine once called “high quality problems.” Well done, JJ.


MLAT – Not a Muscle Group, but Potentially Powerful Nonetheless

MLAT. I first encountered this somewhat obscure thing (the Mutual Legal Assistance Treaty) when I was in practice and needed to serve someone in Europe. I recall that it was a cumbersome process, and thinking I was glad we did not seem to need it often (in fact, only that once). Today, however, as my colleagues Peter Swire and Justin Hemmings argue in their paper, Stakeholders in Reform of the Global System for Mutual Legal Assistance, the MLAT process is quite important.

In simplest terms, if a criminal investigation in, say, France needs an email that is stored in the U.S., the French authorities ask their U.S. counterparts for aid. If the U.S. agency that processes the request agrees there is a legal basis for it, it and other groups seek a court order. If that is granted, the order is presented to the company. Once records are obtained, there is further review to ensure compliance with U.S. law. Only then do the records go to France. As Swire and Hemmings note, the process averages ten months. For a civil case that is long; for a criminal case it is not workable. And as the authors put it, “the once-unusual need for an MLAT request becomes routine for records that are stored in the cloud and are encrypted in transit.”

Believe it or not, this issue touches on major Internet governance questions. The slowness of the process and these new needs are fueling calls to have the ITU govern the Internet and access-to-evidence issues (a model that, according to the paper, Russia and others favor). Simpler but important ideas, such as increased calls for data localization, also flow from the difficulties the paper identifies. As the paper details, the players–non-U.S. governments, the U.S. government, tech companies, and civil society groups–each have their own goals and perspectives on the issue.

So for those interested in Internet governance, privacy, law enforcement, and multi-stakeholder processes, the MLAT process and this paper on it offer a great high-level view of the many factors at play in those issues for both a specific topic and larger, related ones as well.


A Darn Good Read – Paul Schwartz on Data Processing

Almost twenty-five years ago, Paul Schwartz wrote Data Processing and Government Administration: The Failure of the American Legal Response to the Computer, 43 HASTINGS L.J. 1321 (1991) (pdf), and I must say it is worth a read today. Paul identified the problems with the government's, and especially the administrative state's, use of computation to carry out its duties. As he opened:

Computers are now an integral part of government administration. They put a tremendous amount of personal data in the hands of government officials, who base a wide range of decisions on this information. Yet the attention paid to the government’s use of data processing has not been equal to the potential dangers that this application presents. Personal information, when disclosed to family and friends, helps form the basis of trust; in the hands of strangers, this information can have a corrosive effect on individual autonomy. The human race’s rapid development of computer technology has not been matched by a requisite growth in the ability to control these new machines.

That passage may seem familiar, but recall when it was written and note the next point Paul made:

This Article’s goal is to formulate a constructive response to computer processing of personal data. The destruction of computers is no more an answer to informatization than the destruction of earlier machines would have been an answer to industrialization. Accordingly, this Article seeks to understand the results of the government’s processing of personal data and to develop appropriate legal principles to guide this application of computer technology.

That goal seems to be missing in some discussions, but I think it is a good one. To be clear, I don’t necessarily agree with some of Paul’s prescriptions. But the point of this post is not about that. I recommend the paper despite disagreeing with some of the ideas. I do so because it helped me with the history of the topic, explained issues, presented a structure and jurisprudence to drill into the topic, offered ways to address them, and pushed me to think more on my views. It is a well-written, worthwhile read both for substance and style.

In short, thank you Professor Schwartz.


China, the Internet, and Sovereignty

China’s World Internet Conference is, according to its organizers, about:

“An Interconnected World Shared and Governed by All—Building a Cyberspace Community of Shared Destiny”. This year’s Conference will further facilitate strategic-level discussions on global Internet governance, cyber security, the Internet industry as the engine of economic growth and social development, technological innovation and philosophy of the Internet. It is expected that 1200 leading figures from governments, international organizations, enterprises, science & technology communities, and civil societies all around the world will participate the Conference.

As the Economist points out, “The grand title is misleading: the gathering will not celebrate the joys of a borderless internet but promote “internet sovereignty”, a web made up of sovereign fiefs, gagged by official censors. Political leaders attending are from such bastions of freedom as Russia, Pakistan, Kazakhstan, Kyrgyzstan and Tajikistan.”

One of the great things about being at GA Tech is the community of scholars from a wide range of backgrounds. This year my colleagues in Public Policy hired Milton Mueller, a leader in telecommunications and Internet policy. I have known his work for some time, but it has been great getting to hang out and talk with Milton. Not surprisingly, Milton has a take on the idea of sovereignty and the Internet. I can't share it, as it is still in the works. But consider this a teaser: keep your eye out for it.

As a general matter, it seems to me that sovereignty will be a keyword in coming Internet governance debates across all sectors. Whether the term works from a political science perspective or others should be interesting. Thinking of jurisdiction, privacy, surveillance, telecommunication, cyberwar, and intellectual property, I can see sovereignty being asserted, perverted, and converted to serve a range of interests. Revisiting the core international relations theories to be clear about what sovereignty is and should be seems a good project for a law scholar or student as these areas evolve.

A Review of The Black Box Society

I just learned of this very insightful and generous review of my book, by Raizel Liebler:

The Black Box Society: The Secret Algorithms that Control Money and Information (Harvard University Press 2015) is an important book, not only for those interested in privacy and data, but also anyone with larger concerns about the growing tension between transparency and trade secrets, and the deceptiveness of pulling information from the ostensibly objective “Big Data.” . . .

One of the most important aspects of The Black Box Society builds on the work of Siva Vaidhyanathan and others to write about how relying on the algorithms of search impacts people’s lives. Through our inability to see how Google, Facebook, Twitter, and other companies display information, it seems as if these displays are in some way “objective.” But they are not. Between various stories about blocking pictures of breastfeeding moms, blocking links to competing sites, obscuring sources, and not creating tools to prevent harassment, companies are making choices. As Pasquale puts it: “at what point does a platform have to start taking responsibility for what its algorithms do, and how their results are used? These new technologies affect not only how we are understood, but also how we understand. Shouldn’t we know when they’re working for us, against us, or for unseen interests with undisclosed motives?”

I was honored to be mentioned on the TLF blog–a highly recommended venue! Here’s a list of some other reviews in English (I have yet to compile the ones in other languages, but was very happy to see the French edition get some attention earlier this Fall). And here’s an interesting take on one of those oft-black-boxed systems: Google Maps.


How CalECPA Improves on its Federal Namesake

Last week, Governor Brown signed the landmark California Electronic Communications Privacy Act (CalECPA) into law, updating California privacy law for modern communications. Compared to ECPA, CalECPA requires warrants, and more restrictive ones, for more investigations; provides more notice to targets; and furnishes as remedies both court-ordered data deletion and statutory suppression. Moreover, CalECPA’s approach is comprehensive and uniform, eschewing the often irrational distinctions that have made ECPA one of the most confusing and under-protective privacy statutes of the Internet era.

Extended Scope, Enhanced Protections, and Simplified Provisions

CalECPA regulates investigative methods that ECPA did not anticipate. Under CalECPA, government entities in California must obtain a warrant based on probable cause before they may access electronic communications contents and metadata from service providers or from devices.  ECPA makes no mention of device-stored data, even though law enforcement agents increasingly use StingRays to obtain information directly from cell phones. CalECPA subjects such techniques to its warrant requirement. While the Supreme Court’s recent decision in Riley v. California required that agents either obtain a warrant or rely on an exception to the warrant requirement to search a cell phone incident to arrest, CalECPA requires a warrant for physical access to any device, not just a cell phone, which “stores, generates, or transmits electronic information in electronic form.” CalECPA clearly defines the exceptions to the warrant requirement by specifying what counts as an emergency, who can give consent to the search of a device, and related questions.

ECPA’s 1986-drafted text only arguably covers the compelled disclosure of location data stored by a service provider, and does not clearly require a warrant for such investigations. CalECPA explicitly includes location data in the “electronic communication information” that is subject to the warrant requirement when a government entity accesses it from either a device or a service provider (broadly defined).  ECPA makes no mention of location data gathered in real-time or prospectively, but CalECPA requires a warrant both for those investigations and for stored data investigations. Whenever a government entity compels “the production of or access to” location information, including GPS data, from a service provider or from a device, CalECPA requires a warrant.


Air Traffic Control for Drones

Recently a man was arrested and jailed for a night after shooting a drone that hovered over his property. The man felt he was entitled (perhaps under Peeping Tom statutes?) to privacy from the (presumably camera-equipped) drone. Froomkin & Colangelo have outlined a more expansive theory of self-help:

[I]t is common for new technology to be seen as risky and dangerous, and until proven otherwise drones are no exception. At least initially, violent self-help will seem, and often may be, reasonable even when the privacy threat is not great – or even extant. We therefore suggest measures to reduce uncertainties about robots, ranging from forbidding weaponized robots to requiring lights, and other markings that would announce a robot’s capabilities, and RFID chips and serial numbers that would uniquely identify the robot’s owner.

On the other hand, the Fortune article reports:

In the view of drone lawyer Brendan Schulman and robotics law professor Ryan Calo, home owners can’t just start shooting when they see a drone over their house. The reason is that the law frowns on self-help when a person can just call the police instead. This means that Meredith may not have been defending his house, but instead engaging in criminal acts and property damage for which he could have to pay.

I am wondering how we might develop a regulatory infrastructure to make either the self-help or police-help responses more tractable. Present resources seem inadequate. I don’t think the police would take me seriously if I reported a drone buzzing my windows in Baltimore—they have bigger problems to deal with. If I were to shoot it, it might fall on someone walking on the sidewalk below. And it appears deeply unwise to try to grab it to inspect its serial number.

Following on work on license plates for drones, I think that we need to create a monitoring infrastructure to promote efficient and strict enforcement of law here. Bloomberg reports that “At least 14 companies, including Google, Amazon, Verizon and Harris, have signed agreements with NASA to help devise the first air-traffic system to coordinate small, low-altitude drones, which the agency calls the Unmanned Aerial System Traffic Management.” I hope all drones are part of such a system, that they must be identifiable as to owner, and that they can be diverted into custody by responsible authorities once a credible report of lawbreaking has occurred.

I know that this sort of regulatory vision is subject to capture. There is already misuse of state-level drone regulation to curtail investigative reporting on abusive agricultural practices. But in a “free-for-all” environment, the most powerful entities may more effectively create technology to capture drones than they deploy lobbyists to capture legislators. I know that is a judgment call, and others will differ. I also have some hope that courts will strike down laws against using drones for reporting of matters of public interest, on First Amendment/free expression grounds.

The larger point is: we may well be at the cusp of a “this changes everything” moment with drones. Illah Reza Nourbakhsh’s book Robot Futures imagines the baleful consequences of modern cities saturated with butterfly-like drones, carrying either ads or products. Grégoire Chamayou’s A Theory of the Drone presents a darker vision, of omniveillance (and, eventually, forms of omnipotence, at least with respect to less technologically advanced persons) enabled by such machines. The present regulatory agenda needs to become more ambitious, since “black boxed” drone ownership and control creates a genuine Ring of Gyges problem.

Image Credit: Outtacontext.


When Love’s Promises Are Fulfilled By the U.S. Supreme Court

Today, in a 5-4 decision, the United States Supreme Court recognized the fundamental nature of love’s promises. In Obergefell v. Hodges, the Court held that “the Fourteenth Amendment requires a State to license a marriage between two people of the same sex and to recognize a marriage between two people of the same sex when their marriage was lawfully licensed and performed out-of-State.” Referring to marriage as a “keystone” of the U.S.’s “social order,” Justice Kennedy declared same-sex marriage bans unconstitutional. Importantly, the case makes clear that forcing gay couples to go across state lines to marry, only to deny them recognition after returning home, undermines fundamental principles of liberty.

It’s no surprise that Professor Martha Ertman’s powerful book, Love’s Promises: How Formal and Informal Contracts Shape All Kinds of Families, on which she toiled copiously and beautifully while rearing her son, debuts the summer that marriage equality becomes a fundamental right for gay men and women. Nor should anyone be surprised if the book, along with the decision itself, becomes a central text at universities and beyond. In what David Corn calls a “love letter to marriage” from the pen of Justice Kennedy, the Court reasoned:

“No union is more profound than marriage, for it embodies the highest ideals of love, fidelity, devotion, sacrifice, and family. In forming a marital union, two people become something greater than once they were. As some of the petitioners in these cases demonstrate, marriage embodies a love that may endure even past death. It would misunderstand these men and women to say they disrespect the idea of marriage. Their plea is that they do respect it, respect it so deeply that they seek to find its fulfillment for themselves. Their hope is not to be condemned to live in loneliness, excluded from one of civilization’s oldest institutions. They ask for equal dignity in the eyes of the law. The Constitution grants them that right.”

With that, the Supreme Court overruled the judgment of the Court of Appeals for the Sixth Circuit and set in motion the reversal of centuries’ worth of stigma, shame, and inequality, which may not be erased overnight but will ease over time. Professor Ertman might also suggest that by the decision, the Court resituates contracts too. That is to say, viewed through the lens of contracts, which serve as the core theoretical foundation of Love’s Promises, this decision recognizes a fundamental right in contract for gay men and women. Further, the case expands the “contract” franchise to include gay women and men.

Some scholars approach gay marriage primarily through the constitutional liberties encapsulated in the 14th Amendment, upholding equal protection for U.S. citizens regardless of their status; others approach the issue as a matter of privacy. For Professor Ertman, contracts offer an additional lens and much to deliberate about on matters of marriage, parenting, and familial intimacy. Professor Ertman’s writings on contract (The Business of Intimacy, What’s Wrong With a Parenthood Market?, and Reconstructing Marriage, to name a few) precede the book, and presaged its birth.

Here for example, in a passage from Chapter Eight, she explains that “[i]t takes two more trips to the lawyer’s office to hammer out terms that satisfy Karen, Victor, the attorney, and me, from lawyerly technicalities to the emotional terms we call “mush.” What started out as an addendum to Victor’s and my coparenting agreement has blossomed into a bouquet of wills and powers of attorney, alongside the amended parenting agreement.” She tells readers, “On the way downstairs, clutching documents still warm from the copying machine, Karen squeezes my hand, as if she too feels that signing all those dotted lines brought a family into being every bit as much as vows of forever that we plan to recite…” As she explains, “if you scratch the surface of marriage—straight or gay—you’ll find contracts there, too.”

Professor Ertman urges us to remember time and again that what builds relationships and sustains them is the formal and informal contracting that takes place daily in marriage; these contracts establish the foundation for marriage and what comes after. She works diligently in the book to demonstrate that love, too, undergirds contracts. That is to say, she wants readers to reimagine contracts not as the products of cold, calculated bargaining or business arrangements, though one must acknowledge contracts can be that too, even in marriage.  Often marriage is the product of love, intimacy, and warm innocence.  At other times, it is the product of business arrangements.  It was that too in the U.S. chattel system: contracts gave legal sufficiency to the buying, selling, bartering, and even destroying of slaves, including children (among them the Black biological offspring of slave owners). In light of that history, yet to be fully explored and appreciated in law, it is a formidable task to resituate or reintroduce contract in the space of families and intimacy. However, Professor Ertman rises to that challenge.

Like it or not, contracts pervade marriage and suffuse premarital agreements. Sometimes contracting in this regard attempts to resituate power and status after the marriage ends, providing the economically weaker spouse economic stability after the breakup. Martha highlights cases ranging from that of Catherine Simeone, who received a “raw deal,” to those of celebrities, including Michael Douglas and Beyonce. Who knew that Beyonce would receive $5 million for “each of their children” if she and Shawn Carter (otherwise known as Jay-Z) divorced? Professor Ertman might argue that despite the businesslike nature of contracts, these legal arrangements and agreements make most matters clearer for everybody. She explains that contracts, and even verbal agreements, provide information, context, and choice.

In Ertman’s life, it was a contract that bestowed on her wife, Karen, parenthood of their child—not something biological, legislative, or derived from courts. And she offers multiple reasons for readers to consider the salience of contracts in intimacy, including voluntariness, reciprocal promises, and equal status. She offers one more: love’s promises.


5 Great Novels About Privacy and Security

I am a lover of literature (I teach a class in law and literature), and I also love privacy and security, so I thought I’d list some of my favorite novels about privacy and security.

I’m also trying to compile a more comprehensive list of literary works about privacy and security, and I welcome your suggestions.

Without further ado, my list:

Franz Kafka, The Trial

Kafka’s The Trial begins with a man being arrested but not told why. In typical Kafka fashion, the novel begins badly for the protagonist . . . and then it gets worse! A clandestine court system has compiled a dossier about him and officials are making decisions about him, but he is left in the dark. This is akin to how Big Data can operate today. The Trial captures the sense of helplessness, frustration, and powerlessness when large institutions with inscrutable purposes use personal data and deny people the right to participate. I wrote more extensively about how Kafka is an apt metaphor for privacy in our times in a book called The Digital Person about 10 years ago.
