Category: Privacy (Law Enforcement)

Big Data Brokers as Fiduciaries

In a piece entitled “You for Sale,” Sunday’s New York Times raised important concerns about the data broker industry.  Let us add some more perils and seek to reframe the debate about how to regulate Big Data.

Data brokers like Acxiom (and countless others) collect and mine a mind-boggling array of data about us, including Social Security numbers, property records, public-health data, criminal justice sources, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, online musings, browsing habits culled by behavioral advertisers, and the gold mine of drug- and food-store records.  They scrape our social network activity, which with a little mining can reveal our undisclosed sexual preferences, religious affiliations, political views, and other sensitive information.  They may integrate video footage of our offline shopping.  With the help of facial-recognition software, data mining algorithms factor into our dossiers the over-the-counter medicines we pick up, the books we browse, and the pesticides we contemplate buying for our backyards.  Our social media influence scores may make their way into the mix.  Companies such as Klout measure our social media influence, usually on a scale from 1 to 100.  They use variables like the number of our social media followers, frequency of updates, and number of likes, retweets, and shares.  What’s being tracked and analyzed about our online and offline behavior is accelerating – with no sign of slowing down and no assured way to find out.
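Influence scoring of this sort can be sketched as a weighted combination of reach, activity, and engagement signals. The function below is a hypothetical toy model; the variable names, log scaling, and weights are my own assumptions for illustration, not Klout's actual (proprietary) algorithm:

```python
import math

def influence_score(followers, updates_per_week, likes, retweets, shares):
    """Toy influence score on a 1-100 scale (hypothetical weights, not Klout's)."""
    # Log scaling gives diminishing returns: a million followers is not
    # ten thousand times more influential than a hundred.
    reach = math.log10(followers + 1)
    activity = math.log10(updates_per_week + 1)
    engagement = math.log10(likes + retweets + shares + 1)

    # Hypothetical weights favoring audience size and engagement over raw posting volume.
    raw = 0.4 * reach + 0.2 * activity + 0.4 * engagement

    # Map the raw score (roughly 0-7 for even extreme accounts) onto 1-100.
    return max(1, min(100, round(raw / 7 * 100)))
```

Even a sketch like this makes the privacy point concrete: a handful of publicly observable counts is enough to rank everyone on a single commercial scale.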

As the Times piece notes, businesses buy data-broker dossiers to classify those consumers worth pursuing and those worth ignoring (so-called “waste”).  More often those already in an advantaged position get better deals and gifts while the less advantaged get nothing.  The Times piece rightly raised concerns about the growing inequality that such use of Big Data produces.  But far more is at stake.

Government is a major client for data brokers.  More than 70 fusion centers mine data-broker dossiers to detect crimes, “threats,” and “hazards.”  Individuals are routinely flagged as “threats.”  Such classifications make their way into the “information-sharing environment,” with access provided to local, state, and federal agencies as well as private-sector partners.  Troublingly, data-broker dossiers have no quality assurance.  They may include incomplete, misleading, and false data.  Let’s suppose a data broker has amassed a profile on Leslie McCann.  Social media scraped, information compiled, and videos scanned about “Leslie McCann” might include information about jazz artist “Les McCann” as well as information about a criminal with a similar name and age.  Inaccurate Big Data has led to individuals’ erroneous inclusion on watch lists, denial of immigration applications, and loss of public benefits.

The Right to Be Forgotten: A Criminal’s Best Friend?

By now, you’ve likely heard about the proposed EU regulation concerning the right to be forgotten.  The drafters of the proposal expressed concern for social media users who have posted comments or photographs that they later regretted. Commissioner Reding explained: “If an individual no longer wants his personal data to be processed or stored by a data controller, and if there is no legitimate reason for keeping it, the data should be removed from their system.”

Proposed Article 17 provides:

[T]he data subject shall have the right to obtain from the controller the erasure of personal data relating to them and the abstention from further dissemination of such data, especially in relation to personal data which are made available by the data subject while he or she was a child, where one of the following grounds applies . . . .

Where the controller referred to in paragraph 1 has made the personal data public, it shall take all reasonable steps, including technical measures, in relation to data for the publication of which the controller is responsible, to inform third parties which are processing such data, that a data subject requests them to erase any links to, or copy or replication of that personal data. Where the controller has authorised a third party publication of personal data, the controller shall be considered responsible for that publication.

The controller shall carry out the erasure without delay, except to the extent that the retention of the personal data is necessary: (a) for exercising the right of freedom of expression in accordance with Article 80; (b) for reasons of public interest in the area of public health in accordance with Article 81; (c) for historical, statistical and scientific research purposes in accordance with Article 83; (d) for compliance with a legal obligation to retain the personal data by Union or Member State law to which the controller is subject . . . .

BRIGHT IDEAS: Q&A with Bruce Schneier about Liars and Outliers

Bruce Schneier has recently published a new book, Liars and Outliers: Enabling the Trust that Society Needs to Thrive (Wiley 2012).  Bruce is a renowned security expert, having written several great and influential books including Secrets and Lies and Beyond Fear.

Liars and Outliers is a fantastic book, and a very ambitious one — an attempt to conceptualize trust and security.  The book is filled with great insights, and is a true achievement. And it’s a fun read too.  I recently conducted a brief interview with Bruce about the book:

Q (Solove): What is the key idea of your book?

A (Schneier): Liars and Outliers is about trust in society, and how we induce it. Society requires trust to function; without it, society collapses. In order for people to have that trust, other people must be trustworthy. Basically, they have to conform to the social norms; they have to cooperate. However, within any cooperative system there is an alternative strategy, called defection: to be a parasite and take advantage of others’ cooperation.

Too many parasites can kill the cooperative system, so it is vital for society to keep defectors down to a minimum. Society has a variety of mechanisms to do this. It all sounds theoretical, but this model applies to terrorism, the financial crisis of 2008, Internet crime, the Mafia code of silence, market regulation…everything involving people, really.

Understanding the processes by which society induces trust, and how those processes fail, is essential to solving the major social and political problems of today. And that’s what the book is about. If I could tie policymakers to a chair and make them read my book, I would.

Okay, maybe I wouldn’t.

Q: What are a few of the conclusions from Liars and Outliers that you believe are the most important and/or provocative?

A: That 100% cooperation in society is impossible; there will always be defectors. Moreover, that more security isn’t always worth it. There are diminishing returns — spending twice as much on security doesn’t halve the risk — and the more security you have, the more innocents it accidentally ensnares. Also, society needs to trust those we entrust with enforcing trust; and the more power they have, the more easily they can abuse it. No one wants to live in a totalitarian society, even if it means there is no street crime.

More importantly, defectors — those who break social norms — are not always in the wrong. Sometimes they’re morally right, only it takes a generation before people realize it. Defectors are the vanguards of social change, and a society with too much security and too much cooperation is a stagnant one.

Stanford Law Review Online: How the War on Drugs Distorts Privacy Law

The Stanford Law Review Online has just published an Essay by Jane Yakowitz Bambauer entitled How the War on Drugs Distorts Privacy Law. Professor Yakowitz analyzes the opportunity the Supreme Court has to rewrite certain privacy standards in Florida v. Jardines:

The U.S. Supreme Court will soon determine whether a trained narcotics dog’s sniff at the front door of a home constitutes a Fourth Amendment search. The case, Florida v. Jardines, has privacy scholars abuzz because it presents two possible shifts in Fourth Amendment jurisprudence. First, the Court might expand the physical spaces rationale from Justice Scalia’s majority opinion in United States v. Jones. A favorable outcome for Mr. Jardines could reinforce that the home is a formidable privacy fortress, protecting all information from government detection unless that information is visible to the human eye.

Alternatively, and more sensibly, the Court may choose to revisit its previous dog sniff cases, United States v. Place and Illinois v. Caballes. This precedent has shielded dog sniffs from constitutional scrutiny by finding that sniffs of luggage and a car, respectively, did not constitute searches. Their logic is straightforward: since a sniff “discloses only the presence or absence of narcotics, a contraband item,” a search incident to a dog’s alert cannot offend reasonable expectations of privacy. Of course, the logical flaw is equally obvious: police dogs often alert when drugs are not present, resulting in unnecessary suspicionless searches.

She concludes:

Jardines offers the Court an opportunity to carefully assess a mode of policing that subjects all constituents to the burdens of investigation and punishment, not just the “suspicious.” Today, drug-sniffing dogs are unique law enforcement tools that can be used without either individualized suspicion or a “special needs” checkpoint. Given their haphazard deployment and erratic performance, police dogs deserve the skepticism many scholars and courts have expressed. But the wrong reasoning in Jardines could fix indefinitely an assumption that police technologies and civil liberties are always at odds. This would be unfortunate. New technologies have the potential to be what dogs never were—accurate and fair. Explosive detecting systems may eventually meet the standards for this test, and DNA-matching and pattern-based data mining offer more than mere hypothetical promise. Responsible use of these emerging techniques requires more transparency and even application than police departments are accustomed to, but decrease in law enforcement discretion is its own achievement. With luck, the Court will find a search in Jardines while avoiding a rule that reflexively hampers the use of new technologies.

Read the full article, How the War on Drugs Distorts Privacy Law by Jane Yakowitz Bambauer, at the Stanford Law Review Online.

Cybersecurity Legislation and the Privacy and Civil Liberties Oversight Board

Along with many other privacy folks, I have a lot of concerns about the cybersecurity legislation moving through Congress.  I had an op-ed in The Hill yesterday going through some of the concerns, notably the problems with the overbroad “information sharing” provisions.

Writing the op-ed, though, prompted me to highlight one positive step that should happen in the course of the cybersecurity debate.  The Privacy and Civil Liberties Oversight Board was designed in large part to address information sharing.  This past Wednesday, the Senate Judiciary Committee had the hearing to consider the bipartisan slate of five nominees.

Here’s the point.  The debate on CISPA and other cybersecurity legislation has highlighted all the information sharing that is going on already and that may be going on in the near future.  The PCLOB is the institution designed to oversee problems with information sharing.  So let’s confirm the nominees and get the PCLOB up and running as soon as possible.

The quality of the nominees is very high.  David Medine, nominated to be Chair, helped develop the FTC’s privacy approach in the 1990s and has worked on privacy compliance since, so he knows what should be done and what is doable.  Jim Dempsey has been at the Center for Democracy and Technology for over 15 years, and is a world-class expert on government, privacy, and civil liberties.  Pat Wald is the former Chief Judge of the DC Circuit.  Her remarkably distinguished career includes major experience on international human rights issues.  I don’t have experience with the other two nominees, but the hearing exposed no red flags for any of them.

The debates about cybersecurity legislation show the centrality of information sharing to how government will respond to cyber-threats.  So we should have the institution in place to make sure that the information sharing is done in a lawful and sensible way, to be effective and also to protect privacy and civil liberties.

Stanford Law Review Online: The Dead Past

The Stanford Law Review Online has just published Chief Judge Alex Kozinski’s Keynote from our 2012 Symposium, The Dead Past. Chief Judge Kozinski discusses the privacy implications of our increasingly digitized world and our role as a society in shaping the law:

I must start out with a confession: When it comes to technology, I’m what you might call a troglodyte. I don’t own a Kindle or an iPad or an iPhone or a Blackberry. I don’t have an avatar or even voicemail. I don’t text.

I don’t reject technology altogether: I do have a typewriter—an electric one, with a ball. But I do think that technology can be a dangerous thing because it changes the way we do things and the way we think about things; and sometimes it changes our own perception of who we are and what we’re about. And by the time we realize it, we find we’re living in a different world with different assumptions about such fundamental things as property and privacy and dignity. And by then, it’s too late to turn back the clock.

He concludes:

Judges, legislators and law enforcement officials live in the real world. The opinions they write, the legislation they pass, the intrusions they dare engage in—all of these reflect an explicit or implicit judgment about the degree of privacy we can reasonably expect by living in our society. In a world where employers monitor the computer communications of their employees, law enforcement officers find it easy to demand that internet service providers give up information on the web-browsing habits of their subscribers. In a world where people post up-to-the-minute location information through Facebook Places or Foursquare, the police may feel justified in attaching a GPS to your car. In a world where people tweet about their sexual experiences and eager thousands read about them the morning after, it may well be reasonable for law enforcement, in pursuit of terrorists and criminals, to spy with high-powered binoculars through people’s bedroom windows or put concealed cameras in public restrooms. In a world where you can listen to people shouting lurid descriptions of their gall-bladder operations into their cell phones, it may well be reasonable to ask telephone companies or even doctors for access to their customer records. If we the people don’t consider our own privacy terribly valuable, we cannot count on government—with its many legitimate worries about law-breaking and security—to guard it for us.

Which is to say that the concerns that have been raised about the erosion of our right to privacy are, indeed, legitimate, but misdirected. The danger here is not Big Brother; the government, and especially Congress, have been commendably restrained, all things considered. The danger comes from a different source altogether. In the immortal words of Pogo: “We have met the enemy and he is us.”

Read the full article, The Dead Past by Alex Kozinski, at the Stanford Law Review Online.

Facebook Subpoenas, Open Court Records, Here We Go Again

The Boston Phoenix has an article about what Facebook coughs up when a subpoena is sent to the company. The paper came across the material as it worked on an article called Hunting the Craigslist Killer. The issues that come to mind for me are:

1. Privacy after death? In my article Property, Persona, and Preservation, which uses the question of who owns email after death, I argue that privacy after death isn’t tenable. The release of information after someone dies (here, the man committed suicide; as ZDNet put it, “the man committed suicide, which meant the police didn’t care if the Facebook document was published elsewhere, after robbing two women and murdering a third”) brings up a question Dan Solove and I have debated: what about those connected to the dead person? The facts here matter.

2. What are reasons to redact or not release information? Key facts about redaction and public records complicate the question of death and privacy. I’m assuming the person has no privacy after death. But his or her papers may reveal information about those connected to the dead person. In this case the police did not redact, but the paper did. Sort of.

This document was publicly released by Boston Police as part of the case file. In other case documents, the police have clearly redacted sensitive information. And while the police were evidently comfortable releasing Markoff’s unredacted Facebook subpoena, we weren’t. Markoff may be dead, but the very-much-alive friends in his friend list were not subpoenaed, and yet their full names and Facebook ID’s were part of the document. So we took the additional step of redacting as much identifying information as we could — knowing that any redaction we performed would be imperfect, but believing that there’s a strong argument for distributing this, not only for its value in illustrating the Markoff case, but as a rare window into the shadowy process by which Facebook deals with law enforcement.

As the comments noted and the explanation admits, the IDs and other information of the living are arguably in greater need of protection. It may have been that the police needed all the information for its case, but why release it to the public?

Obvious Closing: As we put more into the world, it will come back in ways we had not imagined. I doubt that bright-line rules will ever work in this space. But it seems to me that some sort of best practices informed by research (think Lior Strahilevitz’s A Social Networks Theory of Privacy) could allow for reasonable, useful privacy practices. The hardest part for law and society in general is that this area (information-related law) is not likely to be stable for some time. That being said, I think that the insane early domain name law (yes, someone could think that megacorpsucks.com is sponsored by megacorp) corrected itself in about 10 years. Perhaps privacy and information practices will reach an equilibrium that allows the law to stabilize. Until then, practices, businesses, science, and the law will twirl around each other as society sorts out what balance makes sense (until something messes with that moment).

HT: CyberNetwork News

Stanford Law Review, 64.2 (2012)

Volume 64 • Issue 2 • February 2012

Articles
National Security Federalism in the Age of Terror
Matthew C. Waxman
64 Stan. L. Rev. 289

Incriminating Thoughts
Nita A. Farahany
64 Stan. L. Rev. 351

Elective Shareholder Liability
Peter Conti-Brown
64 Stan. L. Rev. 409

Note
Harrington’s Wake:
Unanswered Questions on AEDPA’s Application to Summary Dispositions

Matthew Seligman
64 Stan. L. Rev. 469

Comment
Boumediene Applied Badly:
The Extraterritorial Constitution After Al Maqaleh v. Gates

Saurav Ghosh
64 Stan. L. Rev. 507

Dockets and Data Breach Litigation

Alessandro Acquisti, Sasha Romanosky, and I have a new draft up on SSRN, Empirical Analysis of Data Breach Litigation.  Sasha, who’s really led the charge on this paper, has presented it at many venues, but this draft is much improved (and is the first public version).  From the abstract:

In recent years, a large number of data breaches have resulted in lawsuits in which individuals seek redress for alleged harm resulting from an organization losing or compromising their personal information. Currently, however, very little is known about those lawsuits. Which types of breaches are litigated, which are not? Which lawsuits settle, or are dismissed? Using a unique database of manually-collected lawsuits from PACER, we analyze the court dockets of over 230 federal data breach lawsuits from 2000 to 2010. We use binary outcome regressions to investigate two research questions: Which data breaches are being litigated in federal court? Which data breach lawsuits are settling? Our results suggest that the odds of a firm being sued in federal court are 3.5 times greater when individuals suffer financial harm, but over 6 times lower when the firm provides free credit monitoring following the breach. We also find that defendants settle 30% more often when plaintiffs allege financial loss from a data breach, or when faced with a certified class action suit. While the compromise of financial information appears to lead to more federal litigation, it does not seem to increase a plaintiff’s chance of a settlement. Instead, compromise of medical information is more strongly correlated with settlement.
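The abstract’s headline numbers (“odds 3.5 times greater,” “over 6 times lower”) are odds ratios from binary outcome (logistic) regressions. As a minimal sketch of what such a figure means, here is the unadjusted odds ratio computed from a 2x2 contingency table; the counts below are invented to reproduce the 3.5 figure for illustration and are not the paper’s data:

```python
# 2x2 contingency table of hypothetical breach counts:
#                       litigated   not litigated
#   financial harm          a             b
#   no financial harm       c             d
def odds_ratio(a, b, c, d):
    """(a/b) / (c/d): how many times greater the odds of litigation
    are when financial harm is present versus absent."""
    return (a * d) / (b * c)

# Invented counts: 35 of 45 harm breaches litigated vs. 20 of 40 no-harm breaches.
# Odds with harm = 35/10 = 3.5; odds without = 20/20 = 1.0; ratio = 3.5.
print(odds_ratio(35, 10, 20, 20))
```

The paper’s regressions produce the adjusted analogue of this quantity, holding other breach characteristics constant rather than comparing raw counts.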

A few thoughts follow after the jump.

Operation Virtual Shield (aka Persistent Video Surveillance Coming Soon)

According to Government Technology, a network of public and private surveillance cameras increasingly monitors our daily lives.  Chicago’s Police Department’s network, called “Operation Virtual Shield,” directs video feeds from roughly 10,000 privately-owned cameras and roughly 10,000 public-sector cameras to law enforcement personnel.  That includes more than 4,500 cameras in Chicago public schools, 3,000 cameras in public housing, and 1,000 cameras at O’Hare Airport.  Atlanta’s Video Integration Center similarly uses feeds from the private sector, soon possibly including feeds from the CNN Center. Pre-existing agreements (memoranda of understanding) facilitate the arrangement.  And what luck for law enforcement, according to Chicago’s managing deputy director of public safety: “If the police wanted the video and the private facility owner didn’t want to hand it over, there’d have to be some kind of a court order or subpoena.  With the agreements in place, obviously we’ve got an inventory of cameras by location.  It saves lots of time as a forensics tool as well.”  Now, there’s no need to bother with court orders or subpoenas.  Just sign the agreement and it’s frictionless sharing, much as may soon be possible in the private sector with changes to the Video Privacy Protection Act.

These “Virtual Shield” feeds likely make their way into fusion centers, raising concerns about oversight and civil liberties as my co-blogger Frank Pasquale and I addressed in Network Accountability for the Domestic Intelligence Apparatus.  The cameras are expensive and their efficacy isn’t entirely clear.  Season 4 of The Wire brought home the limitations of cameras: Snoop knocked out a Baltimore city camera and then proceeded into a house to kill someone.  Of course, if we put up cameras everywhere, it may be difficult for criminals to knock them all down.  That may just be the future for Operation Virtual Shield.

Image: Wikimedia Commons