Big Data Brokers as Fiduciaries
In a piece entitled “You for Sale,” Sunday’s New York Times raised important concerns about the data broker industry. Let us add some more perils and seek to reframe the debate about how to regulate Big Data.
Data brokers like Acxiom (and countless others) collect and mine a mind-boggling array of data about us, including Social Security numbers, property records, public-health data, criminal justice sources, car rentals, credit reports, postal and shipping records, utility bills, gaming, insurance claims, divorce records, online musings, browsing habits culled by behavioral advertisers, and the gold mine of drug- and food-store records. They scrape our social network activity, which with a little mining can reveal our undisclosed sexual preferences, religious affiliations, political views, and other sensitive information. They may integrate video footage of our offline shopping. With the help of facial-recognition software, data mining algorithms factor into our dossiers the over-the-counter medicines we pick up, the books we browse, and the pesticides we contemplate buying for our backyards. Our social media influence scores may make their way into the mix. Companies such as Klout measure our social media influence, usually on a scale from 1 to 100, using variables like the number of our social media followers, frequency of updates, and number of likes, retweets, and shares. What’s being tracked and analyzed about our online and offline behavior is accelerating – with no sign of slowing down and no assured way for us to find out what is in our dossiers.
As the Times piece notes, businesses buy data-broker dossiers to classify those consumers worth pursuing and those worth ignoring (so-called “waste”). More often than not, those already in an advantaged position get better deals and gifts while the less advantaged get nothing. The Times piece rightly raised concerns about the growing inequality that such use of Big Data produces. But far more is at stake.
Government is a major client for data brokers. More than 70 fusion centers mine data-broker dossiers to detect crimes, “threats,” and “hazards.” Individuals are routinely flagged as “threats.” Such classifications make their way into the “information-sharing environment,” with access provided to local, state, and federal agencies as well as private-sector partners. Troublingly, data-broker dossiers have no quality assurance. They may include incomplete, misleading, and false data. Let’s suppose a data broker has amassed a profile on Leslie McCann. The social media posts scraped, information compiled, and videos scanned about “Leslie McCann” might include information about jazz artist “Les McCann” as well as information about a criminal with a similar name and age. Inaccurate Big Data has led to individuals’ erroneous inclusion on watch lists, denial of immigration applications, and loss of public benefits.
Employers also use data-broker dossiers in making interviewing and hiring decisions. Much like law enforcement’s use of digital dossiers, people rarely find out that their dossiers include misleading information. In 2009, data broker ChoicePoint provided an employer with a dossier on a Georgia man that falsely asserted that he had felony convictions. The employer refused to hire the man and, unlike the vast majority of employers acting on such information, explained the reason to him. Only after a congressman from Georgia contacted ChoicePoint was the false information removed from the man’s file.
Data-broker databases are gold mines for hackers. Leaked dossiers are costly for individuals who suffer identity theft, medical identity theft, fraud, and stalking. Individuals are saddled with debts and medical histories not their own, costing them jobs in the process.
The data broker industry operates largely without regulatory oversight (except on rare occasions when the FTC intervenes against a company making promises about security and privacy that it fails to keep). That’s also true of Big Data amassed by other private entities, with some narrow exceptions including health, financial, and video records.
For the most part, regulators and privacy advocates have conceptualized the issue as one of consumer autonomy—that is, consumers ought to have the opportunity to see what data brokers are storing about them and to correct it. This treats the enormous power of these large, interconnected, and rapidly proliferating databases as somehow preventable. Sadly, experience to date has shown these hopes to be naïve, with formalistic remedies proving no match for very real practical harms.
Although consumer autonomy is a noble goal that may be achievable in isolated instances, regulators and scholars must broaden their field of view. We must turn instead to the body of law regulating dangerous but practically inevitable power imbalances: fiduciary law. Lawyers inevitably begin with a huge advantage in information and power over their clients; corporate officials are well-positioned to dominate shareholders; guardians can dominate persons lacking full mental and legal competence. While various doctrines quite appropriately seek to maximize the autonomy of those at a disadvantage in these relationships, the law recognizes that substantial imbalances will likely remain.
Both private and public law hold fiduciaries to much higher standards of responsibility than presumptive equals operating at arm’s length. They are subject to greater standards of care than the ordinary reasonable person, their ability to contract with those over whom they have power is limited in important ways, and certain high-risk conduct is prohibited outright.