No More Secret Dossiers: We Need Full FTC or CFPB Investigation of “Fourth Bureau” Reputation Intermediaries
There is a superb article by Ylan Q. Mui on the growth of new firms that create consumer reputations. They operate outside the traditional regulation of the three major credit bureaus. Mui calls this shadowy world of reputational intermediaries the “fourth bureau.” The Federal Trade Commission should conduct an immediate investigation of the “black box” practices described by an industry leader in the article. This should be part of a larger political and social movement to stop the collection of “secret dossiers” about individuals by corporate entities. The Murdoch scandal now unraveling in Britain is only the most extreme example of a wholesale assault on privacy led by unscrupulous data collectors.
Once a critical mass of data about a person has been collected for a commercial purpose, she deserves to know what the data is and who is gathering it. Once an educator, employer, landlord, banker, or insurer makes a decision based on that data, the affected individual should be able to challenge and correct it. I have made a preliminary case for such reforms in my chapter Reputation Regulation, in this book. I now think this agenda is more urgent than ever, given the creeping spread of unaccountable data mining in the internet sector to a wild west of reputational intermediaries.
From a Fair Credit Reporting Act to a Fair Reputation Reporting Act
To understand why, it’s helpful to take a step back and look at how poorly regulated even the established credit bureaus are. As Shawn Fremstad and Amy Traub have noted in the Demos report Discrediting America, ample empirical evidence has confirmed that a vast number of traditional credit bureau files are erroneous:
A 2008 Federal Trade Commission (FTC)-sponsored pilot study found that about 31 percent of people who reviewed their credit report found errors that they wanted to dispute. About 11 percent of people reported errors that were categorized by the FTC as “material”, i.e. errors that significantly affected credit scores…A 2011 study funded by the credit reporting industry and conducted by the Policy & Economic Research Council (PERC) was larger and more representative, finding that 19.2 percent of people who reviewed their credit reports identified information that appeared to be erroneous. 12.1 percent reported apparent errors that could have a material impact—mistakes that go beyond a misspelled name or incorrect address. (emphasis added)
Correcting such errors is a laborious and frustrating process. As one consumer lawyer has stated, “The legal responsibility of the credit reporting agencies and of the creditors is well established. . . . There is a requirement that they do meaningful research and analysis, and it is almost never done.” The bureaus also “have a two-tiered system for resolving errors — one for the rich, the well-connected, the well-known and the powerful, and the other for everyone else.” But at least they must make some response to the persons whose credit histories they maintain, and they must make these histories available to individuals.
For many entities in the fourth bureau, such elementary protections of transparency and due process are not available. One “collects account information for 63 of that industry’s largest firms — although the group’s director won’t specify which ones.” The data then appears to populate—and to help produce results from—a system that would have made the Star Chamber blush:
These dossiers go into what the industry calls a “black box” — a veil of secrecy surrounding the origins of the information, how it is analyzed and who buys it. Consumers have no voice in those decisions, even though the information concerns their lives. The data could help struggling borrowers prove they are ready for the financial mainstream. But the data can also penalize them for actions they didn’t realize were being tracked, forcing them to pay far higher interest rates or more fees.
Out of the black box comes a credit score that can be sold not only to lenders, but also colleges making tuition decisions, landlords choosing tenants or health-care providers determining financial aid. Every score out of the black box can be tailored for each of these buyers, even if it’s about the same person. “It’s kind of like buying a tailor-fitted suit,” said [one executive]. “When you build a custom model for your client, it just tends to fit better.”
A business model based on secret scoring practices takes some inspiration from FICO, an entity that has licensed scoring methods to the three major credit bureaus for years. FICO scoring is protected by trade secrecy. A large literature and FICO’s own ScoreSimulator (accessible for $19.95) can help individuals predict how various actions have affected or will affect their FICO score. FICO has also stated in Congressional testimony that it, like fourth bureau agencies, customizes scores for various clients, so it is unclear how relevant the single FICO score given to consumers via its website is to the various contexts for which it might be adjusted. But FICO has never publicly stated that it, like the fourth bureau agencies, bases its scores on secret data hidden from the scored individuals themselves. That practice is a dangerous precedent the FTC must address.
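The "custom model" practice described above can be made concrete with a toy sketch. Every feature, weight, and client name below is hypothetical (no real scoring model is public); the point is only that the same underlying consumer file can yield very different numbers depending on which buyer's tailored weights are applied:

```python
# Toy illustration of client-tailored ("custom model") scoring: the same
# consumer file produces different scores for different buyers.
# All features, weights, and client names are invented for illustration.

CONSUMER_FILE = {
    "on_time_payments": 0.9,   # fraction of bills paid on time
    "utility_arrears": 0.2,    # normalized past-due utility balance
    "address_changes": 3,      # moves in the last five years
}

# Each buyer of scores gets its own tailored weights ("a tailor-fitted suit").
CLIENT_MODELS = {
    "landlord": {"on_time_payments": 500, "utility_arrears": -300, "address_changes": -40},
    "lender":   {"on_time_payments": 700, "utility_arrears": -100, "address_changes": -10},
}

BASELINE = 300  # arbitrary floor, echoing familiar credit-score ranges

def score(consumer, client):
    """Apply one client's custom weight vector to the shared consumer file."""
    weights = CLIENT_MODELS[client]
    return BASELINE + sum(weights[f] * consumer[f] for f in weights)

for client in CLIENT_MODELS:
    print(client, round(score(CONSUMER_FILE, client)))
# The landlord and the lender see different scores for the same person,
# and neither the weights nor the feature list need ever be disclosed.
```

Note that nothing in this arrangement requires the scored individual to learn which features were collected, how they were weighted, or even that two different buyers received two different numbers about the same person.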
There have been other secret reputation services, but they differ from the “fourth bureau” in important ways. Avvo, a lawyer rating site, did not disclose how it rated lawyers, but did at least offer rated lawyers a chance to review the record the rating was based on. Game or virtual reality sites like CyWorld will rate players’ “karma” or “friendliness” in proprietary ways, but these ratings do not affect individuals’ efforts to gain access to basic consumer products or necessities of life. The National Consumer Telecom and Utilities Exchange (NCTUE), a utility-specific credit reporting agency, is very different, as Demos has documented:
Many utility companies base service and deposit decisions on utility-payment histories provided by the . . . NCTUE, which is essentially a utility-specific credit reporting agency operated by Equifax. Although NCTUE is basically a specialized credit reporting agency, few consumers applying for utility services are likely to even know that they have a NCTUE file that will be checked by the utility. Unlike other credit reporting agencies, NCTUE does not clearly provide individuals with a free copy of their reports upon request.
Moreover, in recent Congressional testimony, privacy expert Evan Hendricks noted that “it is not clear whether [utility companies using NCTUE reports are] providing ‘adverse action’ notices [required by the Fair Credit Reporting Act] to consumers so they’d know they were negatively affected by a NCTUE report.”
As Demos notes, “TXU Energy, the largest retail electrical provider in Texas, announced that it would charge differential rates based on customer’s credit,” until the Texas PUC intervened. So we are basically talking about a black box reputation intermediary stigmatizing certain individuals without their knowledge, and making it impossible or very expensive for them to, say, get air conditioning in a hot Texas summer. Or imagine poor, elderly people dependent on oxygen machines or other monitoring devices, suddenly charged exorbitant fees or deposits when they move residences. These are truly remarkable powers for “black box” reputational intermediaries to enjoy.
A Model Response from the New York Attorney General
In the years when the Bush administration’s finance regulators were “out to lunch” (or playing with chainsaws), NY AG Eliot Spitzer filled some of the vast vacuum they left behind. After Spitzer was toppled by scandal, Andrew Cuomo spearheaded many valuable initiatives. One of these responded to insurance companies that had created “black box” evaluation, ranking, and rating systems for doctors. His office launched an investigation of insurers’ physician ratings, which culminated in settlement agreements in 2007.
Cuomo claimed that the evaluation programs were confusing and unfair to both physicians and consumers. After negotiating with his office, insurance companies eventually agreed to follow the ranking guidelines in a national model (in cooperation and consultation with the American Medical Association and other provider trade organizations). The model agreements require “insurers to fully disclose to consumers and physicians all aspects of their ranking system.” Cuomo has advocated the codification of the model based on his written agreements with insurance companies. A Patient Charter for Physician Performance Measurement has also emerged as a project of the Consumer-Purchaser Disclosure Project (“CPDP”). The specific terms of the charter call for evaluations that are “meaningful to consumers” and bar decontextualized ratings based solely on cost.
Professor Kristin Madison has expertly surveyed and analyzed a range of physician quality measures, and describes the New York settlements in this way:
The agreements emphasize the importance of transparency, stressing that the methodology and data used must be fully disclosed . . . . The settlements embrace a model code for “physician performance measurement, reporting and tiering programs” based on the “core principles” of “accuracy and transparency of information, oversight of the process, and fairness in comparison of physicians.” The accuracy and transparency provisions include numerous requirements with respect to the nature of performance measurement. For example, the agreements require that measures of cost-efficiency be calculated and disclosed separately from measures of quality.
While they may in addition be combined to create a single ranking, the weight attributed to each portion must be disclosed. In short, parties to the agreements will no longer be able to designate a “high performance” tier that mixes quality and cost considerations, leaving consumers to wonder about what exactly “high performance” might mean. The agreements further specify that insurers “shall not conduct rankings based solely on cost-efficiency, but shall consider quality dimensions.” . . . [T]he agreements [also] require insurers to contract with a monitoring entity known as a “Ratings Examiner,” [whose] . . . task is to review the insurer’s rating programs with respect to all settlement agreement provisions.
Though Madison worried that the initial complaints against physician quality measurements undervalued insurers’ efforts, she reports that a broad range of stakeholders contributed to the final NY settlement.
Applying Principles of Transparency, Accuracy, and Relevance
Are there lessons here for the “fourth bureau”? I believe that the principles of the New York settlement should be broadly applied to all commercial reputational intermediaries. Open your data, open your methodology, and give your investigatees due process. Nearly everyone’s digital persona is sliced, diced, and redefined on the basis of Internet data and clandestine sorting systems, and we have little sense of how marketers, law enforcement officials, or other decision makers are using the data endlessly gathered and repackaged online. We need to know, now.
Unfortunately, when we try to protect our privacy and reputation by claiming some right to understand how online profiles are created and shared, we will likely find corporations countering that their own privacy protections—the propertized confidences of trade secrecy—should prevent public scrutiny of the algorithms they use to generate online reputations. But there are ways to neutralize the trade secret shield. Cases involving traditional credit scoring have already allowed extensive discovery of methods. A dedicated government agency could also do independent analysis in a confidential way, as I describe in Part IV of this article, “Beyond Innovation and Competition,” which examines search engines and carriers.
In “Beyond Innovation and Competition,” I was very careful to emphasize the social importance of respecting legitimate needs for preserving the confidentiality of innovators’ trade secret protected algorithms. In the case of reputation intermediaries, I am much less concerned. When a business model impinges so directly on the privacy of individuals, its pursuer does not deserve the privacy it is doing so much to disrupt in the lives of others. If “fourth bureau” entities want to make our lives an open book (or, more accurately, paint pictures of us that they claim are accurate predictors of our intent and character), they must be open as well. As David Brin argued in The Transparent Society, there is no other way to preserve freedom and democracy in an era of declining personal privacy. The watchers must be watched.
Few individuals realize just how long the struggle to rein in unaccountable reputational intermediaries has been. Consider this, just one of many incidents related at a 1970s Senatorial hearing:
Last year an Oklahoma man went into the Tulsa office of Retail Credit Co. to find out why he had been denied insurance. He brought with him Paul Polin, a management consultant who has been leading the fight for regulation of credit bureaus since 1960. They found 12 errors in the report, as well as obsolete information that the law requires to be deleted. Next day the manager of the branch office phoned the complainant at home and told him: “If you stop associating with Paul Polin, we’ll make sure you have an A-1 report.”
To be sure, the fourth bureau entities mentioned in Mui’s article are too savvy to make such blunt threats. But their black box methods can be even more devastating. Consider this incident:
Arkansas resident Catherine Taylor didn’t learn about the fourth bureau until she was denied a job at her local Red Cross several years ago. Her rejection letter came with a copy of her file at a firm called ChoicePoint that detailed criminal charges for the intent to sell and manufacture methamphetamines. The information was incorrect — she says the charges were for another woman with the same name and birth date — but it has haunted her ever since. Taylor said she has identified at least 10 companies selling reports with the inaccurate personal and financial information, wrecking her credit history so badly that she says she cannot qualify to purchase a dishwasher at Lowe’s. Taylor must apply for loans under her husband’s name and has retained an attorney to force the firms to correct the record. She has settled one case, and a trial in another is expected next week.
Catherine Taylor said the errors in her files have persisted despite several attempts to correct them. . . . It took Taylor four years to find a job after she was rejected from the Red Cross. Taylor said she has been turned down for an apartment and now lives in a house purchased through her sister. The stress of dealing with the consequences has exacerbated her diabetes and heart problems, she said.
As Elizabeth De Armond has written, “the power of mismatched information . . . to disrupt or even paralyze the lives of individuals has grown dramatically.” This is a system that coldly and foreseeably trashes individual reputations and then counts the occasional lawsuit as a “cost of doing business.” Furthermore, consider what might happen to Ms. Taylor after all this is resolved. What if there are stigmatizing variables in the “black box” scoring process for “litigiousness”? She may well be stigmatized merely for trying to stop her stigmatization.
Lest this seem like paranoid speculation, consider the fate of a homeowner who demanded to know who owned the stream of payments due from his mortgage. One would think that a sensible credit scoring system would reward those who have taken the trouble to demand this information about their mortgage. Unfortunately, precisely the opposite occurred in at least one case. One homeowner who followed all the instructions on the “Where’s the Note” website (detailing his rights under RESPA) experienced a “40 point hit” on his credit score. In the Heisenberg-meets-Kafka world of credit scoring, merely trying to figure out possible effects on one’s score can reduce it. FICO has at least acquiesced to regulation from the Federal Reserve and FTC. Who is to say what the clandestine “fourth bureau” agencies will do to those who question them?
Beyond the Paper Tiger
According to Mui, “rules that took effect this year” may help solve the problem by “requiring lenders to explain to consumers why they are denied credit or didn’t receive the best interest rate.” But if these explanations are anything like the “reason codes” used in the context of FICO scores and credit reporting, they will not be helpful. The FCRA currently requires up to four key factors adversely affecting a consumer’s credit score to be disclosed in the scoring report. Phrases like “type of bank accounts” and “type of credit references” are etiolated symbols, more suited for machine-to-machine interaction than personal explanation. Factors such as “too many revolving accounts” and “late payment” are a commonplace even for those with high credit scores. The law does not require credit scorers to tell individuals how much any given factor mattered to a particular score. The industry remains highly opaque, with scored individuals unable to determine the exact consequences of their decisions.
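The thinness of "reason code" disclosure can be illustrated with a minimal sketch. The factor names, point weights, and the notice format below are all invented for illustration; the sketch shows only the asymmetry the text describes: the most damaging factors are named, but their magnitudes never are.

```python
# Toy sketch of "reason code" disclosure: up to four key factors are
# named, but not how much each one moved the score.
# Factors, weights, and codes below are invented for illustration.

FACTOR_WEIGHTS = {
    "too_many_revolving_accounts": -35,
    "late_payment": -60,
    "type_of_bank_accounts": -5,
    "short_credit_history": -20,
    "high_utilization": -45,
}

def adverse_action_notice(active_factors, max_codes=4):
    """Name the most damaging factors, ranked worst-first, while
    omitting the per-factor point impact entirely."""
    ranked = sorted(active_factors, key=lambda f: FACTOR_WEIGHTS[f])
    return [f.replace("_", " ") for f in ranked[:max_codes]]

print(adverse_action_notice(list(FACTOR_WEIGHTS)))
# The consumer sees four factor names; the -60 vs. -5 point spread
# behind them stays inside the black box.
```

The notice gives the consumer no way to tell whether "late payment" cost five points or a hundred, which is precisely why such disclosures function more as machine-readable codes than as personal explanations.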
Recent FTC action against unaccountable reputation intermediaries in the pharmacy records field is also not encouraging. Health, life, and long-term care insurers have long used pharmacy data gathered by the companies MedPoint and IntelliScript in order to deny coverage to individuals with certain medical histories, or to jack up rates. (Mental health issues have been a particular red flag.) As Chad Terhune reported, a 2007 FTC investigation “found that the two companies supplying these pharmacy profiles violated federal law for years by keeping the system hidden from consumers.” In a settlement resulting from that case, the FTC found the following:
The medical profile generated by MedPoint is a consumer report as that term is defined in Section 603(d) of the Fair Credit Reporting Act, 15 U.S.C. § 1681a(d), because it bears on a consumer’s credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living, which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing a consumer’s eligibility for credit or insurance.
But the settlements did little more than require a disclosure when prescription information had motivated a denial of coverage or some other adverse action, along with an acknowledgment from the settling entities that the FCRA applied to them. No substantial penalties were imposed.
It is well beyond time for the FTC, or the new CFPB, to fully investigate the fourth bureau agencies. Good first steps would include a) cataloging them all, and, for a sample of them, b) finding out how many of their reports are erroneous, and c) assessing how responsive they are to investigatee complaints.
Why You Should Care
Mike Konczal recently noted that “the FTC has given a thumbs up to the Social Intelligence Corp. archiving seven years worth of people’s Facebook posts, posts that can then be used as part of their background checking service for job applicants.” He then reflected on the larger set of inequalities generated by an out-of-control private surveillance apparatus:
Credit reporting scores play a huge role in society but they have huge oversight gaps and major, systemic errors that bias against consumers. . . . [A Walzerian concept of] justice calls for multiple spheres of [influence] to keep each other in check[, and for no one sphere to overly influence another]. . . [N]obody should be precluded a social good y because of their lack of possession of an unrelated good x. That the sloppiness of credit scores, the protection of bankruptcy against bad debts, the brute luck of bad health, etc. could all preclude someone from obtaining basic utilities and access to productive labor – that inequality in net worth, health and other spheres preclude access to the sphere of labor regardless of one’s abilities – is something to be fought tooth-and-nail.
[Moreover,] [h]ard and soft forms of behavioral surveillance technology are maturing into things legible and deployable for employers and recruiters at exactly the moment when employees have little-to-no aggregate bargaining power. The unemployed . . . have to be willing to jump through a lot of hoops to secure work, and right now opening up the most revealing parts of their private lives are on the table for what is relevant for an employer to see.
[Finally,] as Jack Balkin warns, private firms can often be used to end-run constitutional protections against state searches and this is all part of a larger issue that goes beyond the War on Terror. These problems, of who has access to your information and what they are allowed to do with it, are only going to get bigger as algorithms and technology becomes more sophisticated.
True on all counts. In The Politics of Recognition, Charles Taylor explored the claims of individuals who felt that they were treated unfairly—or, worse, degraded and subordinated—on account of their ethnic identity. Taylor advanced discussion of multiculturalism by articulating the harm of misrecognition—of being understood by others in an untrue or insultingly unflattering light. For example, women are routinely treated unfairly (and even brutally) solely on the basis of their gender. Those dogged by digital scarlet letters may find whole new modes of discrimination blocking their professional or personal advance. Danielle Citron’s work has compellingly chronicled the problems these digital scarlet letters cause, and the lack of “technological due process” for those caught in the maw of modern data mining.
Of course, employers, colleges, and banks have a right to reject or approve applications as they see fit. But while it is one thing to be judged about a fault one knows about, it is a different experience altogether to have no idea what the basis of a negative judgment is. The credit bureaus and their partner-in-scoring, FICO, are already too lightly regulated. If fourth bureau agencies are given a “get out of oversight free” card by sluggish consumer protection bureaus, we can expect more private surveillance to shift to them. Data gatherers will become even more aggressive. No one who cares about privacy, transparency, or equality should welcome that future.
Image Credit: Still from film adaptation of Kafka’s The Trial.