Biometric Databases and Quantitative Privacy

The new $1 billion Next Generation Identification (NGI) system is now in its rollout phase. NGI, a joint project of federal, state, and local law enforcement and other agencies, is a nationwide network of databases containing records of the body’s characteristics, such as fingerprints, irises, retinas, voices, and faces. Here is a little primer on how biometric systems work (see my SoCal Reservoirs of Danger article). Databases store biometric information either as images or as mathematical representations of those images, called templates. The biometric system matches an individual’s fingerprint, for instance, against an image or template stored in its databases. Aside from governmental forays into biometric collection and use (which are many), private biometric providers hold templates of millions of individuals. Elementary schools, airports, gas providers, grocery stores, health clubs, workplaces, and even Disney’s theme parks collect iris scans and fingerprints to secure access to physical plants and/or accounts. Companies are reportedly creating central clearinghouses of biometric information for commercial use.
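
For readers who want to see the mechanics, here is a toy sketch of template matching in Python. It is purely illustrative: the feature extraction, the similarity measure, and the threshold are stand-ins I invented, not what IAFIS, NGI, or any commercial vendor actually uses.

```python
# Illustrative sketch only: a toy "template" is a fixed-length feature vector,
# and matching is a nearest-neighbor comparison against an enrolled database.
# Real systems rely on proprietary feature extraction and matching algorithms;
# the function names and the 0.9 threshold here are invented for illustration.

import numpy as np

def extract_template(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real feature extractor (e.g., fingerprint minutiae).
    Here we just flatten the image and normalize it into a unit vector."""
    vector = image.astype(float).flatten()
    return vector / (np.linalg.norm(vector) + 1e-9)

def match(probe: np.ndarray, database: dict, threshold: float = 0.9):
    """Compare a probe against every enrolled template and return the best
    match if its similarity clears the threshold, else (None, best_score)."""
    probe_template = extract_template(probe)
    best_id, best_score = None, -1.0
    for person_id, template in database.items():
        score = float(np.dot(probe_template, template))  # cosine similarity
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Usage: enroll two toy "images," then probe with a degraded copy of one of them.
rng = np.random.default_rng(0)
image_a = rng.random((16, 16))
image_b = rng.random((16, 16))
enrolled = {"person_A": extract_template(image_a),
            "person_B": extract_template(image_b)}

noisy_a = image_a + 0.05 * rng.random((16, 16))   # same source, noisier capture
print(match(noisy_a, enrolled))                   # -> ("person_A", score near 1.0)
print(match(rng.random((16, 16)), enrolled))      # unenrolled probe -> (None, lower score)
```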

According to Assistant Director Tom Bush of the Criminal Justice Information Services Division, NGI is a “state-of-the-art identification system that will be bigger, faster, and better than IAFIS (Integrated Automated Fingerprint Identification System).” It is “bigger” because it will increase fingerprint storage capacity, house multimodal biometric records like palm prints and iris scans, and have room to accommodate future biometric technologies (e.g., voice, gait) as they become available. It is “faster” because it will cut the response time for high-priority criminal ten-print submissions from two hours to about 10 minutes on average. It is “better” because going beyond fingerprints as biometric identifiers will enhance the investigative and identification processes. Adding palm prints makes sense, according to Bush, because latent prints left behind by criminals at crime scenes are often palm prints. NGI is also being developed “to be compatible with other U.S. biometric systems and potentially with those of some foreign partners.”

The FBI’s NGI website proclaims that its many virtues include:

Interstate Photo System Enhancements


Currently, the IAFIS can accept photographs (mugshots) with criminal ten-print submissions. The Interstate Photo System (IPS) will allow customers to add photographs to previously submitted arrest data, submit photos with civil submissions, and submit photos in bulk formats. The IPS will also allow for easier retrieval of photos, and include the ability to accept and search for photographs of scars, marks, and tattoos. In addition, this initiative will also explore the capability of facial recognition technology.

Multimodal Biometrics


The future of identification systems is currently progressing beyond the dependency on a unimodal (e.g., fingerprint) biometric identifier towards multimodal biometrics (e.g., voice, iris, facial). The NGI Program will advance the integration strategies and indexing of additional biometric data that will provide the framework for a future multimodal system that will facilitate biometric fusion identification techniques. The framework will be expandable, scalable, and flexible to accommodate new technologies and biometric standards, and will be interoperable with existing systems. Once developed and implemented, the NGI initiatives and multimodal functionality will promote a high level of information sharing, support interoperability, and provide a foundation for using multiple biometrics for positive identification.
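
The FBI does not say what “biometric fusion identification techniques” means algorithmically, but a common approach in the biometrics literature is score-level fusion: each modality’s matcher produces its own match score, and a weighted combination drives the final identification decision. Here is a minimal sketch; the modalities, weights, and threshold are invented placeholders, not NGI’s actual (unpublished) parameters.

```python
# Illustrative score-level fusion: each modality's matcher returns a score in [0, 1],
# and a weighted average decides whether the combined evidence identifies the person.
# The modalities, weights, and threshold below are assumptions for illustration only.

MODALITY_WEIGHTS = {"fingerprint": 0.5, "iris": 0.3, "face": 0.2}

def fuse_scores(scores: dict, threshold: float = 0.75) -> bool:
    """Combine per-modality match scores; modalities that were not captured
    contribute nothing, so the fused score is renormalized over those present."""
    present = {m: w for m, w in MODALITY_WEIGHTS.items() if m in scores}
    if not present:
        return False
    total_weight = sum(present.values())
    fused = sum(scores[m] * w for m, w in present.items()) / total_weight
    return fused >= threshold

# A strong fingerprint match can carry a weak face match ...
print(fuse_scores({"fingerprint": 0.95, "face": 0.40}))   # True  (fused ≈ 0.79)
# ... but two mediocre scores fall below the threshold.
print(fuse_scores({"iris": 0.60, "face": 0.55}))          # False (fused ≈ 0.58)
```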

Right now, NGI’s biometric data is seemingly obtained from state, local, and federal law enforcement mug shots, DNA databases, driver ID photographs, and, I’m guessing, the information sharing environment. According to the FBI, a Privacy Impact Assessment has been conducted demonstrating NGI’s compliance with the Privacy Act of 1974, and a System of Records Notice has been filed.

Several privacy concerns are clear. NGI may be capable of searching systems that do not, and need not, comply with the Privacy Act (which includes data minimization requirements, such as purpose limits) or with 28 C.F.R. pt. 23 (which requires reasonable suspicion that a person committed a crime before collecting information, a regulation passed post-COINTELPRO). NGI could access privately generated biometric databases (everything from stores’ private security camera feeds to large commercial biometric providers) to find matches, even though those private systems lack the procedural protections of federal law. So, too, NGI could (and surely will) access the federal/state/local cooperatives known as fusion centers. As Frank Pasquale and I have explored in our Network Accountability piece published in the Hastings Law Journal, fusion centers mine information posted online, footage from private security cameras, and the systems of private partners. Again, no procedural protections govern the collection of that data.

These possibilities certainly make real the concerns raised on my favorite news aggregator site Slashdot: that this billion-dollar system will ensure governmental access to every public trace of faces, fingerprints, palms, and irises, whether online or offline, for investigations and for efforts to identify “threats, crimes, and hazards,” the mission of most fusion centers. Ultimately, technologies like NGI enable the kind of indiscriminate and highly scalable surveillance that raises the specter of a surveillance state, concerns that David Gray and I tackle in an article we recently posted on SSRN. The constitutional implications of mass quantities of data were at the heart of five Justices’ concurrences in United States v. Jones. David Gray and I propose a technology-centered approach to quantitative privacy, one that avoids the messy and unprincipled inquiries into mosaics of data.

1 Response

  1. RENEE says:

    As you mention in this article, companies collect certain biometric data (e.g., Siri (voice), Disney (retina)), and typically these companies secure broad rights to this data via the standard agreements (a separate but equally important issue/topic), leaving me to wonder how the data that Disney has about me could end up in here. Are there any rules governing how the NGI data is collected? The sources, etc.?
    Would love to hear your thoughts.