The FBI’s Next Generation Identification (NGI) system has been fully operational for only two months, having been activated in early September, and already a federal judge has ruled that the NGI’s facial recognition database warrants scrutiny from open-government advocates.
According to an article in National Journal, U.S. District Judge Tanya Chutkan says the NGI program should be subject to significant transparency oversight because it poses a potential threat to privacy rights. Chutkan ruled in favor of the Electronic Privacy Information Center (EPIC) last week, validating a Freedom of Information Act lawsuit the organization filed last year.
EPIC sought the public release of a 2010 government report detailing the NGI biometric database, and in addition to the favorable ruling it has now been awarded $20,000 in legal fees.
The primary argument for transparency and public scrutiny of large facial biometrics databases like the NGI’s centers on accuracy rates and the potential for abuse, despite the FBI’s assurance that the technology will only be used to catch wanted persons.
The FBI has repeatedly described the intended function of its facial biometrics database. The Interstate Photo System, which went online as one of the latest NGI enhancements, is kept separate from other functions like RapBack, which notifies employers if criminal activity has been reported on persons holding positions of public trust.
Despite these assurances, privacy advocates worry that NGI-like technology can be used to collect images of and surveil innocent people. Combine that with distrust of the technology’s accuracy (the five-year-old report cites a 20 percent failure rate) and you have a troubling picture for Americans worried about privacy.
Facial recognition has borne the brunt of privacy fears among biometric modalities. Senator Al Franken has famously been vocal about his concerns surrounding the use of facial recognition technology on mobile devices. Privacy concerns have since stretched to other biometrics that can be passively collected, like voiceprints, underscoring the need for public transparency, a robust ecosystem of standards and practices, and education on how biometric technology is used.
November 10, 2014 – by Peter B. Counter