FBI Wants to Remove Privacy Protections From Its Massive Biometrics Database — NGI Disproportionately Impacts People of Color
NGI does not affect everyone equally. Thanks to years of well-documented racially biased police practices, the system includes a disproportionate number of African Americans, Latinos, and immigrants. Face recognition — NGI’s cornerstone biometric technology — is notoriously inaccurate across the board. (According to the FBI, NGI may produce a false match — indicating someone is a suspect for a crime they didn’t commit — at least 15 percent of the time.) But research suggests that face recognition may also misidentify African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively. So even though the FBI says NGI’s face recognition isn’t designed to positively identify anyone (it produces a ranked list of possible candidates), there’s a very good chance that an innocent person will be put forward as a suspect for a crime simply because their image is in NGI — and an even greater chance that this person will be a person of color.
NGI’s disparate impact is not limited to facial recognition inaccuracy, because FBI records as a whole are also notoriously unreliable. At least 30 percent of people arrested are never charged with or convicted of any crime. Yet according to the National Employment Law Project, as much as 50 percent of the FBI’s arrest records fail to include information on the final disposition of the case — whether a person was convicted, acquitted, or had the charges against them dropped. If these arrest records aren’t updated with final disposition information, hundreds of thousands of Americans searching for jobs could face prejudice in hiring and lose out on work. And due to disproportionately high arrest rates, this uniquely impacts people of color.