The FBI wants to exempt its next-level policing software from public scrutiny
After eight years working on a massive project with widespread implications for privacy in America, the FBI has announced that it wants its new policing software, which is already being rolled out to departments across the country, to be exempt from a federal privacy law. Further, it is giving the public only 21 days to research and comment on the matter, enraging many civil-rights groups.
The new system, known as Next Generation Identification, or NGI, is a federal mishmash of the latest biometric technologies. It incorporates facial recognition, iris recognition, palm printing, and fingerprinting, among other things, into a searchable federal database that is already shared with over 20,000 law enforcement agencies across the country. The database already holds over 120 million records.
For years, the FBI has been compiling this data and distributing the technology without issuing what is called a “System of Records Notice,” which is required by the Privacy Act, a federal law that governs the collection of some private information. The notice is required for any system that collects and uses Americans’ personal information, and must describe not only how the data is used, but how it is protected. Yet it was only last month that the FBI put the notice up in the Federal Register, along with its request for a Privacy Act exemption.
“The FBI waited over half a decade to publish a basic privacy notice about NGI. Now, the American people have 21 business days to comment on that system–and the FBI’s request to make most of it secret,” reads a letter signed by several civil-rights groups. “This is far too little time.”
Specifically, the letter, signed by groups such as the American Civil Liberties Union, the Arab American Institute, and the National Immigrant Justice Center, requests 30 extra days to comment on the proposal. One of the main points of contention is who the policy is likely to affect most, and how the removal of privacy protections might harm them.
“[The database] likely includes a disproportionate number of African Americans, Latinos, and immigrants,” it reads. “This is an extraordinarily broad proposal, and the system it affects is extraordinarily sensitive–particularly for the communities it may affect the most.”
Worse, they say, some of the core technology at the center of the new system is notoriously flawed when it comes to black and brown populations. Even the companies at the forefront of facial recognition technology have repeatedly been shown to have a harder time recognizing darker-skinned faces.
Last year, Google Photos made a huge blunder when its newly released technology wrongly identified black people’s faces as gorillas. In 2009, Hewlett-Packard’s MediaSmart webcam was unable to even register a black person’s face as a face, as a widely shared video demonstrated at the time. The same happened with Microsoft’s Kinect in 2010.
But at least part of the problem may lie in the tech bubbles this software is developed within. A 2011 study found that software built in East Asia was better at recognizing East Asian faces, while software developed in North America and Europe was better at recognizing white faces.
The study likened this reality to the cringeworthy “they all look the same to me” comments we sometimes hear from people talking about a race different from their own. “Our ability to perceive the unique identity of other-race faces is limited relative to our ability to perceive the unique identity of faces of our own race,” the researchers wrote.
That inherent bias gets transferred into software through training data: an algorithm learns to recognize the kinds of faces it is shown, so a training set dominated by one group produces software that performs worst on everyone else.
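To see how that transfer happens, consider a deliberately simplified sketch (a hypothetical toy, not any vendor’s actual system): a face detector trained on data where one group supplies 95 percent of the example faces. Nothing in the code singles anyone out, yet the skew in the training set shows up directly in the per-group detection rates:

```python
# A toy illustration of the "other-race effect" in trained software.
# Hypothetical throughout: 2-feature "images," synthetic groups A and B.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Non-faces cluster at the origin; each group's faces carry their
# signal along a different feature direction.
def non_faces(n): return rng.normal([0.0, 0.0], 1.0, (n, 2))
def group_a(n):   return rng.normal([2.5, 0.0], 1.0, (n, 2))
def group_b(n):   return rng.normal([0.0, 2.5], 1.0, (n, 2))

# Imbalanced training set: 950 group-A faces, only 50 group-B faces.
X = np.vstack([non_faces(1000), group_a(950), group_b(50)])
y = np.hstack([np.zeros(1000), np.ones(1000)])  # 0 = no face, 1 = face

detector = LogisticRegression().fit(X, y)

# Fresh test images: the detector finds nearly all group-A faces but
# misses a far larger share of the group-B faces it rarely saw in training.
print(f"group A detection rate: {detector.predict(group_a(5000)).mean():.1%}")
print(f"group B detection rate: {detector.predict(group_b(5000)).mean():.1%}")
```

The code never mentions race; the disparity comes entirely from what the training data did and didn’t contain.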
The FBI insists that the information in NGI won’t be used to positively identify anyone; rather, the system is meant to produce a ranked list of suspects in criminal cases. As such, it claims that “there is no false negative rate” for the technology, as it told the Electronic Frontier Foundation, one of the letter’s signatories, back in 2014.
Even so, there’s still “a very good chance that an innocent person will be put forward as a suspect for a crime just because their image is in NGI—and an even better chance this person will be a person of color,” noted the EFF in a blog post published alongside yesterday’s letter.
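The EFF’s point follows from how ranked-list search works. The FBI hasn’t published NGI’s matching algorithm, so the sketch below uses cosine similarity over face embeddings, a standard stand-in, to show the key property: a top-k search always returns k candidates, whether or not the person in the probe photo is enrolled at all.

```python
# Illustrative only: a gallery scaled far down from NGI's 120 million
# records, with random "embeddings" in place of a real face-recognition model.
import numpy as np

rng = np.random.default_rng(1)

gallery = rng.normal(size=(120_000, 128))            # enrolled faces
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

probe = rng.normal(size=128)                         # someone NOT enrolled
probe /= np.linalg.norm(probe)

scores = gallery @ probe                             # similarity to everyone
candidates = np.argsort(scores)[::-1][:50]           # ranked list of 50

print("candidate record IDs:", candidates[:5], "...")
print("top similarity score:", round(float(scores[candidates[0]]), 3))
```

A system built this way has no “no match” outcome: fifty people become candidates for investigation purely because their images sit in the database. That is also what makes the “no false negative rate” claim technically true, since a system that always returns a list never formally fails to match.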
The FBI’s crime data is already flawed in ways that can hurt people who come into contact with the system. In a 2013 report, the National Employment Law Project found that about half of the FBI’s records are missing information on whether the person was ultimately convicted or acquitted of the crime they were accused of committing. In many of the cases the report detailed, this led to people being denied employment based on arrests for crimes they were later cleared of. Because of disproportionate arrest rates, these problems fall primarily on people of color.
Without the protections of the Privacy Act, Tuesday’s letter claims, “private citizens could never take [the FBI] to court” to correct inaccuracies in the NGI data. Nor would the feds have to disclose where they are getting the data (private security cameras? state-issued driver’s license photos? employer-mandated background checks?), or how the data they collect on us is being used.
Communications between the federal government and other law enforcement agencies have long been mired in systemic problems. Overall, it’s probably a good thing that the FBI is heading this effort, using cutting-edge technology and putting all levels of law enforcement data into the same information-sharing system.
But that progress shouldn’t come at the expense of transparency, or of the government’s obligation to tell us how it uses the information it collects. Especially when the FBI has said in the past that it would like to deploy the technologies at “critical events,” which some suggest might include First Amendment-protected political rallies.
It’s taken eight years to get a proper privacy notice published about this new frontier of policing technologies. At the very least, it’s worthy of another month of our attention.
Daniel Rivero is a producer/reporter for Fusion who focuses on police and justice issues. He also skateboards, does a bunch of arts-related things in his off time, and likes Cuban coffee.