A coalition that includes the ACLU, EFF, and 50 other organizations has asked the Department of Justice to investigate how the FBI and police are using large-scale facial recognition databases in criminal investigations. The letter comes alongside a new report that claims around half of American adults are effectively part of these databases.
The report, released by the Georgetown Law Center on Privacy and Technology, draws on both existing data and material obtained through public records requests. It notes that at least 26 states, some of them previously identified, let law enforcement scan Department of Motor Vehicles photos as part of investigations. Based on the number of licensed drivers in each state, the study’s authors calculate that this covers 117 million adults, or 48 percent of the total adult population. The licenses aren’t held in one central index; they’re spread across multiple state-level databases.
Searching through DMV photos of people with no police record is a significant expansion from scanning a criminal database. It’s also different from the DMV’s own scanning, which looks for duplicate faces on licenses to detect fraud. But the report concludes that no state has comprehensively regulated the practice. This means there’s little oversight of how police use these systems, how well they work, and whether they disproportionately affect communities of color. The problem becomes more pronounced with cutting-edge technologies that are just starting to see use, like systems that scan live video streams to identify people as they walk past a camera.
The report’s authors stress that they aren’t categorically against facial recognition. “This is not a report that calls for a general ban on all facial recognition,” says co-author and center executive director Alvaro Bedoya. But among other things, the group wants states to adopt legislation governing how the technology is used: requiring a “reasonable suspicion” of criminal conduct before searching databases, requiring court orders for searches, and establishing regular audits of systems’ performance. They also suggest a moratorium, covering both local police and the FBI, on searching driver’s license photos until state legislatures vote on the issue. The report includes a model bill that both state and federal lawmakers could adopt.
Facial recognition algorithms are becoming increasingly powerful and capable of sorting through huge amounts of data. But they remain fallible, especially when the photo investigators hope to match against a driver’s license or mugshot was taken in poor light or at an angle. They also appear to carry unintended biases. A 2012 study, for example, found that major algorithms were 5 to 10 percent less accurate on African-Americans than on white subjects. Combined with the existing tendency to police minority communities more closely, this could increase the chance that innocent black Americans will get caught up in the criminal justice system. But because of low transparency and inadequate auditing, we can’t actually tell whether this is happening.
Beyond the possibility of mistakes, there are larger philosophical questions about how much power law enforcement agencies should have to identify bystanders, and when they should be able to exercise it. Baltimore police, for example, were found using a social media surveillance tool with limited facial recognition capabilities to identify and monitor protesters, a practice that could chill free speech. Today, livestream-analyzing tools can only match a small group of faces against a crowd. As processing power increases, though, they could hypothetically identify anyone with a driver’s license almost instantly, at least in states that have granted access. As the technology advances, drawing a line between policing and invasive surveillance will be an unavoidable part of the debate over facial recognition.
In addition to the findings, a website dedicated to the report offers an interactive map and a state-by-state scorecard.