Aurora police hope to add facial recognition technology to crime-fighting tools
The Aurora Police Department wants to add facial recognition software to its crime-fighting toolkit. The technology, already used in other Colorado cities, has long been controversial. Civil rights advocates say it's a dangerous step that risks misidentification, privacy violations, and further harm to communities of color.
APD says that facial recognition is simply a modern version of what the public already does when asked to help identify a suspect from a photo.
"We oftentimes put out photographs of unknown folks, and we're hoping the community will recognize them," Aurora Police Commander Chris Poppe said. "We're just going to use software to do the same thing."
The technology would only be used to generate leads in active investigations.
"It would be like someone calling the police department and saying, 'I think I know who did that,'" said Poppe. "We still have a lot of investigative steps that we have to go through to make sure that the lead has some validity."
The proposed system would search two libraries: one of mug shots and another of publicly available photos pulled from open-source social media sites, like Facebook.
"Any kind of open source, meaning that it's public," Poppe said. "Facial recognition would give us an opportunity to compare that unknown image of a suspect with libraries of known images."
He emphasized that photos of "unknown people" submitted during an investigation would not be stored for future searches.
Anaya Robinson, Public Policy Director for the ACLU of Colorado, said facial recognition carries several risks.
"Facial recognition, historically, has a problem identifying certain populations of people, mainly Black folks. It's not great at accurately identifying women. It's not great at accurately identifying, generally, people of color," Robinson said. "It has some trouble when it comes to people with disabilities, because of height differentials. Misidentification is a huge concern."
Beyond accuracy, Robinson said privacy is at stake.
"If there aren't appropriate checks and balances in place, there are a lot of concerns there. Ultimately, the ACLU would love for facial recognition not to be in the hands of law enforcement," he said. "We've seen historically, the Aurora Police Department not having the best outcomes when it comes to communities of color. When you're using technologies like facial recognition that do have fairly high misidentification rates, those outcomes likely aren't going to get better."
Aurora police stress that safeguards are built in. The state has established guidelines for the use of facial recognition and requires accountability reports.
"Not only that, but we have several layers of human review. It is only a tip. We would have to then further substantiate and corroborate that the individual in the image is actually involved in our crime, and build probable cause," Poppe said.
For Robinson, even the best safeguards don't outweigh the risks.
"Facial recognition should never be used without meaningful human review," Robinson said. "But the bigger issue is, while facial recognition might be a tool in the toolbox to identify someone who has already committed a crime, the crime has already been committed. That community is not safer."
Under Colorado law, any law enforcement agency must seek council approval before deploying facial recognition.
APD has already presented its plan once to the City Council's Public Safety Committee and expects more hearings. The department estimates startup costs at about $16,000 in the first year, rising to about $67,000 by year four. The program would be paid for out of the department's existing budget.
The Aurora City Council will ultimately decide whether to approve the technology. The ACLU says it will continue pushing for stronger regulation at both the state and local levels and, ideally, for outright bans.