
Facial Recognition Use By North Texas Police Grows Along With Privacy Concerns

DALLAS (CBSDFW.COM) - Facial recognition has become mainstream.

The technology is used in airports and stores, and has become a standard feature on newer smartphones.

But as the use of the technology increases, so do privacy concerns.

The CBS 11 News I-Team discovered facial recognition is used by more than a half dozen North Texas law enforcement agencies, in some cases with little public awareness.

Facial recognition technology (CBS 11)

Local law enforcement agencies said they use the technology as a tool to generate leads in cases that otherwise might turn cold.

In November, Plano Police used its facial recognition system to identify a suspected shoplifter caught on security cameras at Neiman Marcus stealing high-end wallets.

Arlington Police used the technology in August 2018 to identify a wanted felon who was staying at a local motel.

But while detectives point to the technology's effectiveness in generating leads, law enforcement agencies also note its imperfections. At times, the technology can trigger an incorrect match.

"It's not perfect, but it does work," said Senior Detective Kevin Burkett, who runs the Irving Police Department's facial recognition system.

Burkett said the technology has helped his department solve dozens of crimes from shoplifting to a recent murder.

But the Irving detective said there are many misconceptions about the technology.

When the department runs a search in its system using a photograph or a still frame from a video, the result is not a single match. Instead, it's a list of the 200 closest matches based on a series of complex algorithms using thousands of facial points.

The 200 possible matches are ranked and given a score based on how closely they resemble the original picture.
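The workflow the detective describes — scoring every database entry against the probe image and returning a ranked shortlist rather than a single "match" — can be sketched roughly as follows. This is a hypothetical illustration only; the `similarity` measure here is a simple placeholder, not the algorithm any vendor or department actually uses.

```python
# Hypothetical sketch of a rank-and-score facial search, NOT a real vendor system.
# A probe image's features are compared against every database entry; the top
# candidates come back sorted by similarity score for human review.

def similarity(a, b):
    # Placeholder metric: fraction of matching feature values.
    # A real system would compare thousands of facial measurement points.
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), 1)

def rank_candidates(probe_features, database, top_n=200):
    """Score every database record against the probe and return the
    top_n highest-scoring candidates as (record_id, score) pairs."""
    scored = [(record_id, similarity(probe_features, features))
              for record_id, features in database.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_n]
```

The key point the article makes survives in the sketch: the system's output is a ranked list of candidates with scores, and a human investigator still has to review each one.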

Det. Burkett said he then personally looks at every potential match before handing off any potential leads to detectives.

"The limitation on the technology is mainly your databases," Det. Burkett explained. "The more photos you have the better the chance you have at finding the criminal that committed the offense."

Brian New with Irving Police Detective Kevin Burkett (CBS 11)

The database the Irving Police Department uses is made up of mug shots taken at the Irving Jail. The department is looking to expand its database to include mug shots from neighboring police departments that also use the technology.

While most local police departments use only mug shots, the Texas Department of Public Safety uses more than 24 million driver's license photos in its facial recognition database.

Privacy advocates say this means a false match could happen to nearly anyone in Texas.

"Many people who think they have never been in contact with law enforcement or that law enforcement would not have their information are part of this flip book of people," said Kali Cohn, a Dallas-based attorney with the American Civil Liberties Union (ACLU).

Sample Texas driver's license (credit: CBS 11 News)

Texas law on how law enforcement can use facial recognition technology is vague and department policies vary greatly.

Some, like the Irving Police Department, have a detailed written policy that requires reasonable suspicion or consent before the technology can be used.

But other local departments with the technology told the I-Team they currently do not have any policy for its use.

Privacy advocates also worry about the technology's accuracy.

"Research has been done on those algorithms and computer programs and we see that they are oftentimes unreliable, particularly unreliable for people of color," Cohn said.

Kali Cohn - ACLU (CBS 11)

Last year, the ACLU conducted a test on Amazon's facial recognition technology, comparing images of members of Congress with a database of criminal mug shots.

The software incorrectly matched 28 members of Congress as people who had been arrested. The false matches were disproportionately of people of color.

"If a system cannot identify faces of color as reliably as faces of white people, that's a big problem," Cohn said.

In response to the ACLU test, an Amazon spokesperson said the ACLU used an 80 percent confidence setting for its test when a higher setting should have been used.
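The dispute between Amazon and the ACLU comes down to the confidence threshold: a lower threshold reports more candidate matches, including more false ones, while a higher threshold reports fewer. A toy illustration of that trade-off — the names, scores, and cutoffs below are invented for the example and do not reflect Amazon's actual service:

```python
# Illustrative only: why the confidence setting matters in a face search.
# Hypothetical candidate matches with made-up similarity scores (0-100).
candidates = [("person_1", 99.2), ("person_2", 85.0), ("person_3", 81.5)]

def matches_above(candidates, threshold):
    """Report only candidates whose confidence score meets the threshold."""
    return [name for name, score in candidates if score >= threshold]

# At a threshold of 80, all three hypothetical candidates are reported as
# matches; raising the threshold to 99 leaves only the strongest one.
```

Under this framing, the same underlying scores yield very different numbers of reported "matches" depending solely on where the operator sets the cutoff — which is the crux of the disagreement over the ACLU's 80 percent setting.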

Tech companies say recent advances in the technology have improved accuracy, but privacy concerns remain as use by law enforcement grows.
