Facial-Recognition Technology Wrongly Flagged 26 California Lawmakers As Criminals

SAN FRANCISCO (KPIX 5) – A recent test of Amazon's facial recognition technology reportedly ended with a major fail, as some state lawmakers turned up as suspected criminals.

The test performed by the American Civil Liberties Union screened 120 California lawmakers' images against a database of 25,000 mugshots. Twenty-six of the lawmakers were wrongly identified as suspects.

The ACLU said the findings show the need to block law enforcement from using this technology in officers' body cameras. Meanwhile, supporters of facial recognition say police could use the technology to help alert officers to criminals, especially at large events.

Assemblymember Phil Ting (D-San Francisco) was among those falsely identified.

"While we can laugh about it as legislators, it's no laughing matter if you are an individual who is trying to get a job, for an individual trying to get a home," Ting said at a news conference on Tuesday. "If you get falsely accused of an arrest, what happens? It could impact your ability to get employment."

Ting has introduced AB1215, a bill that would ban police departments from using any facial recognition technology on body cameras. The bill has passed the Assembly and is now in the State Senate.

Amazon sent a statement to KPIX 5, which read in part, "... facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking ..."

The tech giant said it continues to advocate for federal legislation governing facial recognition technology to ensure responsible use, and accused the ACLU of misusing and misrepresenting its technology to make headlines.
