An editor's note on our report, "Facial Recognition"

In response to our story "Facial Recognition" (airdate May 16, 2021) -- about law enforcement's use of facial recognition technology to identify suspects -- we heard from some viewers who believe we should have included the work of computer scientist Joy Buolamwini and the organization she founded, the Algorithmic Justice League, regarding algorithmic bias. Ms. Buolamwini has also been in touch with us on this same issue.

While we did not interview Ms. Buolamwini on camera for this segment, her work in this field was an important part of our research. We were last in touch with Ms. Buolamwini in February. As we continued reporting, we sharpened our focus on the facial recognition technology that law enforcement is currently using to identify suspects and make arrests and, crucially, on the lack of well-established national guidelines around the technology's use. We also emphasized the human role in this process -- the police personnel tasked with interpreting facial recognition results -- and the accounts of two men who said they were wrongfully arrested. That alone took the full time we had to tell this story clearly and fairly.

That being said, we are very grateful to the dozens of sources -- off and on camera -- who helped us develop and focus this segment but were not mentioned by name. As with all our reporting, we spoke with a wide range of people, including some of the leading thinkers and researchers in the field, such as Ms. Buolamwini.
