Amazon is drawing heat on multiple fronts over its facial-recognition technology. Days after being blasted by civil-liberties advocates, the retailer found itself at odds with a city pilot-testing the software.
Records released by the American Civil Liberties Union on Tuesday showed the Orlando, Florida, Police Department was trying out Amazon's facial-recognition service, dubbed "Rekognition." News stories, including one by CBS MoneyWatch, referenced a video in which Ranju Das, director of Rekognition at Amazon, described how Orlando was using the service.
"They have cameras all over the city. The authorized cameras are then streaming the data," he said. "We analyze the video in real time, [and] search against the collection of faces they have."
The depiction didn't go over so well in Orlando, leading city officials to scramble to contain the damage.
In a statement to the Orlando Sentinel, an Orlando police spokesman said the department's use of Rekognition is limited to eight city-owned cameras and that no images from the public are being used in the pilot.
"To be clear, this partnership with Amazon includes testing to see if the technology even works," the spokesperson told the newspaper. "As it is still very early on in this process, we have no data that supports or does not support that the Rekognition technology works."
The video, which features Das speaking at a recent developers conference in South Korea, now includes the following clarification from Amazon:
"Between minutes 31:29 and 32:19 of this video, an Amazon Web Services (AWS) spokesperson got confused and misspoke about the City of Orlando's use of AWS technologies. The City of Orlando is testing Amazon's Rekognition Video and Amazon Kinesis Video Streams internally to find ways to increase public safety and operational efficiency, but it's not correct that they've installed cameras all over the city or are using in production. We apologize for any misunderstanding."
An Amazon spokesperson did not immediately respond to a request for comment.