Amazon’s new facial recognition software is being tested by law enforcement in several US states. According to Amazon, the technology is designed to help police identify suspects and solve cases faster. However, the software may not be as reliable as it first seemed. Reps. John Lewis of Georgia and Bobby L. Rush of Illinois, along with 26 other lawmakers, are at the center of the new revelations. Lewis and Rush are both Democrats, members of the Congressional Black Caucus, and longtime civil rights activists. Yet Amazon’s facial recognition technology mistakenly identified all 28 as people who had been arrested for crimes.
Amazon’s technology has been used by law enforcement agencies and other organizations, and these revelations won’t do its reputation any good. The claims were made by the American Civil Liberties Union on Thursday morning, based on a series of tests the organization ran. The ACLU compared photos of all 535 members of Congress against a database of about 25,000 publicly available mugshots. The software matched 28 members of Congress with people who had previously been arrested, an error rate of roughly 5%. What made the results even more controversial was that Latino and African American members of Congress were disproportionately misidentified compared to their white counterparts.
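The ACLU’s headline figure works out as straightforward arithmetic; a quick check, assuming the widely reported total of 535 members of Congress:

```python
# Quick check of the ACLU's reported error rate:
# 28 false matches out of 535 members of Congress scanned.
members_scanned = 535
false_matches = 28

error_rate = false_matches / members_scanned
print(f"{error_rate:.1%}")  # 5.2%
```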
The ACLU concluded from the test that the facial recognition technology is not only “flawed” but also “biased and dangerous.” Amazon responded later in the day, saying the ACLU’s tests went against the recommendations the e-commerce giant provides to law enforcement agencies adopting the technology.
According to Amazon, the facial recognition software has been used for many beneficial purposes like preventing human trafficking and reuniting children with their families.
Amazon says that in the real world the technology is not used to make autonomous decisions on behalf of the police; it simply narrows down a list of candidates for human officers to review before any action is taken. Amazon also noted that the ACLU’s tests used the software’s default settings, under which any match with at least an 80% similarity score is reported. Police departments that adopt the technology, the company added, are advised to use a 95% confidence threshold instead.
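The dispute over default settings comes down to how a similarity threshold filters candidate matches. A minimal sketch of the idea, with hypothetical candidates and scores (this is an illustration, not Amazon’s actual API):

```python
# Sketch of confidence-threshold filtering. The candidate names and
# similarity scores below are invented for illustration only.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the similarity threshold."""
    return [name for name, score in candidates if score >= threshold]

# Hypothetical candidates returned for one probe photo.
candidates = [
    ("mugshot_0412", 81.3),
    ("mugshot_2977", 96.2),
    ("mugshot_1130", 79.9),
]

# At the 80% default, a marginal match is reported alongside a strong one.
print(filter_matches(candidates, 80))  # ['mugshot_0412', 'mugshot_2977']

# At the 95% threshold Amazon recommends for police use, only the
# strongest candidate survives.
print(filter_matches(candidates, 95))  # ['mugshot_2977']
```

Raising the threshold trades recall for precision: fewer people are flagged, but each flag is more likely to be a true match.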
Amazon’s facial recognition technology has drawn sharp criticism from privacy experts, especially since police departments began testing it. Supporters view it as a useful tool for identifying suspects; opponents see a surveillance system that is open to abuse.
The ACLU, together with nearly two dozen other civil liberties organizations, wrote a letter to Amazon CEO Jeff Bezos demanding that the company stop selling the technology to law enforcement. The groups argued that it could easily be used to track protesters and undocumented immigrants. Amazon, however, has maintained that there is no evidence so far that law enforcement has abused the technology.