As government agencies continue to push for the deployment of facial recognition systems, you needn’t look far to see why that’s bad news. To illustrate the point, the ACLU conducted a test of Amazon’s Rekognition software — facial recognition tech currently used by US law enforcement — in which it incorrectly identified 26 California lawmakers as matches in a criminal database. We’ll pause while you chuckle at the “politicians are criminals” jokes running through your head. This is the second time the ACLU has run this type of test; in the first, conducted last year, Rekognition was wildly…



Amazon’s facial recognition mistakenly labels 26 California lawmakers as criminals
Source: The Next Web