Congress needs to step up its regulatory game and enact some standards of use for facial recognition technology, at least as it's used by law enforcement.
That Amazon’s “Rekognition” system just falsely matched the faces of 28 members of Congress with criminal mugshots only underscores the dire need for some sort of speedy clampdown.
But fact is, even without this latest ACLU finding — even without the screaming headline “Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots” — facial recognition is an identification tool that carries too much potential for false positives.
Case in point: In 2016, a Denver man named Steve Talley was arrested on two separate occasions for bank robberies that were actually committed by someone with similar facial features. How did police find Talley? By feeding video images of the suspect’s face into databases.
Police, of course, say the technology helps them secure their communities.
Detectives in Arizona this past June tracked down and arrested a 43-year-old Maniac Latin Disciples gang member wanted on weapons, drug and assault charges. Police in West Palm Beach in November 2016 apprehended a woman for identity and bank account thefts based on data from an artificial intelligence program.
And the positives of facial scanning — the benefit to law enforcement, intelligence, border patrol and national security agencies alike — can’t be dismissed.
But neither can the negatives.
Facial scanning comes with the potential for bias: the algorithms behind the technology produce more false positives for women and black people than for white men. The same “Rekognition” technology the ACLU just tested, the one that misidentified 28 members of Congress as criminals, has already made its way into the city of Orlando for police use, The Verge reports, and has already found a home in an Oregon sheriff’s department, the ACLU reports.
That means the potential for more misidentification is growing.
If Congress would at least slow the facial scanning roll on police and enact provisions that hold law enforcement accountable for mistaken identities — by, say, legislating a clear right for victims to sue — much of the civil rights resistance to this technology would disappear. Congress could also pave the way toward more transparency by requiring law enforcement agencies that use facial recognition to open their data-collecting A.I. practices to public scrutiny and, therefore, to more oversight.
The technology companies, bloated with investor cash and fueled by a competitive spirit, aren’t likely to self-govern or quit marketing to law enforcement. Law enforcement and police-friendly local governments probably aren’t going to voluntarily give up their controls, either.
It’s going to be left to the federal government.
Perhaps now that the mistaken identities have landed at Congress’s own door, lawmakers might be a bit more willing to step into this regulatory ring.
First appeared in The Washington Times.