Police and government agencies are relying more and more on facial recognition software to hunt for suspects, even though the technology is notorious for falsely identifying innocent people as criminals.
In one of the most recent cases, Amazon’s facial recognition technology falsely matched 27 professional athletes to mugshots in a criminal database. While professional football players have been known to get into trouble from time to time, this was plainly a failure of the software.
The software, called “Rekognition,” has made numerous high-profile mistakes since its launch in 2016. It is the same system that falsely matched one in five California lawmakers to mugshots, although depending on your perspective on politics, you might think they are all criminals.
The Massachusetts chapter of the ACLU tested Rekognition by scanning the headshots of 188 professional athletes from New England sports teams, including the Boston Bruins, the Boston Celtics, the New England Patriots, and the Boston Red Sox. The headshots were run against a database of 20,000 mugshots, and the software falsely matched 27 of the players.
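The ACLU has not published the exact code or settings behind its test, but a batch comparison along these lines can be sketched with Amazon’s boto3 SDK. The collection name, file paths, and threshold below are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of a Rekognition matching test like the one the ACLU describes.
# Assumes a face collection named "mugshots" has already been populated with
# index_faces(); the collection name, paths, and threshold are illustrative only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def count_false_matches(headshot_paths, collection_id="mugshots", threshold=80):
    """Search each athlete headshot against the mugshot collection and collect any hits."""
    matches = []
    for path in headshot_paths:
        with open(path, "rb") as f:
            response = rekognition.search_faces_by_image(
                CollectionId=collection_id,
                Image={"Bytes": f.read()},
                FaceMatchThreshold=threshold,  # similarity cutoff, 0-100
                MaxFaces=1,
            )
        if response["FaceMatches"]:  # any hit is a false match, since no athlete
            matches.append(path)     # is actually in the mugshot database
    return matches
```

In a test like this, every face returned by the search is by definition a false positive, because none of the people being scanned appear in the mugshot collection.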
Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts, said that facial recognition is dangerous whether it works or not.
“The results of this scan add to the mounting evidence that unregulated face surveillance technology in the hands of government agencies is a serious threat to individual rights, due process, and democratic freedoms. Face surveillance is dangerous when it doesn’t work, and when it does,” Crockford said in a statement.
Amazon Web Services is standing by its technology, claiming that the ACLU chapter was “misrepresenting” its product.
“As we’ve said many times in the past, when used with the recommended 99 percent confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use,” a statement from the company said.
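For context, the confidence threshold Amazon refers to is simply a parameter on the API call. A minimal sketch of a one-to-one comparison at the 99 percent setting, using placeholder image files, might look like this:

```python
# Minimal sketch: a one-to-one face comparison at the 99 percent threshold
# Amazon recommends for law enforcement use. The image files are placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("suspect.jpg", "rb") as src, open("mugshot.jpg", "rb") as tgt:
    result = rekognition.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=99,  # only report matches scoring 99 or higher
    )

# An empty FaceMatches list means nothing cleared the 99 percent bar.
print(result["FaceMatches"])
```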
According to a report from the Guardian, the South Wales Police scanned the crowd of more than 170,000 people who attended the 2017 Champions League final soccer match in Cardiff and falsely identified thousands of innocent people. The cameras flagged 2,470 people as potential criminals, but 2,297 of them were innocent and only 173 were genuine matches, a false-positive rate of about 93 percent.
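As a quick sanity check on those figures, the false-positive rate is just the share of alerts that turned out to be wrong:

```python
# Quick arithmetic check of the Cardiff figures reported by the Guardian.
total_alerts = 2470   # people the cameras flagged as matches
false_alerts = 2297   # flagged people who were innocent
true_alerts = 173     # flagged people who were genuine matches

assert false_alerts + true_alerts == total_alerts
print(f"False-positive rate: {false_alerts / total_alerts:.1%}")  # -> 93.0%
```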
According to a Freedom of Information request filed by Wired, these are actually typical numbers for the facial recognition software used by the South Wales Police. Data from the department showed that there were false-positive rates of 87 percent and 90 percent for different events.
Similar numbers were released by the FBI in 2016, and the agency also admitted that its facial recognition database consists mostly of innocent people, since it draws on driver’s license and passport photos in addition to mugshots. In fact, roughly half of American adults already have a photo in a facial recognition database. Also in 2016, another study found that facial recognition software disproportionately misidentifies people with dark skin.
*Article originally appeared at Waking Times. Reposted with permission.*