Under Orlando’s real-time surveillance partnership with Amazon, everyone’s a suspect 

All eyes on us


[Photo by Joey Roulette]

Facial recognition technology has an algorithmic bias that Joy Buolamwini, founder of the Algorithmic Justice League, calls the "coded gaze."

"A.I. systems are shaped by the priorities and prejudices – conscious and unconscious – of the people who design them," Buolamwini wrote in June for the New York Times.

During her years as a graduate student at the Massachusetts Institute of Technology, Buolamwini put facial recognition software from Microsoft, IBM and Face++ to the test by measuring each system's matching accuracy across race and gender. The study found that, across all three systems, the average error rate was 0.5 percent for light-skinned men, compared to 30 percent for dark-skinned women – a substantial gap possibly attributable to unconscious bias at tech companies whose developers are mostly white and male.

In a later study, she found similar results with Rekognition, findings she shared first in a personal letter to Amazon CEO Jeff Bezos and then with Orlando Weekly.

"Amazon Rekognition performs better on lighter-skinned faces than darker-skinned faces with an accuracy difference of 11.45 percent," she wrote to Bezos. "It also performs better on male faces than female faces with an accuracy difference of 16.47 percent."

People in Orlando would not be suspects just because Rekognition scans their faces, Mina argues.

"If you think about police officers on the street right now, they're constantly looking at people in the public, so there's no violation of privacy there," he says. "We've had hundreds of cameras in the city of Orlando for over 15 years recording people. There's no privacy issue if you're out in public."

But isn't there a difference between being glanced at by a cop and being analyzed by an algorithm that is tracking the movements of dozens of people and looking for a match? Mina doesn't think so.

"It's just a matter of technology, right?" he says. "Technology is neutral. It's not good or bad – it's how we use that technology. People use the internet for evil all the time. Law enforcement uses the internet for good all the time. So that's what we're using it for. We're using the technology for good, for public safety."

But some civil libertarians see Rekognition as a step toward what's already happening in other parts of the world. The Chinese government, for example, has created an extensive network of surveillance cameras that use facial recognition technology not only to arrest criminal suspects but also to monitor ethnic minorities. Chinese officials are also in the process of creating a "social credit system" to publicly rank the trustworthiness of the country's 1.3 billion residents by tracking their every behavior.





© 2019 Orlando Weekly
