Under Orlando’s real-time surveillance partnership with Amazon, everyone’s a suspect 

All eyes on us


[Photo: A recently replaced camera in downtown Orlando – Photo by Joey Roulette]

Mina says his officers have to disclose on their reports if they used FACES to identify a suspect. He's not sure how many searches a month his department runs through the database.

"Say we do get a match – we don't use that match for probable cause," he says. "If we got what we thought was a match, we would use other investigative means to make sure that person was identified correctly."

Mina acknowledges that there's a "big difference" between the two facial recognition technologies. "FACES is for after-the-fact crimes committed, and it's a still image," he says. "The Amazon Rekognition program is real-time. If a suspect, someone with a warrant, or a missing child is walking down the road in real time, what we hope is that it will recognize that face and alert us."

Despite the lure of potentially saving lives, Brian Brackeen refuses to sell his tech to the cops.

Brackeen is the founder and CEO of the Miami-based company Kairos, which develops facial recognition software for private entities and specializes in emotion detection. He's sold to banks and political groups, but not the government.

A few times, Miami-Dade County officials came to him for facial recognition software that could help catch child predators or keep the Super Bowl safe, he says. But Brackeen turned them down because the risks are too great – operators often aren't given anti-bias training and few internal checks exist.

"We get facial recognition right more than 99 percent of the time," he says. "When we don't get it right, a customer might have to scan their face again at the checkout of a business. It's not life or death. The problem with government facial recognition is that people can lose their lives when things go wrong."

Brackeen, one of the few black founders of a facial recognition company, adds that existing facial recognition software has not been exposed to enough images of people of color to confidently identify them.

The risk of misidentification spans all brands, including Rekognition. And having police officers act on a first match in the event of another Markeith Loyd, as Mina envisions, could lead to wrongful arrests or even deaths, critics warn.

As Matt Cagle, a civil liberties attorney at the ACLU, puts it: "There may be a misidentification that results in a law enforcement encounter that should never have happened, or the use of force based on an inaccurate scan of somebody's face. The burden is really on Orlando and Amazon to show that this technology actually works, because right now we just don't have the public evidence to demonstrate that for Rekognition."

On July 26, the ACLU published a study showing that Rekognition incorrectly matched – at the default confidence threshold of 80 percent – 28 members of Congress with other people who have been arrested for a crime when their portraits were scanned through a database of 25,000 mugshots. The mismatches were disproportionately of people of color, including civil rights legend U.S. Rep. John Lewis.

In addition, the ACLU obtained a training PowerPoint used by Washington County that featured a screengrab of the Rekognition software showing an uploaded test photo of O.J. Simpson matching a white man's mugshot. Despite clear differences in hair and facial complexion, the results came back as a "93.53 percent match." An annotation below the screengrab reads, "As you can see despite the high percentage value returned, it still requires human interpretation to determine if there is an actual match."

In a statement, Amazon said the ACLU's study could have been improved by following "best practices" in setting confidence thresholds – the minimum similarity score Rekognition must reach before it reports a match.

"While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," the statement said. "When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95 percent or higher."

Later, an artificial intelligence engineer from Amazon, Matt Wood, called the study a misinterpretation of the software's capabilities. Amazon now recommends that law enforcement agencies using Rekognition set the confidence threshold at 99 percent.
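The dispute over thresholds comes down to a single cutoff parameter. A minimal sketch – using hypothetical data and function names, not Amazon's actual API – of how raising that cutoff from 80 to 99 percent shrinks the set of matches a system reports:

```python
# Illustrative only: how a confidence threshold gates which candidate
# matches a facial recognition system reports to an operator.

def filter_matches(candidates, threshold=80.0):
    """Return only candidates whose similarity score meets the threshold.

    candidates: list of (name, score) pairs, with score as a percentage.
    threshold: minimum similarity required before a match is reported.
    """
    return [(name, score) for name, score in candidates if score >= threshold]

# Hypothetical similarity scores for a single probe image.
candidates = [("person_a", 93.5), ("person_b", 84.1), ("person_c", 99.2)]

print(filter_matches(candidates, threshold=80.0))  # all three reported
print(filter_matches(candidates, threshold=99.0))  # only the strongest match
```

At the 80 percent default every candidate above is surfaced, including the weaker scores the ACLU study flagged; at 99 percent only the strongest survives – which is exactly why critics argue the default matters when operators act on a first match.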

Wood, though, acknowledged the need for some types of regulation on the technology. "We should not throw away the oven because the temperature could be set wrong and burn the pizza," Wood wrote in a statement on Amazon's website. "It is a very reasonable idea, however, for the government to weigh in and specify what temperature (or confidence levels) it wants law enforcement agencies to meet to assist in their public safety work."





© 2018 Orlando Weekly
