
On a quarterly earnings call Tuesday, Axon CEO Rick Smith told investors that the company does not have a timeline for launching facial recognition in its products because the available software does not meet necessary "accuracy thresholds."
"We do not have a team actively developing it," Smith said. "This is a technology we don't believe that – sitting here today – the accuracy thresholds are right where they need to be to [make] operational decisions off of facial recognition."
As Orlando Weekly reported this week, studies have shown that facial recognition systems make mistakes – and advocates fear those errors could lead to innocent people being misidentified and wrongfully detained by police. An MIT study of algorithmic bias in facial recognition software from Microsoft, IBM and Face++ found that, across all three brands, the average error rate was 0.5 percent for light-skinned men compared to 30 percent for dark-skinned women. Joy Buolamwini, the researcher who conducted the MIT study, also found that Amazon's Rekognition software performs better on lighter-skinned faces than darker-skinned faces, with an accuracy difference of 11.45 percent, and better on male faces than female faces, with an accuracy difference of 16.47 percent. Critics attribute the substantial gap to unconscious bias among the developers who build and test the software, most of whom are white men.
On the Tuesday call with investors, Smith said that once the technology's accuracy thresholds improved and the company had a "tight understanding" of the privacy and accountability controls needed for the public to accept facial recognition, it could move to the commercialization phase.
"You don't want to be premature and end up where you have technical failures with disastrous outcomes or something where there is some unintended use case where it ends up then being unacceptable publicly and impaired for long term of the technology," he said.
Fast Company reports that in May, Axon (formerly known as TASER International) won a patent for technology that can identify people and objects in near real-time from body cam or drone video. "Once a face is captured by a user’s body-worn camera, say patent filings, a hand-held device 'provides the name of the person to the user of the capture system,'" according to Fast Company. The company has created an ethics board to help guide the artificial intelligence that could power Axon's police products.
Civil rights groups, including the NAACP and ACLU, wrote a letter to Axon's ethics board saying that the company had a responsibility to ensure its products "don’t drive unfair or unethical outcomes or amplify racial inequities in policing." The organizations contend that certain products are "categorically unethical" to deploy, especially the use of real-time face recognition analysis of live video captured by body-worn cameras.
"Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests," the letter said. "Research indicates that face recognition technology will never be perfectly accurate and reliable, and that accuracy rates are likely to differ based on subjects’ race and gender. Real-time face recognition therefore would inevitably misidentify some innocent civilians as suspects. These errors could have fatal consequences – consequences that fall disproportionately on certain populations."
Orlando is currently testing Amazon's Rekognition program on a small number of cameras throughout the city. Seven Orlando Police officers have volunteered to have their photos uploaded to the system and to walk past the cameras to see whether the software identifies them.
If and when Rekognition becomes fully operational, OPD will upload the photo of a "person of interest" into a database, where the software converts it into a unique biometric template. Using this data, the software taps into Orlando's network of surveillance cameras around the city and looks for a match by essentially scanning every face it can see.
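Rekognition's internals are proprietary, but the general pattern the paragraph describes – convert an enrolled photo into a numeric "template," then compare every face the cameras see against that template and flag anything above a similarity threshold – can be sketched in a few lines. The watchlist names, three-element vectors and 0.95 threshold below are invented for illustration; real systems derive much longer embedding vectors from photos with a neural network.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical enrolled "person of interest" templates. In a deployed
# system these vectors would be computed from the uploaded photos.
watchlist = {
    "person_of_interest_1": [0.9, 0.1, 0.3],
    "person_of_interest_2": [0.2, 0.8, 0.5],
}

def match_face(embedding, threshold=0.95):
    """Return (name, score) for the best watchlist match above
    threshold, or None if no enrolled face is similar enough."""
    best_name, best_score = None, 0.0
    for name, enrolled in watchlist.items():
        score = cosine_similarity(embedding, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None
```

The threshold is the operational knob Smith's "accuracy thresholds" comment gestures at: set it low and the system flags more innocent passers-by as matches; set it high and it misses genuine ones.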
City officials, who see facial recognition technology as nothing new, hope Amazon's software will help them catch criminals, find missing children and identify public threats quickly. Civil liberties advocates, though, worry that the technology puts Orlando and Amazon on a slippery slope toward a dystopian world of government surveillance.
"Everybody is a suspect," said Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation, in an interview with OW. "If facial recognition is becoming a new norm in the way Orlando is talking about, then we should all be very afraid."