A new study suggests Amazon's Rekognition software sucks at detecting dark-skinned women

Photo by Joey Roulette
Amazon's Rekognition facial recognition software, which is being tested in Orlando as a surveillance system, misidentifies darker-skinned women as men roughly a third of the time, according to a new study.

The study, published Thursday and led by MIT Media Lab researcher Joy Buolamwini and University of Toronto researcher Deborah Raji, looked at face-scanning software from Amazon, Microsoft, IBM, Face++ and the Florida-based company Kairos.

Amazon's facial analysis software made no gender errors on the faces of lighter-skinned men, but it misidentified lighter-skinned women as men about 7 percent of the time. Darker-skinned women were mistaken for men about 31 percent of the time. Overall, the software classified women as men roughly 19 percent of the time.

Rekognition's facial analysis tool, which detects facial traits and expressions, is distinct from its facial recognition component, which identifies people by matching a photo against a predefined database of images, according to the company.
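
To make that distinction concrete, here is a minimal sketch using boto3, the Python SDK that fronts Rekognition. The image file and collection name are hypothetical, and the code simply illustrates the two API surfaces, not anything Orlando actually runs:

    import boto3

    client = boto3.client("rekognition")

    with open("face.jpg", "rb") as f:  # "face.jpg" is a hypothetical input image
        image_bytes = f.read()

    # Facial analysis: estimates attributes of a face (gender, age range,
    # emotions) without comparing it to any particular person.
    analysis = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in analysis["FaceDetails"]:
        print(face["Gender"]["Value"], face["Gender"]["Confidence"])

    # Facial recognition: matches the face against a predefined collection
    # of previously indexed images ("my-collection" is hypothetical).
    matches = client.search_faces_by_image(
        CollectionId="my-collection",
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=99,
    )
    for match in matches["FaceMatches"]:
        print(match["Face"]["FaceId"], match["Similarity"])

The MIT study measured the first kind of call; the Orlando pilot uses the second.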

The City of Orlando is currently testing Amazon's facial recognition software in a manner unprecedented among American law enforcement agencies. The technology plugs into the city's public street camera network and scans everyone in view in an attempt to identify and track a person of interest in real time. The Orlando Police Department hopes the surveillance software will help officers catch criminals, find missing children and respond quickly to threats. The city is piloting Rekognition on a small number of cameras using the faces of seven volunteer police officers.
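
For context on what plugging a camera network into Rekognition involves, the service's streaming interface ties a live video feed to a collection of indexed faces. The sketch below is an assumption about the general setup, with hypothetical names, ARNs and a hypothetical collection of the volunteer officers' faces; it is not a description of Orlando's actual configuration:

    import boto3

    client = boto3.client("rekognition")

    # A stream processor watches a Kinesis video stream, searches each
    # detected face against a pre-indexed collection (hypothetically, the
    # seven volunteers), and writes any matches to a Kinesis data stream.
    client.create_stream_processor(
        Name="street-camera-pilot",  # hypothetical processor name
        Input={"KinesisVideoStream": {"Arn": "arn:aws:kinesisvideo:us-east-1:111122223333:stream/cam1"}},
        Output={"KinesisDataStream": {"Arn": "arn:aws:kinesis:us-east-1:111122223333:stream/matches"}},
        Settings={"FaceSearch": {"CollectionId": "volunteer-officers", "FaceMatchThreshold": 99.0}},
        RoleArn="arn:aws:iam::111122223333:role/RekognitionStreamRole",
    )

    client.start_stream_processor(Name="street-camera-pilot")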

When asked about the study, city officials reiterated the software is "not currently using any images of the public and it is not being used in an investigative capacity."

"The purpose of piloting the software is to determine accuracy and functionality of the technology and evaluate if it can be a useful tool in the future," city spokesperson Heather Fagan said.
Photo via Joy Buolamwini
Of all the systems analyzed in the study, Amazon's performed the worst. Kairos, a software company located in Miami, fared slightly better, misidentifying darker-skinned women as men about 23 percent of the time. Kairos CEO Melissa Doval says the study used an older algorithm that was available in August 2018, and that the company released a new algorithm in October that it continues to improve.

"It was actually Joy’s first study 'Gender Shades' that catalyzed our commitment to help the industry solve the bias problems, and inspired us to completely redesign our face recognition models," Doval says. "We'd love to collaborate with Joy and her team on testing a newer version of Kairos, and learn how we can continue to improve the technology."

Matt Wood, a machine-learning engineer at Amazon, slammed the MIT study on Twitter, saying it tested only facial analysis and ignored facial recognition. Amazon ran its own version of the test with an up-to-date version of Rekognition and a similar data set of images downloaded from parliamentary websites, plus the MegaFace data set of 1 million images. Wood says that, counter to the study, Amazon found "exactly zero false positive matches with the recommended 99 [percent] confidence threshold."


"It's not possible to draw a conclusion on the accuracy of facial recognition for any use case – including law enforcement – based on results obtained using facial analysis," Wood wrote. "The results in the paper also do not use the latest version of Rekognition and do not represent how a customer would use the service today. … We continue to seek input and feedback to constantly improve this technology, and support the creation of third party evaluations, datasets, and benchmarks."

Similar studies have zeroed in on other face-scanning ventures, including those from Face++ and Microsoft; the latter has since launched its own ethics-focused campaign recommending government regulation of facial recognition.

"We noted the need for broader study and discussion of these issues. In the ensuing months, we've been pursuing these issues further, talking with technologists, companies, civil society groups, academics and public officials around the world," Microsoft president Brad Smith said in a July 2018 statement. "The only effective way to manage the use of technology by a government is for the government proactively to manage this use itself."

Buolamwini, co-author of the MIT study, wrote in a Medium post Friday morning that Amazon engages in a "denial, deflection and delay" approach in addressing calls for government regulation and third-party studies of facial recognition.

"We cannot rely on Amazon to police itself or provide unregulated and unproven technology to police or government agencies," Buolamwini wrote.

In a response to Orlando Weekly, Buolamwini added that because "the research on bias in facial technology has been out for over a year, it is disturbing to see persisting racial and gender bias in [Amazon's] systems."

Buolamwini and her co-researcher Raji also note in their paper that while facial analysis systems may be improved to reduce racial and gender bias, "algorithmic justice" necessitates oversight and regulation of this technology.

"The potential for weaponization and abuse of facial analysis technologies cannot be ignored nor the threats to privacy or breaches of civil liberties diminished even as accuracy disparities decrease," the researchers wrote. "More extensive explorations of policy, corporate practice and ethical guidelines is thus needed to ensure vulnerable and marginalized populations are protected and not harmed as this technology evolves."


While there are currently no state or federal laws governing facial recognition, the technology is slowly making its way onto the agenda for Florida lawmakers, who just this week heard their first briefing on facial recognition surveillance during a presentation to the state House Criminal Justice Subcommittee.

Pinellas County Sheriff Bob Gualtieri, whose Tampa-area agency runs FR-Net, the country's largest law enforcement facial recognition network, highlighted the difference between after-the-fact recognition software like FR-Net and the real-time capabilities of software like Amazon's Rekognition. Neither Gualtieri nor any lawmaker at the presentation mentioned Amazon by name, but they alluded to the program.

"That's the issue," Gualtieri said. "The random collecting of those images is what gives most people the pause, the concern, the angst and the lack of comfort with it. … I don't think it's the right thing to do." 

Orlando city officials will decide in April whether or not to fully acquire Amazon's facial recognition technology.
