
Under Orlando’s real-time surveillance partnership with Amazon, everyone’s a suspect 

All eyes on us

Since February, with relatively little scrutiny, Orlando leaders have been experimenting with a powerful new technology that could make our city one of the first in the country to run a real-time mass surveillance program. Much has been written about the free partnership with Amazon through which the Orlando Police Department has been testing Rekognition, facial recognition software whose capabilities and usage are unprecedented among American law enforcement agencies. But few have explored exactly how the software works and how deeply it will reach into our lives.

The program is currently being tested on a small number of cameras, but if and when it becomes fully operational, city officials hope Rekognition will help them catch criminals, find missing children and identify threats by essentially turning more than a hundred public street cameras into constant face-scanning machines.

In fact, once Rekognition becomes fully operational, its only constraint will be the bandwidth available to run the program on dozens, or even hundreds, of cameras at once. That limit is technical, not a lack of desire on the city's end.

"I can activate as many cameras as I need," says Rosa Akhtarkhavari, Orlando's chief information officer and the driving force behind Rekognition inside City Hall. "But my problem is I need to have that bandwidth to move up. Am I envisioning that we are going to stream all our hundreds of cameras? I don't think that's going be [financially viable]. ... When we need it, we can activate it."

The program will work like this: Orlando Police upload the photo of a "person of interest" into a database. Rekognition analyzes the person's face and converts it into a unique biometric template. Using that template, the software taps into Orlando's network of surveillance cameras and looks for a match. If it finds one, it alerts police.
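The exact plumbing is shielded by the city's nondisclosure agreement with Amazon, but the company's public Rekognition Video API hints at the general shape. The sketch below is a minimal illustration, not Orlando's actual configuration: it assumes a camera feed already flows into an Amazon Kinesis video stream, and every name, identifier and threshold in it is hypothetical.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# 1. Create a collection to hold "persons of interest."
rekognition.create_collection(CollectionId="persons-of-interest")

# 2. Index the uploaded photo; Rekognition stores a face template,
#    not the photo itself.
with open("person_of_interest.jpg", "rb") as f:
    rekognition.index_faces(
        CollectionId="persons-of-interest",
        Image={"Bytes": f.read()},
        ExternalImageId="case-2018-0042",  # hypothetical case label
    )

# 3. Point a stream processor at a live camera feed. Matches are
#    written to a Kinesis data stream for downstream alerting.
rekognition.create_stream_processor(
    Name="camera-a-face-search",
    Input={"KinesisVideoStream": {"Arn": "arn:aws:kinesisvideo:...:stream/camera-a"}},
    Output={"KinesisDataStream": {"Arn": "arn:aws:kinesis:...:stream/matches"}},
    RoleArn="arn:aws:iam::...:role/rekognition-access",  # hypothetical role
    Settings={"FaceSearch": {
        "CollectionId": "persons-of-interest",
        "FaceMatchThreshold": 95.0,  # Amazon's stated floor for police use
    }},
)
rekognition.start_stream_processor(Name="camera-a-face-search")
```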

Before identifying that "person of interest," however, Rekognition might scan you.

The scan will probably last less than a second. After determining you aren't a match, it will scan the person strolling behind you. And the couple walking their dog across the street. And the business partners headed back to work from lunch. Without being explicitly told, Rekognition will scrutinize everyone around you until it finds a possible match. Amazon, in fact, says its software can detect up to 100 faces in "challenging crowded photos."
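That 100-face figure corresponds to Amazon's published DetectFaces operation, which returns bounding boxes and confidence scores for up to 100 of the largest faces in a single image – whether or not any of them is a person of interest. A brief sketch (the file name is a placeholder):

```python
import boto3

rekognition = boto3.client("rekognition")

# Every face in the frame gets boxed and scored, match or no match.
with open("crowded_street.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()})

for face in response["FaceDetails"]:  # up to 100 faces per image
    print(face["BoundingBox"], round(face["Confidence"], 1))
```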

Orlando hasn't fully unleashed this technology yet. But city officials see the surveillance software as a tool to keep residents safe – and certainly as nothing out of the ordinary. After all, they say, facial recognition is everywhere. People use it to unlock their phones. Airports screen passengers to verify their identities. Facebook employs it to alert users when someone posts a photo of them.

"Facial recognition technology is not new," city staff wrote in a July 6 memo to Mayor Buddy Dyer and the City Council. "In fact it has become a relatively normal occurrence in our daily lives."

Even so, civil liberties advocates warn that Orlando and Amazon are sailing into dangerous, uncharted waters.

The way Orlando plans to use Rekognition – as real-time surveillance – is completely different from how other law enforcement agencies are commonly using facial recognition, says Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation. Every person who passes in front of a camera is scanned, regardless of prior suspicion of a crime. And multiple studies have shown facial recognition systems make mistakes – particularly on dark-skinned women. Misidentification could lead to innocent people being wrongfully detained by police for crimes they didn't commit, or worse, trigger confusion that leads to a deadly encounter.

What's more, although Orlando has been testing Rekognition for months, city officials still haven't developed rules for the program – though they've promised to do so if it moves forward. And right now, there are no state or federal laws to regulate the use of this technology, which has led some tech companies, including Microsoft, and even police departments that use facial recognition to voice concerns about the potential abuse of this software in the hands of the government.

Privacy advocates, meanwhile, worry about it becoming a slippery slope into a dystopian world of government surveillance.

"Everybody is a suspect," Lynch says. "If facial recognition is becoming a new norm in the way Orlando is talking about, then we should all be very afraid."

Orlando police officers participating as test subjects in the city's Amazon Rekognition pilot program pose for emulated mugshots and photos taken by the Orlando Police Department on March 20, 2018. (Composition by Joey Roulette; photos by City of Orlando via records request)

How exactly did Orlando stumble into something as potent as Rekognition?

City officials have only said that Amazon approached them. But based on interviews and available public records, the partnership appears tied to the municipality's desire to become a so-called smart city.

Smart cities are, in theory, urban areas that invest in technology and intelligent design to create sustainable, high-quality housing and jobs. The Smart Cities Council, the world's largest smart-city network, promotes three core values: livability, workability and sustainability.

"We want to become the world's most intelligent, interconnected and efficient city," Mayor Buddy Dyer said in 2017 after the Smart Cities Council awarded Orlando its Readiness Challenge Grant, which came with "a year's worth of free mentoring, valuable products and services" and "worldwide publicity," according to the council's site. One of the grant's supporting sponsors: Amazon Web Services.

Since 2010, Akhtarkhavari has pioneered Orlando's transition into a smart city, bringing in technologies like cloud computing to City Hall's internal infrastructure.

"Everything we do is to make our city better," she says. "Everything we do is to make our city smarter, more efficient, more effective."

When Amazon offered in August 2017 to let Orlando test Rekognition for free, Akhtarkhavari saw it as another asset for the smart-city agenda. By December, the city had entered into its first six-month pilot program to try out the technology on eight surveillance cameras around the city.

The program is housed in the police department's IRIS room, a second-floor hub walled with dozens of screens displaying video feeds from the city's 180 or so security cameras, including the eight feeds being used for the program. Amazon consultants visited OPD in January to outfit the IRIS room with Rekognition-compatible computers, according to a contract signed by city officials.

The software works its magic in the cloud, a virtual Amazon server much like Google Drive or Apple's iCloud. Rekognition plugs into the city's streams of surveillance footage to observe and identify a person of interest. Amazon does not store that video, according to Akhtarkhavari.

To make sure the new software works with every brand of security camera in the city, Akhtarkhavari chose three existing cameras in downtown, one camera at an undisclosed Orlando facility and four cameras at OPD headquarters. (In an interview, Orlando Police Chief John Mina revealed that the undisclosed facility is one of the city's community recreation centers.)

At the outset of the testing period, IT staff uploaded photos of seven police officers who volunteered to take part in the program. Once the eight cameras came online in February, the officers walked in view of each connected camera to test its accuracy in finding a match. The city stresses that OPD is not currently using the technology in an investigative capacity or using images of the public for testing. City officials say members of the public may be scanned as Rekognition works to find the volunteer officers, but that their biometric information is not stored.

"When there is a hit for an image, when there is a success, the plan – which we have not seen it working yet – we would receive an email that says, 'From camera A at minute whatever, you have a match with image B,'" Akhtarkhavari says. "This is how we are expecting the proof of concept to work."

Orlando never got to that point during the pilot program, as some of the city's surveillance cameras weren't streaming footage reliably into Amazon's cloud.

"One of the three types of cameras – a very old one that should be replaced and is in the process of being [replaced] – is just not working," Akhtarkhavari says. "The stream keeps stopping."

The program went largely unnoticed until May 22, when the American Civil Liberties Union published records showing Amazon was marketing its Rekognition technology to law enforcement agencies and already had two clients: the city of Orlando and the Washington County Sheriff's Office in Oregon. Mina initially insisted that none of the security cameras being used in the program were outside the department's headquarters. A day later, he backtracked.

Details on how the technology works have been shielded by a nondisclosure agreement Orlando signed with Amazon last year. The contract says the city can't release information about Rekognition without written consent and approval from Amazon.

One of Orlando's roughly 180 IRIS cameras, one of the models used in conjunction with Amazon Rekognition. (Photo by Joey Roulette)

Almost three weeks after the pilot program ended on June 19, city officials released a memo announcing a plan to begin a second testing period with Amazon, saying Orlando staff had "made good strides in testing this pilot program and believe it is important to continue this evaluation period." Although the memo says "continue" rather than "start new," the original program ended June 19 and no new statement of work has been signed, according to publicly available records.

"We don't even know if the product works," says Mina, who nonetheless endorsed the program in a statement posted to Amazon's website in May that has since been deleted. "If the technology works, [we] intend to use it for those worst-case scenarios, for the most violent people out there – your sexual predators, people who have committed heinous crimes, murder, and that sort of thing, and as well to locate missing persons and missing juveniles."

Orlando hasn't created any rules or procedures regarding its use of Rekognition. Mina insists it will only be used to uphold public safety by apprehending people who already have warrants out for their arrest. OPD would "absolutely not" use facial recognition to identify people at rallies, Mina says. "We don't have a history of that – of tracking protesters, of tracking people who show up to rallies. So, why would we use our technology to do that? This is strictly public safety."

"A perfect example is Markeith Lloyd, who was a wanted murder suspect running around Orlando for many, many weeks, going in and out of stores, in and out of the Walmart," Mina says, referring to the elusive murder suspect who killed Lt. Debra Clayton after shooting his pregnant ex-girlfriend a month before. "Say we had Amazon Rekognition in place, and we saw Markeith Loyd walking into the Walmart and it recognized that person as Markeith Loyd. You know, we would send officers right there, to take a dangerous criminal off the streets."

But Mina and city officials have implied that the technology could be used to stop future attacks as well – a planned school shooting, say, or a plot like that of the man who police allege intended to kidnap pop singer Lana Del Rey in February after her concert at Amway Center.

"This entire nation and everyone is looking to law enforcement to protect their children in schools," Mina says. "If we were to get a social media threat and identify a person – Joe Smith says he's going to this school to blow up the school, to kill a bunch of children, certainly we would want to be alerted to that long before there was a warrant for that person's arrest."

In the city's July 6 memo, officials pointed to the success of facial recognition after a gunman killed five Capital Gazette employees in Annapolis, Maryland. But the technology used by Anne Arundel County Police to identify that suspect is nothing like how Orlando wants to use Rekognition.

"It's different," says Marc Limansky, spokesperson for Anne Arundel County Police. "It's not that. I don't know if the public would stand for that."

Limansky says the suspect, Jarrod Warren Ramos, did not want to give investigators his name when he was apprehended. After a lag in getting fingerprint results, police took a photo of Ramos and sent it to the Maryland Coordination and Analysis Center, where it was compared with millions of images in the Maryland Image Repository System, or MIRS. The repository contains the state's entire driver's license database and mugshots of known offenders, as well as access to the FBI's database of nearly 25 million mugshots.

A simplified graphic modeled on the city's planned Rekognition "architecture," obtained through public records requests, illustrates the path of a camera's video feed as it would be used in coordination with Amazon Rekognition facial recognition software. (Infographic by Joey Roulette)

Old methods of identifying a suspect could be tedious – investigators would have to look through a stack of photos by hand, one by one. With MIRS, it's like looking through that stack all at once, Limansky says, and police were able to identify Ramos effectively.

Over 117 million Americans are already in law enforcement facial recognition networks, according to a 2016 report from the Center on Privacy & Technology at Georgetown Law. At the time of the study, only the Los Angeles Police Department claimed to run real-time face recognition off street cameras, though four other law enforcement agencies had expressed interest.

Similar to MIRS, the Rekognition program set up by Amazon in Washington County, Oregon, works with photos or still images captured from video footage, says Deputy Jeff Talbot, a spokesperson for the Washington County Sheriff's Office.

Amazon helped Washington County create a database of about 300,000 booking mugshots collected from the county jail. For about $6 to $12 a month, deputies can upload a photo of an unidentified suspect and scan for a possible match. The results come back within seconds.
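That workflow lines up with Amazon's published SearchFacesByImage operation, which compares one probe photo against every face template stored in a collection. A minimal sketch of the pattern Talbot describes – the collection name and file are placeholders:

```python
import boto3

rekognition = boto3.client("rekognition")

# One-to-many search: a single still image against a booking-photo
# collection, returning the closest candidates within seconds.
with open("suspect_still.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId="county-booking-photos",  # assumed collection name
        Image={"Bytes": f.read()},
        MaxFaces=5,
        FaceMatchThreshold=80.0,
    )

for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```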

"The most common ways it's used is when someone goes to a store, they go to a [register] where there's a little camera inside the kiosk," Talbot says. "They act like they're putting stuff in there to pay for it, but before they pay, they grab the bag and walk out the door. We then take that video, we just crop it to a still image of their face, and we run it through our database."

The Washington County Sheriff's Office could also use still images taken from publicly available Facebook videos and footage recorded by a victim on a phone during a crime, as well as photos of the dead and of people who can't identify themselves because of physical or mental incapacity.

But Talbot is adamant that Washington County is not using Rekognition for mass surveillance or real-time surveillance.

"Our own state laws and our own policies we've written prohibit us from such a use," he says. "We do not want to be attributed to using technology in a manner that we're not using it and don't have any intention to use it."

Talbot says that for Rekognition to be used on live video under their policies, "you'd have to know that everyone within that frame was a suspect of a crime."

"We don't think it should be used for mass surveillance. We feel like we've struck a great balance. We're honoring people's civil liberties here. We're still doing what the public expects, which is fighting crime and solving crime."

Washington County's policies state that without any other evidence, deputies cannot use a facial recognition match as probable cause for arrest or seizure.

"It's a tool," Talbot says. "Investigators are required by our policy to independently corroborate who that person is if they are one of those potential leads that the software came back with as a match."

OPD has experience with the type of facial recognition used by Washington County. For some time, the department and more than 240 local, state and federal agencies have been accessing a facial recognition database that stores photos of every licensed driver in Florida.

Since 2001, the Pinellas County Sheriff's Office has run the Face Analysis Comparison & Examination System, also known as FACES. The database collects tens of millions of images from Florida driver's licenses and IDs, plus offender mugshots from state prison and county jail bookings. Law enforcement officers upload a photo of the person they want to identify into the FACES system, which then compares it against the database for a possible match. Altogether, these agencies run close to 8,000 searches through FACES each month, and they aren't required to have "reasonable suspicion" to run a search, according to the report from the Center on Privacy & Technology.

But a number of issues can affect how FACES works.

Records from the Pinellas County Sheriff's Office show the system's performance is adversely affected when officers upload photos of poor quality. The system also struggles when people of interest have significantly aged, are twins, have facial hair, are wearing sunglasses or have undergone plastic surgery.

In addition, the Center on Privacy & Technology found the database was subject to little oversight – Pinellas Sheriff Bob Gualtieri told researchers the system was "not really" audited to look for potential abuse.

The use of facial recognition is currently being challenged in Florida's First District Court of Appeal. Detectives from the Jacksonville Sheriff's Office used FACES to get leads on a man who bought $50 worth of cocaine, according to the Florida Times-Union. But neither deputies nor prosecutors disclosed that information to attorneys defending Willie Allen Lynch. The court will determine whether the state is obligated to provide defendants with all of the photo matches returned by FACES.

A recently replaced camera in downtown Orlando. (Photo by Joey Roulette)

Mina says his officers have to disclose on their reports if they used FACES to identify a suspect. He's not sure how many searches a month his department runs through the database.

"Say we do get a match – we don't use that match for probable cause," he says. "If we got what we thought was a match, we would use other investigative means to make sure that person was identified correctly."

Mina acknowledges that there's a "big difference" between the two facial recognition technologies. "FACES is for after-the-fact crimes committed, and it's a still image," he says. "The Amazon Rekognition program is real-time. If a suspect, someone with a warrant, or a missing child is walking down the road in real time, what we hope is that it will recognize that face and alert us."

Despite the lure of potentially saving lives, Brian Brackeen refuses to sell his tech to the cops.

Brackeen is the founder and CEO of the Miami-based company Kairos, which develops facial recognition software for private entities and specializes in emotion detection. He's sold to banks and political groups, but not the government.

A few times, Miami-Dade County officials came to him for facial recognition software that could help catch child predators or keep the Super Bowl safe, he says. But Brackeen turned them down because the risks are too great – operators often aren't given anti-bias training and few internal checks exist.

"We get facial recognition right more than 99 percent of the time," he says. "When we don't get it right, a customer might have to scan their face again at the checkout of a business. It's not life or death. The problem with government facial recognition is that people can lose their lives when things go wrong."

Brackeen, one of the few black founders of a facial recognition company, adds that existing facial recognition software has not been exposed to enough images of people of color to confidently identify them.

The risk of misidentification spans all brands, including Rekognition. And having police officers act on a first match in the event of another Markeith Loyd, as Mina envisions, could lead to wrongful arrests or even deaths, critics warn.

As Matt Cagle, a civil liberties attorney at the ACLU, puts it: "There may be a misidentification that results in a law enforcement encounter that should never have happened, or the use of force based on an inaccurate scan of somebody's face. The burden is really on Orlando and Amazon to show that this technology actually works, because right now we just don't have the public evidence to demonstrate that for Rekognition."

On July 26, the ACLU published a study showing that, at the default confidence threshold of 80 percent, Rekognition incorrectly matched 28 members of Congress to people who had been arrested for a crime when the lawmakers' portraits were scanned against a database of 25,000 mugshots. The mismatches were disproportionately of people of color, including civil rights legend U.S. Rep. John Lewis.

In addition, the ACLU obtained a training PowerPoint used by Washington County that featured a screengrab of the Rekognition software showing an uploaded test photo of O.J. Simpson matching a white man's mugshot. Despite clear differences in hair and complexion, the results came back as a "93.53 percent match." An annotation below the screengrab reads, "As you can see despite the high percentage value returned, it still requires human interpretation to determine if there is an actual match."

In a statement, Amazon said the ACLU's study could have been improved by following "best practices" in setting confidence thresholds – the minimum percentage likelihood that Rekognition must assign before reporting a match.

"While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," the statement said. "When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95 percent or higher."

Later, an artificial intelligence engineer at Amazon, Matt Wood, called the study a misinterpretation of the software's capabilities and said Amazon now recommends that law enforcement agencies using Rekognition set the confidence threshold at 99 percent.

Wood, though, acknowledged the need for some types of regulation on the technology. "We should not throw away the oven because the temperature could be set wrong and burn the pizza," Wood wrote in a statement on Amazon's website. "It is a very reasonable idea, however, for the government to weigh in and specify what temperature (or confidence levels) it wants law enforcement agencies to meet to assist in their public safety work."
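For readers who want to see the "temperature" dial itself, it surfaces in Amazon's public API as a single parameter. The sketch below – file names are placeholders – runs the same comparison at the old 80 percent default and at the 99 percent level Amazon now recommends for police work:

```python
import boto3

rekognition = boto3.client("rekognition")

def matches(source_path, target_path, threshold):
    """Return similarity scores for face pairs at or above the threshold."""
    with open(source_path, "rb") as s, open(target_path, "rb") as t:
        resp = rekognition.compare_faces(
            SourceImage={"Bytes": s.read()},
            TargetImage={"Bytes": t.read()},
            SimilarityThreshold=threshold,  # pairs below this score are dropped
        )
    return [m["Similarity"] for m in resp["FaceMatches"]]

print(matches("probe.jpg", "mugshot.jpg", 80.0))  # may report a "match"
print(matches("probe.jpg", "mugshot.jpg", 99.0))  # stricter: likely empty
```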

Facial recognition technology has an algorithmic bias that Joy Buolamwini, founder of the Algorithmic Justice League, calls the "coded gaze."

"A.I. systems are shaped by the priorities and prejudices – conscious and unconscious – of the people who design them," Buolamwini wrote in June for the New York Times.

During her years as a graduate student at the Massachusetts Institute of Technology, Buolamwini put facial recognition software from Microsoft, IBM and Face++ to the test, measuring each system's matching accuracy across race and gender. The study found that, across all brands, the average error rate was 0.5 percent for light-skinned men, compared with 30 percent for dark-skinned women – a substantial gap possibly attributable to the unconscious bias of an industry in which developers are mostly white and male.

In a later study, she found similar results with Rekognition, which she shared first in a personal letter to Amazon CEO Jeff Bezos and then with Orlando Weekly.

"Amazon Rekognition performs better on lighter-skinned faces than darker-skinned faces with an accuracy difference of 11.45 percent," she wrote to Bezos. "It also performs better on male faces than female faces with an accuracy difference of 16.47 percent."

People in Orlando would not be suspects just because Rekognition scans their faces, Mina argues.

"If you think about police officers on the street right now, they're constantly looking at people in the public, so there's no violation of privacy there," he says. "We've had hundreds of cameras in the city of Orlando for over 15 years recording people. There's no privacy issue if you're out in public."

But isn't there a difference between being glanced at by a cop and being analyzed by an algorithm that is tracking the movements of dozens of people and looking for a match? Mina doesn't think so.

"It's just a matter of technology, right?" he says. "Technology is neutral. It's not good or bad – it's how we use that technology. People use the internet for evil all the time. Law enforcement uses the internet for good all the time. So that's what we're using it for. We're using the technology for good, for public safety."

But some civil libertarians see Rekognition as a step toward what's already happening in other parts of the world. The Chinese government, for example, has created an extensive network of surveillance cameras that use facial recognition technology not only to arrest criminal suspects but also to monitor ethnic minorities. Chinese officials are also in the process of creating a "social credit system" to publicly rank the trustworthiness of the country's 1.3 billion residents by tracking their every behavior.

The condemnation of Orlando's use of Rekognition has been widespread. Locally, 10 organizations, including the Arab American Community Center of Florida, Mi Familia Vota, NeJame Law and Organize Florida, have called on OPD to stop using the software because of its potential to be "used for discriminatory immigration enforcement, monitoring individuals who attend protests and engage in other non-violent activities, or disproportionately surveilling minority communities and residents who have committed no crimes."

Some Amazon workers have also asked Bezos to stop selling the technology to law enforcement.

"We don't have to wait to find out how these technologies will be used," they wrote in an internal letter. "We already know that in the midst of historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses – this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized."

Earlier this month, tech giant Microsoft asked Congress to set regulations for the use of facial recognition.

"If we move too fast with facial recognition, we may find that people's fundamental rights are being broken," Microsoft president Brad Smith wrote in a blog post.

The deployment of facial recognition technology is happening faster than efforts to curtail its potential abuse, says Clare Garvie, an associate with the Center on Privacy & Technology. Garvie says it's encouraging to hear assurances from OPD that it will only use Rekognition to find people with outstanding warrants. But until such rules are codified, the public will have to take law enforcement's word for it.

"This kind of facial recognition technology is akin to police walking through a public protest and making everyone show their identification," she says. "We would be appalled if that was permitted. Facial recognition enables a world where law enforcement can do that in secret without anybody knowing it can take place."

In the coming months, the City Council will decide whether to continue with the software. Now is the time for privacy advocates to push back, argues Lynch of the Electronic Frontier Foundation.

The Fourth Amendment protects against warrantless wiretaps by government agencies, Lynch says, and there should be similarly strict restrictions on when and how police can track people using facial recognition. If not, she fears, the practice will become normalized – and then who knows what will happen.

"It completely changes the society in which we live," Lynch says. "It destroys the ability to live in a true democracy if we feel the government is watching us everywhere we go all the time."
