Privacy is disappearing faster than we realize, and the coronavirus isn’t helping 

A majority of Americans (79 percent) say they're concerned about how companies use their data, yet the same late-2019 Pew Research Center survey found that only about one in five Americans usually read through the privacy policies that grant companies broad use of their data.

King says she's often asked what individuals can do to protect their privacy, but there's very little any one person can do to guard against the biggest threats.

"It'll probably require industry-level solutions or legislated solutions, as opposed to flipping some knobs on your cell phone. That's the fundamental problem," King says.

Plus, for users to opt out, they first need to know which companies have their data, Engelman says.

"The dirty secret for that is the companies themselves don't know who they're sharing the data with," Engelman says.

Advertisers collect information so dynamically, in the very moment that people are using apps, that many companies would likely have a hard time accounting for how that data was shared, he says.

It's important to recognize the limitations consumers face and to push for informed consent, he says.

That includes knowing the full context of how the data you choose to share may be passed on. If a consumer agrees to share their location with a weather app, they likely only expect that location to be used to pull up their local forecast. Any secondary use of that location information should require consent, and not just fall under an umbrella privacy policy that no one is actually going to read, he says.

"What I would like to see is that people have enough information to make informed decisions," Engelman says.

SMART ASSISTANTS AND THE INTERNET OF THINGS

Concerns about smart speakers differ from fears of covert smartphone listening: if you've bought a smart home assistant like Amazon's Alexa or Google Home, you likely understand that, on some level, the device needs to be listening in order to hear its wake-up command.

To have Alexa turn off your lights, or read you a recipe, the smart speaker needs to first catch the magic words that indicate you want her to do something.

But as smart assistants rolled out in recent years, it wasn't initially clear just how easily those devices would accidentally pick up audio they weren't meant to hear, or that other people would listen to those recordings.

After consumers complained of odd behavior from Alexa, the most popular smart assistant, it was revealed that recordings captured by the devices are sent to Amazon, where employees listen for the sounds and phrases that may trip up the system, in order to improve its accuracy. But as you can imagine, some recordings made in error captured snippets of private conversations and even people having sex.

"From a privacy standpoint, what a disaster," says King.

It would've been easier if Amazon had first asked people to opt in and share their recordings, explaining that they'd be used to make the system better, similar to when a computer program crashes and asks for permission to send an error report, she says.

Instead, the default setting remains that Amazon can use recordings to improve its service, but users now have the option to opt out.

As more home devices become connected, creating the so-called "Internet of Things," new privacy risks are popping up.

Some smart TVs now include microphones and cameras that could be hacked by stalkers or the government to watch people in their living rooms and bedrooms. Less nefariously, most smart TVs collect every detail of what you watch to target show suggestions and ads.

Amazon's Ring Doorbell security system widely shares videos with law enforcement if users agree, raising questions about how those images could be used for other purposes, like facial recognition. The company also shares user information with third parties, sending a user's full name, email address and device count to the analytics firm MixPanel, according to a January report from the Electronic Frontier Foundation (EFF), a nonprofit that fights for civil liberties. In 2019, hackers exposed vulnerabilities in the system by gaining access to the cameras and using the built-in speaker to talk to children in their homes.

While many systems offer some way to opt out of their tracking, King notes that consumers should assume their devices will default to the broadest possible sharing of their data.

FACIAL RECOGNITION

Americans learned of another far-reaching privacy overreach early this year, when the New York Times reported on a company called Clearview AI. Clearview had built a massive database of photos scraped from public posts on social media and across the web, powering a facial recognition tool that lets users find out who someone is and even links back to the original posts.

The Times reported that the tool was being used by hundreds of law enforcement agencies, and was more comprehensive than any recognition tool created by the government or other Silicon Valley companies.

"The tool could identify activists at a protest or an attractive stranger on the subway, revealing not just their names but where they lived, what they did and whom they knew," the Times reported, noting just a few of the potential implications of such a tool.

Face recognition by law enforcement is, for the most part, very loosely regulated, which leads to significant issues, according to research by the Georgetown University Center on Privacy and Technology.

In some cases, police departments have searched for matches using photos of celebrities they claim look somewhat like a suspect. In others, departments have uploaded composite sketches, which have led to matches with people who looked far different from the suspect eventually connected with the crime, the center reports.

In one case highlighted in the center's "Garbage In, Garbage Out" report, the New York Police Department wasn't getting any matches with a photo of a Black man whose mouth was wide open. So the department Googled "Black male model" and edited another man's closed lips onto the suspect's photo to try to find a match, says Jameson Spivack, a policy associate with the Georgetown center.

"You can see first of all, fabrication of evidence, and second of all, the racial implications of this thing," Spivack says. "It's really wild the kinds of things they've done."

Importantly, face recognition gives the government a power it has never had before, Spivack says.

"In 2015, police in Baltimore County used face recognition on the Freddie Gray protesters to locate, identify and arrest people who had unrelated outstanding arrest warrants," Spivack says. "This is a politically protected demonstration, and without the protesters being aware of it, the police were using facial recognition to identify people with completely unrelated warrants and target them from the crowd."

The technology also struggles with accuracy, particularly in identifying people of color, women and younger people, he says. With no regulations requiring systems to be audited for accuracy, errors can persist.

Some states enter driver's license photos into face recognition databases, while others include only mugshots. When the Georgetown center studied how widespread these databases were in 2016, it found that about 54 percent of Americans were included in at least one, Spivack says.

"A majority of Americans are subjected to face recognition," he says. "It's very likely that has increased, but we have no way of knowing."

This year, Washington state passed facial recognition legislation that Microsoft has been pushing in other states around the country, Spivack says. The law requires government agencies to write an accountability report before using the technology, adopt a policy for sharing information externally, and train officers in its proper use.

The law also requires a warrant for ongoing or real-time surveillance, but it allows all other uses, which is troubling, Spivack says. Trying to identify someone with the technology constitutes a search, he argues, and should require probable cause.

"One way to think about this is if you're in a face recognition database, you're essentially in a perpetual lineup, you're always a suspect who could come up," he says. "A lot will say, 'Well, I didn't commit a crime.' It's not really about that. It's more, 'Does an error-prone, biased technology think you committed a crime?' Then you have to worry."

Until the kinks in the technology are worked out and proper protections of constitutional rights are codified, the center and other privacy rights groups are advocating that states implement a moratorium on the use of facial recognition.

MEANINGFUL LEGISLATION

Europe's General Data Protection Regulation (GDPR), which took effect in May 2018, is the strictest data protection policy in the world. It requires companies to tell users what data will be collected and how it will be used, to let users edit or delete certain types of data and, on request, to hand over all the data the company holds on them.

Companies that don't comply with those and other rules can be fined millions of dollars.

Many privacy advocates want to push for something similar, or even more protective, in America.

Currently, California is the only state to have passed a similar level of protection, with the California Consumer Privacy Act.

This year, Washington state, home to tech giants Microsoft and Amazon, came close to passing the Washington Privacy Act, a measure even more protective than California's that would have required companies to conduct risk assessments and allow people to edit or delete their data.

But the measure failed when lawmakers couldn't agree on how it should be enforced. One contingent wanted the state Attorney General's office alone to be responsible for enforcement, while the other also wanted a private right of action that would let individuals sue violators.

Privacy advocates, including the American Civil Liberties Union of Washington, point out that the act was also riddled with loopholes and that it would have prevented local jurisdictions from passing more protective legislation.

"It was astonishing to see all the places where rights that were listed were circumvented by exemptions," says Jennifer Lee, the technology and liberty project manager for ACLU Washington. "How can you say consumers actually have meaningful rights if they're not enforceable and undermined by a laundry list of loopholes?"

While state legislation can fill an important vacuum in data protection laws, Washington state Senate Majority Leader Andy Billig, D-Spokane, says federal standards would better protect all citizens.

"While I think Washington is generally a leader in technology and consumer protection, and it would make sense for Washington to be a leader in this area, ultimately federal legislation would be the best so there's one standard throughout the country," Billig says.

As it happens, Washington politicians are also leading on the issue at the federal level. Sen. Maria Cantwell, D-Washington, introduced the Consumer Online Privacy Rights Act (COPRA) with Democratic leadership in late 2019. The act would ensure, among other things, that people around the country have the right to: access their data and see how it's being shared; control the movement of that data; delete or correct their data; and take their data to a competing product or service. It also provides a private right of action against violators.

But many who work in privacy say proposed rules like COPRA, and even the GDPR, don't go far enough because they require people to opt out instead of opting in.

Protective legislation requires two major questions to be answered, Lee says: For what purpose is your data being collected, and is it collected with your consent?

"You might not know how you're hemorrhaging your data, or who has it, but when aggregated and combined with different data sets, that can really reveal a very intimate picture of your life," Lee says. "And if it's not adequately protected, it can be used to discriminate against anyone in critical decisions, like our health care, housing, education, or loans. It's something everyone should be worried about."

A version of this article first appeared in the Inlander, a weekly based in Spokane, Washington.
