For years, media chased the clicks promised by Facebook; now the social media giant threatens to destroy them 

Under the thumb

FAKING IT

There was a time Facebook was positively smug about its impact on the world. After all, it had seen its platform fan the flames of popular uprisings during the Arab Spring in places like Tunisia and Egypt.

"By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible," Zuckerberg bragged in a 2012 letter to investors under the header, "We hope to change how people relate to their governments and social institutions."

And Facebook certainly has – though not the way it intended.

A BuzzFeed investigation found that in the final months before the 2016 presidential election, "fake news" stories – hoaxes and hyperpartisan falsehoods – actually performed better on Facebook than stories from major trusted outlets like the New York Times.

That, experts speculated, is another reason why Facebook, despite its massive profits, might be pulling back from its focus on news.

"As unprecedented numbers of people channel their political energy through this medium, it's being used in unforeseen ways with societal repercussions that were never anticipated," writes Samidh Chakrabarti, Facebook's product manager for civic engagement, in a recent blog post.

The exposure was widespread. A Dartmouth study found that about a fourth of Americans visited at least one fake-news website – and Facebook was the primary vector of misinformation. While researchers didn't find that fake news swung the election – though about 80,000 votes across three states is a pretty small margin to swing – the effects have endured.

Donald Trump has played a role. He snatched away the term used to describe hoax websites and wielded it as a blunderbuss against the press, blasting away at any negative reporting as "fake news."

By last May, a Harvard-Harris poll found that almost two-thirds of voters believed that mainstream news outlets were full of fake news stories.

The danger of fake news, after all, wasn't just that we'd be tricked by bogus claims. It was that we'd be pummeled with so many contradictory stories, from so many different angles, that the task of sorting truth from fiction would become exhausting.

So you choose your own truth. Or Facebook's algorithm chooses it for you.

Every time you like, comment, chat or click on Facebook, the site uses that signal to figure out what you actually want to see: It inflates your own bubble, protecting you from facts or opinions you might disagree with.

And when it does expose you to views from the other side, it's most likely going to be the worst examples, the trolls eager to make people mad online, or the infuriating op-ed that all your friends are sharing.
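To see how that loop feeds itself, here's a toy sketch in Python. Everything in it – the topic labels, the weights, the scoring rule – is an invented illustration, not Facebook's actual ranking system, which is proprietary and far more complex:

```python
# Toy personalization loop: every interaction nudges the user's profile,
# and the feed is ranked by affinity to what they've already engaged with.
# All names, topics and weights here are hypothetical illustrations.
from collections import Counter

def record_interaction(profile: Counter, topics: list) -> None:
    """Each like, comment or click bumps the weight of a post's topics."""
    for topic in topics:
        profile[topic] += 1

def rank_feed(profile: Counter, posts: list) -> list:
    """Sort candidate posts by how well they match past engagement."""
    return sorted(posts,
                  key=lambda p: sum(profile[t] for t in p["topics"]),
                  reverse=True)

profile = Counter()
record_interaction(profile, ["team_red"])
record_interaction(profile, ["team_red", "local_sports"])

posts = [
    {"id": "agreeable take",   "topics": ["team_red"]},
    {"id": "dissenting op-ed", "topics": ["team_blue"]},
    {"id": "game recap",       "topics": ["local_sports"]},
]

for p in rank_feed(profile, posts):
    print(p["id"])
# "agreeable take" prints first, "dissenting op-ed" last – and every
# click on the winners pushes the dissenting view further down next time.
```

The loop is self-reinforcing: the ranking determines what you see, what you see determines what you click, and what you click determines the next ranking.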

That's partly why many of the 3,000 Facebook ads that Russian trolls bought to influence the election weren't aimed at promoting Trump directly. They were aimed at inflaming division in American life by focusing on such issues as race and religion.

Facebook has tried to address the fake news problem – partnering with fact-checkers to examine stories, slapping "disputed" tags on suspect claims, putting counterpoints in related-article boxes – but with mixed results.

The recent Knight Foundation/Gallup poll, meanwhile, found that those surveyed believed the broader array of news sources actually made it harder to stay well-informed.

And those who grew up soaking in the brine of social media aren't necessarily better at sorting truth from fiction. Far from it.

"Overall, young people's ability to reason about the information on the internet can be summed up in one word: bleak," Stanford researchers concluded in a 2016 study of over 7,800 students. More than 80 percent of middle-schoolers surveyed didn't know the difference between sponsored content and a news article.

It's why groups like Media Literacy Now have successfully pushed legislatures in states like Washington to put media literacy programs in schools.

That includes teaching students how information is being manipulated behind the scenes, says the organization's president, Erin McNeill.

"With Facebook, for example, why am I seeing this story on the top of the page?" she asks. "Is it because it's the most important story, or is it because of another reason?"

But Facebook's new algorithm threatens to make existing fake news problems even worse, Ingram says. By focusing on friends and family, it could strengthen the filter bubble even further. Rewarding "engagement" can just as easily incentivize the worst aspects of the internet.

You know what's really good at getting engagement? Hoaxes. Conspiracy theories. Idiots who start fights in comments sections. Nuance doesn't get engagement. Outrage does.
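A back-of-the-envelope scorer makes the incentive obvious. The weights and numbers below are made up, but they mirror the common pattern of valuing comments and shares over passive likes:

```python
# Hypothetical engagement scorer. Weights and data are invented for
# illustration; note that comment-section fights count the same as
# civil discussion – the scorer only sees volume, not quality.
WEIGHTS = {"likes": 1, "comments": 4, "shares": 8}

def engagement_score(post: dict) -> int:
    return sum(WEIGHTS[signal] * post[signal] for signal in WEIGHTS)

posts = [
    {"title": "Nuanced policy explainer", "likes": 120, "comments": 5,   "shares": 3},
    {"title": "Outrage-bait hoax",        "likes": 90,  "comments": 200, "shares": 150},
]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["title"])
# 2090 Outrage-bait hoax
# 164 Nuanced policy explainer
```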

"Meaningful social interactions" is a hard concept for algorithms to grasp.

"It's like getting algorithms to filter out porn," Ingram says. "You and I know it when we see it. [But] algorithms are constantly filtering out photos of women breastfeeding."

Facebook hasn't wanted to push beyond the algorithm and play the censor. In fact, it's gone in the opposite direction. After Facebook was accused of suppressing conservative news sites in its Trending Topics section in 2016, it fired its human editors. (Today, conspiracy theories continue to show up in Facebook's Trending Topics.)

Instead, to determine the quality of news sites, Facebook is rolling out a two-question survey asking whether users recognize certain media outlets, and whether they find them trustworthy. The problem, as many tech writers pointed out, is that a lot of Facebook users, like Trump, consider the Washington Post and the New York Times to be "fake news."
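A quick sketch shows how easily such a survey can skew. Assume – and this is an assumption, since Facebook hasn't published its formula – that trust is averaged only over users who recognize an outlet:

```python
# Hypothetical tally of a two-question survey (recognize? trust?).
# The aggregation rule and the data are assumptions for illustration.
responses = [
    # (outlet, recognized, trusted)
    ("Washington Post", True,  False),   # "fake news!"
    ("Washington Post", True,  True),
    ("Washington Post", True,  False),   # "fake news!"
    ("FringeNewsDaily", True,  True),    # tiny but devoted audience
    ("FringeNewsDaily", False, None),    # never heard of it: no trust vote
]

def trust_scores(responses):
    tallies = {}
    for outlet, recognized, trusted in responses:
        if recognized:  # only recognizers answer the trust question
            tallies.setdefault(outlet, []).append(1 if trusted else 0)
    return {outlet: sum(votes) / len(votes) for outlet, votes in tallies.items()}

print(trust_scores(responses))
# {'Washington Post': 0.33..., 'FringeNewsDaily': 1.0}
```

Under that rule, a widely recognized paper that millions of partisans call "fake news" can rate below a fringe site that only its own fans have ever heard of.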

The other problem? There are a lot fewer trustworthy news sources out there. And Facebook bears some of the blame for that, too.
