Brooklyn Boro

OPINION: How curated articles could help Facebook fight fake news

December 6, 2016 By Amanda Hoover Christian Science Monitor
Cartoon courtesy of Cagle Cartoons

In an attempt to curtail fake news on its network, Facebook could be eyeing a tool that would push stories from established, reputable sources to its users, according to sources close to the project.

As fake news stories have spread across the internet, some have pointed the blame at social media sites like Facebook, where users have nearly unfettered access to share unchecked information with large audiences. Facebook’s founder and chief executive Mark Zuckerberg has previously argued that the site is meant to serve as a tech company rather than a media entity, and has deflected responsibility for the consequences of fake news posts — many of which favored President-elect Donald Trump over his challenger Hillary Clinton in the 2016 presidential election — onto others.

But now, the company seems to be paving the way for others to follow in a new direction, one that aims to cut back on fake news and provide its users with reliable and valuable content. 

Facebook has designed its latest feature, deemed “Collections,” to mirror the Snapchat “Discover” feature, building partnerships with trusted content producers and featuring their work in a space on the site that users can trust for valid news, sources close to the project told Business Insider. 


The site plans to insert the posts into users’ news feeds, which would give reputable publications direct access to the network’s growing base of users — an option which could be more appealing to publishers than posting from their own pages or buying sponsored posts. 

The initiative seems to stand in contrast to comments Mr. Zuckerberg made less than a month ago, when he argued that it was unlikely that fake news on the social media platform had played a significant role in electing Donald Trump to the presidency.

“I do think that there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way that they did was because they saw some fake news,” Zuckerberg said in the days following the election. “I think if you believe that, then I don’t think you have internalized the message that Trump supporters are trying to send in this election.”

But just a week later, he changed his tune and announced new Facebook policies to help identify and crack down on the spread of fake news, including writing algorithms that can automatically detect false content, placing warning labels on news that may be fake and allowing users to flag content they find suspicious on their own.

“We are raising the bar for stories that appear in related articles under links in News Feed,” Zuckerberg wrote at the time. “A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ad policies like the one we announced earlier this week, and better ad farm detection.” 

An attempt to build “Collections” seems to follow through on that promise. But while the service will provide verified news to consumers on Facebook, it’s unclear whether users will access it, much less accept it.

As the 2016 campaign divided the nation in unprecedented ways, more people turned inward to seek message-based news that aligned with their own viewpoints, feeding into a media-bubble culture in which their own views are supported and echoed back to them. Because many relied on news that fed their internal biases, a significant number of people met the election’s outcome with shock, wondering how Mr. Trump could have defied the predicted victory of Mrs. Clinton.

“Americans are … likely to get what they do know, or think they know, from an echo chamber,” Krista Jenkins, professor of political science at Fairleigh Dickinson University in Teaneck, N.J., previously told The Christian Science Monitor. 

“What’s needed in our discourse is a cross-pollination of ideas and viewpoints so that we begin to turn the tide on the alarming trend of seeing the other side as dangerous and misguided,” she added, “rather than those whose experiences and perspectives lead them to believe different things about where to go and how to get there.” 

While Facebook’s attempt to curate a source for trusted news could be a step in the right direction, breaking news consumers of the pattern could be a more difficult task. Some publications or shows are trying to engage the other side, including the left-leaning “Daily Show,” which last week featured conservative commentator Tomi Lahren of TheBlaze, but experts say that might not be enough to make Americans see the value in getting the whole picture.  

“These bubbles have not been imposed upon the public — it was what the people want,” Paul Levinson, a professor of communication and media studies at Fordham University in New York, previously told the Monitor. “As long as social media continues to provide a very easy forum for these news bubbles … it is not going to stop, and some late-night talk shows are not going to be enough to do that.”


© 2016 The Christian Science Monitor

