
Why Facebook Might Lose War on Fake News


It’s impossible to measure the influence that fake news stories circulating on Facebook had on the presidential election, but Facebook has vowed to do “more” to help weed out fictitious news.

It may be a losing battle, according to Jeff Jarvis, a media critic and professor at City University of New York.

“We’re always going to have lies. We’re always going to have idiocies out there. The only way to deal with that is to try to pump in reliable information, true information, reasonable and fact-based information, so we need more and more of that,” Jarvis told NBC News.


In a Facebook post on Sunday, CEO Mark Zuckerberg reiterated a point he had made earlier in the week at a tech conference: News Feed isn’t perfect, but he doesn’t believe it swayed the election.

With less than 1 percent of stories being deemed “hoaxes,” and only a portion of those political, Zuckerberg said it was “extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”

While Zuckerberg has said the amount of fake news on Facebook is minuscule, it’s not always easy to separate fact from fiction. Some websites post incendiary headlines and have internet addresses that, at first glance, may trick readers into thinking they’re reading a story from a legitimate news outlet.

“We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do. We have made progress, and we will continue to work on this to improve further,” Zuckerberg said.

Then there was Trump’s social media prowess, including the combined tens of millions of followers he boasted about on the campaign trail.

“You could argue that Donald Trump was the ultimate meme, click-bait candidate,” Jarvis said. “He knew how to exploit all of this on Twitter and other social media, and we fell right in with that, so we all bear a responsibility for how this election turned out.”

News Feed relies on various signals, including user feedback, to determine which posts may contain inaccurate information and to then reduce their distribution.
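As a rough illustration of how user-feedback signals could translate into reduced distribution, here is a minimal sketch in Python. The names, threshold, and demotion factor are assumptions made for illustration only and do not reflect Facebook’s actual News Feed code.

```python
# Hypothetical sketch of signal-based down-ranking; names and weights are
# illustrative assumptions, not Facebook's actual implementation.
from dataclasses import dataclass


@dataclass
class PostSignals:
    user_flags: int         # times users flagged the post as a hoax
    impressions: int        # times the post was shown in feeds
    base_rank_score: float  # score the ranker assigned before this check


def adjusted_rank_score(signals: PostSignals,
                        flag_rate_threshold: float = 0.01,
                        demotion_factor: float = 0.5) -> float:
    """Reduce a post's distribution when its hoax-flag rate is high.

    If the share of viewers who flagged the post exceeds the threshold,
    the ranking score is scaled down so the post reaches fewer feeds.
    """
    if signals.impressions == 0:
        return signals.base_rank_score
    flag_rate = signals.user_flags / signals.impressions
    if flag_rate > flag_rate_threshold:
        return signals.base_rank_score * demotion_factor
    return signals.base_rank_score


# Example: a post flagged by 3 percent of viewers gets its score halved.
post = PostSignals(user_flags=300, impressions=10_000, base_rank_score=0.8)
print(adjusted_rank_score(post))  # 0.4
```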

While it’s unclear what concrete steps Facebook plans to take beyond this, Zuckerberg said the company will have to walk a delicate path moving forward to ensure it isn’t stifling differences of opinion.

“I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves,” he said.

