
2016: The Year Facebook Became the Bad Guy


Mark Zuckerberg started 2016 with a cookie cutter message of hope. “As the world faces new challenges and opportunities, may we all find the courage to keep making progress and making all our days count,” he wrote on his Facebook wall on 1 January. He and his wife, Priscilla Chan, had just had their daughter, Max, and had been sharing warm and fuzzy photos of gingerbread houses and their dreadlocked dog Beast over the holiday season.


Then 2016 happened. As the year unfurled, Facebook had to deal with a string of controversies and blunders, including being accused of imperialism in India, censoring historical photos, and livestreaming footage of human rights violations. Not to mention misreported advertising metrics and the increasingly desperate cloning of rival Snapchat’s core features. Things came to a head in November, when the social network was accused of influencing the US presidential election through politically polarized filter bubbles and a failure to tackle the spread of misinformation. The icing on the already unpalatable cake was Pope Francis declaring last week that fake news is a mortal sin.


This was Facebook’s annus horribilis. Mark Zuckerberg must long for the day when his biggest dilemma was deciding which grey T-shirt to wear on his first day back at work.


It wasn’t all bad. None of these controversies made a dent in the bottom line; Facebook had a bumper year for advertising revenue, and the $3bn investment to tackle “all diseases” (no big deal) through the Chan Zuckerberg Initiative was well received.


But this year has revealed how difficult it has become for the social network to stand behind its mission to “make the world more open and connected” when the decisions it makes can be so divisive.



Unprecedented Power


Thanks to its 1.79bn users and how much it knows about them, Facebook rakes in billions in advertising. In the first three quarters of this year, the company made almost $6bn in profit — a big jump from a mere $3.69bn in 2015. “They have perfected advertising in a way that makes it extremely enticing. It’s so easy to place an ad and get immediate results,” said media expert Gordon Borrell, whose analysis suggests that Facebook has taken $1bn away from print publications in the past year. For every new dollar spent by brands online, a whopping 85 cents goes to Facebook and Google at a time when traditional publishers are facing layoffs.


Some believe Facebook has become too big to be regulated effectively.


“We don’t have the right regulatory paradigm for these globe-striding technology giants,” said Carl Miller, research director at the Centre for the Analysis of Social Media at the think-tank Demos. “We treat them like neutral utility companies but they are value-maximising commercial entities.”


Facebook is a monopoly with too much power, argues author and activist Robert McChesney. “When you get companies this big they are not just a threat to democracy, but they are also a threat to capitalism. They suck investment capital and profits away from smaller businesses and screw over the competitive sector.”


He has an extreme solution: if Facebook can’t be regulated effectively, it should be nationalised to ensure it acts in the interest of the public.


McChesney scoffs at the suggestion that Facebook is acting democratically by serving its many users. “That’s self-serving garbage,” he said.


Does it not make a difference that Mark Zuckerberg is a principled CEO with good intentions? Not according to McChesney: “I am sure the people who produced napalm thought they were doing a good service to protect the free world.”



Digital Colonialism


One of 2016’s earliest missteps was Facebook’s mishandling of Free Basics. The company pitched Free Basics as a way to give internet access, and all the wonderful benefits it can unlock, to the world’s poorest people. The catch: it wasn’t real internet access, but a selection of apps and services curated by — and always including — Facebook. In February, India’s telecoms regulator blocked Free Basics for violating the tenets of net neutrality, following a public debate in which Facebook was accused of digital colonialism. It was an expensive and embarrassing blow for the social network, and indicative that not everyone finds its brand of Silicon Valley techno-utopianism palatable. To compound the issue, Facebook board member Marc Andreessen reacted on Twitter with the tone-deaf and contemptuous line: “Anti-colonialism has been economically catastrophic for the Indian people for decades. Why stop now?”


Nitin Pai, director of the Takshashila Institution, an Indian thinktank, and a critic of Free Basics, said: “Facebook and Mark Zuckerberg must take a long, hard look at what are the values it wants to strengthen or weaken in this world… Unlike other multinational firms that merely sell goods and services to people across the globe, Facebook enables connections among them. It cannot take the usual, and usually untenable, ‘we are apolitical’ route to international business.”


Indeed, so political are Facebook’s global expansion plans that the company is said to be working on a “censorship” tool that would allow it to operate in China once again.



Censorship and Accountability


Censorship has been a running theme on Facebook in 2016. Despite insisting it is not a media company and is not in the business of making editorial judgments, Facebook, it seems, is all too happy to censor content when it violates the company’s own policies or when police request its removal. This has led to a number of high-profile blunders in 2016, including the removal in September of the iconic Vietnam war photograph ‘napalm girl’ from a Norwegian journalist’s post and the deletion of a breast cancer awareness video in October. In both cases, human moderators made bad judgment calls that the algorithm then enforced across the site, to widespread criticism.


In August, Facebook deactivated Korryn Gaines’ profile during an armed standoff with police at the request of the Baltimore County police department. Gaines, who was later killed by police, had been posting to the social network after barricading herself inside her apartment and aiming a shotgun at police. The incident highlighted the existence of an emergency request system that police can use to get Facebook to take content down without a court order if they think someone is at risk of harm or death.


Elsewhere, Facebook suspended live footage from the Dakota Access pipeline protests and disabled Palestinian journalists’ accounts; there were also reports it had removed Black Lives Matter activists’ content.


The lack of transparency around this process led a coalition of more than 70 human and civil rights groups to demand that Facebook be more open about its takedown processes, arguing that censorship of user content depicting police brutality at the request of authorities “sets a dangerous precedent that further silences marginalized communities.”


Reem Suleiman of the not-for-profit organisation SumOfUs added: “There’s a lot of doublespeak. Zuckerberg talks about being a human rights defender and champion of civil liberties protection. He hung a Black Lives Matter banner outside of Facebook. These are ideals that the company is claiming to promote, so it’s totally fair to hold them to account.”


Suleiman fears that under Trump’s administration, surveillance and silencing of minorities, particularly Muslims and undocumented immigrants, could become more commonplace. “Facebook has an ethical duty to protect its users,” she said.



Effect on the Election


None of 2016’s controversies has rattled Facebook as much as the criticism that its failure to clamp down on fake news, combined with the way its algorithm places users in polarized filter bubbles, shaped the outcome of the presidential election.


“It’s crazy that Zuckerberg says there’s no way Facebook can influence the election when there’s a whole sales force in Washington DC that does nothing but convince advertisers that they can,” said Antonio García Martínez, who used to work in Facebook’s advertising sales department. “We used to joke that we could sell the whole election to the highest bidder.”


In the run-up to the election, misinformation and fake news — such as articles suggesting Hillary Clinton was a murderer or that the pope endorsed Trump — proliferated on social media so feverishly that even Barack Obama said it undermined the political process. Macedonian teenagers built a cottage industry of pro-Trump fake news sites, motivated by the advertising dollars they could accrue if their stories went viral.


Widespread outrage over the issue led to an internal mutiny and an uncharacteristic climb-down from Zuckerberg. Having initially denied any responsibility, he wrote an apologetic post outlining ways the platform would tackle the problem, including building tools to detect and classify misinformation.


This, combined with the cases of censorship, points to the inevitability of Facebook accepting it is a media company and not just a neutral technology platform.


“Mark Zuckerberg is now the front-page editor for every news reader in the world. It’s a responsibility he’s not choosing to accept,” García Martínez said.


Claire Wardle, from First Draft News, thinks that is changing. “They may not have said it yet, but 2016 is the year Facebook recognized they are a publisher.” The company is simply reluctant to admit it because “it’s a nightmare.”


“We’ve never had a global newspaper in 192 countries, with all these different legal and cultural contexts and languages,” she said.


She points out that Facebook has been very diligent at policing the platform for sexual content and bullying, but now has to do the same for misinformation with a combination of expert human judgment and software. It’s not going to be…
