Social Media Has Mixed Effect on Democracies, Says Facebook


FILE - Facebook CEO Mark Zuckerberg speaks in preparation for the Facebook Communities Summit, in Chicago, Illinois, June 21, 2017.

Facebook took a hard look in the mirror with a post Monday questioning the impact of social media on democracies worldwide and saying it has a “moral duty” to understand how it is being used.

Over the past 18 months, the company has faced growing criticism for its limited understanding of how misinformation campaigns and governments are using its service to suppress democracy and make people afraid to speak out.

“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,” wrote Samidh Chakrabarti, Facebook’s product manager of civic engagement.

Since the 2016 U.S. presidential election, Facebook has been looking more critically at how it is being used. Some of what it found raises questions about the company’s long-standing position that social media is a force for good in people’s lives.

In December, in a post titled “Is Spending Time on Social Media Bad for Us?” the company wrote about its potential negative effects on people.

The self-criticism campaign extended to Facebook CEO Mark Zuckerberg’s personal goals. Each year he publicly resolves to reach one personal goal, which in the past included learning Mandarin, reading more books and running a mile every day.

This year, Zuckerberg said his goal is to fix some of the tough issues facing Facebook, including “defending against interference by nation states.”

Foreign Interference

During the 2016 U.S. election, Russian-based organizations were able to reach 126 million people in the U.S. with 80,000 posts, essentially using social media as “an information weapon,” wrote Chakrabarti. The company made a series of changes to make politics on its site more transparent, he wrote.

False News

Facebook is trying to combat misinformation campaigns by making it easier to report fake news and by providing more context about the news sources people see on Facebook.

“Even with these countermeasures, the battle will never end,” Chakrabarti wrote.

One of the harder problems to tackle, he said, is so-called “filter bubbles,” in which people see news and opinion pieces from only one point of view. Critics say some social media sites show people only stories they are likely to agree with, which polarizes public opinion.

One obvious solution – showing people the opposite point of view – doesn’t necessarily work, he wrote. According to many social scientists, Chakrabarti said, seeing contrarian articles makes people dig in even more on their own point of view and creates more polarization.

A different approach is showing people additional articles related to the one they are reading.

Reaction to Facebook’s introspection was mixed, with some praising the company for examining its blind spots. But not everyone applauded.

“Facebook is seriously asking this question years too late,” tweeted Jillian York, director for international freedom of expression for the Electronic Frontier Foundation.
