Facebook, which is striving to rid its platform of fake news and echo chambers, has acknowledged that it cannot guarantee that social media is not harmful to democracy.
In a blog post on Tuesday, Product Manager Samidh Chakrabarti said he is not blind to the damage that the Internet can do to even a well-functioning democracy.
“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t,” Chakrabarti said.
“That’s why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible,” he added.
Facebook CEO Mark Zuckerberg has also pledged to "fix" Facebook in 2018 by reducing hateful content and improving the experience for the platform's more than two billion users.
“This is a new frontier and we don’t pretend to have all the answers. But I promise you that my team and many more here are dedicated to this pursuit,” said Chakrabarti who is responsible for politics and elections products globally.
The 2016 US presidential election brought to the fore the risks of foreign meddling, “fake news” and political polarisation.
“Around the US 2016 election, Russian entities set up and promoted fake Pages on Facebook to influence public sentiment — essentially using social media as an information weapon,” Chakrabarti noted.
Facebook discovered that these Russian actors created 80,000 posts that reached around 126 million people in the US over a two-year period.
“It’s abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society. This was a new kind of threat that we couldn’t easily predict, but we should have done better,” the post further read.
The Russian interference worked in part by promoting inauthentic Pages, so “we’re working to make politics on Facebook more transparent”.
“We’re making it possible to visit an advertiser’s Page and see the ads they’re currently running. We’ll soon also require organisations running election-related ads to confirm their identities so we can show viewers of their ads who exactly paid for them,” Chakrabarti said.
“Finally, we’ll archive electoral ads and make them searchable to enhance accountability,” he added.
Facebook has also made it easier to report false news and, in partnership with third-party fact checkers, now ranks such stories lower in News Feed.
“Once our fact checking partners label a story as false, we’re able to reduce future impressions of the story on Facebook by 80 per cent,” Chakrabarti noted.
One of the most common criticisms of social media is that it creates echo chambers in which people see only viewpoints they already agree with, driving communities further apart.
“A better approach might be to show people many views, not just the opposing side.
“We recently started testing this idea with a feature called Related Articles that shows people articles with a range of perspectives on the news,” Chakrabarti said.