
Social Media Companies Must Respond To The Sinister Reality Behind Fake News
1 Oct, 2017 / 09:53 AM / OMNES News

Source: https://www.theguardian.com

Social media companies such as Facebook and Twitter have begun to share evidence of how their platforms are used and abused during elections. They have developed interesting new initiatives to encourage civil debate on public policy issues and voter turnout on election day.

Computational propaganda flourished during the 2016 US presidential election. But what is most concerning is not so much the amount of fake news on social media as where it might have been directed. False information didn’t flow evenly across social networks. There were six states where the margin between Donald Trump and Hillary Clinton was less than 2% – Florida, Michigan, Minnesota, New Hampshire, Pennsylvania and Wisconsin. If fake news had any real-world consequences, that is where they would appear – where public opinion was evenly split right up to election day.

US voters shared large volumes of polarising political news and information in the form of links to content from Russian sources, WikiLeaks and junk news outlets. Was this low-quality political information distributed evenly over social networks around the country, or concentrated in swing states and specific areas? How much of it was extremist, sensationalist or commentary masquerading as news?

To answer these questions, our team at Oxford University collected data on fake news – though we use the term “junk news” because it is impossible to tell how much fact-checking work went into a story just by reading it. But the junk is often easy to spot: extremist, sensationalist, conspiratorial stories, commentary essays presented as news, or content sourced from foreign governments.

Using self-reported location information from Twitter users, we placed a third of users by state and created a simple index for the distribution of polarising content around the country. First, we found that nationally, Twitter users got more misinformation and polarising, conspiratorial content than professionally produced news. Second, we found that users in some states shared more polarising political news and information than users in others. Average levels of misinformation were higher in swing states than in uncontested states, even when weighted for the relative size of each state’s Twitter user population.
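As a rough illustration only – not the Oxford team’s actual code – a state-level index of the kind described above could be sketched as follows. The input files, column names and the “junk” label are all hypothetical assumptions.

```python
# Minimal sketch of a per-state "polarising content index", assuming a table
# of shared links (with the sharer's self-reported state and a junk/professional
# label) and a table of Twitter user counts by state. All inputs are hypothetical.
import pandas as pd

shares = pd.read_csv("shared_links.csv")            # columns: state, source_type
users = pd.read_csv("twitter_users_by_state.csv")   # columns: state, n_users

per_state = (
    shares.assign(is_junk=shares["source_type"].eq("junk"))
          .groupby("state")["is_junk"]
          .agg(junk_shares="sum", total_shares="count")
          .reset_index()
)

# Weight by the size of each state's Twitter user population so that
# large states do not dominate the comparison.
per_state = per_state.merge(users, on="state")
per_state["junk_ratio"] = per_state["junk_shares"] / per_state["total_shares"]
per_state["junk_per_user"] = per_state["junk_shares"] / per_state["n_users"]

print(per_state.sort_values("junk_per_user", ascending=False).head(10))
```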

Political speech enjoys strong protection in the US, but the balance between safeguarding free speech and preventing election interference has been tipped. There is so much misinformation flowing over social media that it is difficult to imagine US voters are equipped with what they need to make good decisions. Did voters in swing states get the political news and information they needed to make good decisions? Our conclusion: certainly not.

We did similar research during a less controversial election in Germany and found that for every four stories sourced to a professional news organisation, there was one piece of junk. In part, this healthier ratio is because levels of education are high in Germany, and there is public financing for several kinds of professional news organisations. But the voting public in Germany – and its politicians – are panicked even by this level of misinformation.

But the real strain on democracy lies ahead, not behind. Social networks can be fragile in important ways. If the followers of candidates who lost the election begin to unfriend the followers of candidates who won, our social networks will become even more bounded than they already are. Worse, the politicians who won with the backing of junk news are likely to keep generating it, and may be more likely to consult it when the time comes to make big public policy decisions.

What worries us now is that junk news is becoming a vehicle for junk science. Campaigns of misinformation about climate change, the link between smoking and cancer, and the health benefits of inoculating children also spread like wildfire over social networks. Our next project, with the Oxford Martin School, is to work out who is behind campaigns to persuade our political leaders to ignore scientific recommendations.

It is hard to know what a comprehensive solution might be. Part of the explanation for all this involves significant changes in the business of news and generational differences in how young people consume news. But we are at a point where some kind of public policy oversight is needed, and past the point where voluntary initiatives from social media firms are sufficient.

Facebook and Twitter don’t generate junk news but they do serve it up to us. We must hold social media companies responsible for serving misinformation to voters, but also help them do better. They are the mandatory point of passage for this junk, which means they could also be the choke point for it.

There are some obvious ways to fix this without interfering with political speech. In the US, the Uniform Commercial Code, which is a supplement to contract law in many states, could be the place to make both advertisers and social media companies adhere to some basic anti-spam and truth-in-advertising rules. But in all democracies, paid political content on social media platforms should come with clear disclosures. Bots should declare their paymasters and ads should disclose their backers. Social media platforms should be required to file all political advertising and political bot networks with election officials. Bots should be clearly identified to users.
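To make the disclosure idea concrete, here is a purely hypothetical sketch of what a machine-readable political-ad filing might contain. The field names and schema are illustrative assumptions, not any existing platform’s or regulator’s format.

```python
# Hypothetical disclosure record for a political ad or bot campaign, of the kind
# the article argues platforms should file with election officials.
from dataclasses import dataclass, asdict
from datetime import date
from typing import Optional
import json

@dataclass
class PoliticalAdDisclosure:
    ad_id: str
    platform: str
    sponsor: str                        # who paid for the ad (its "backer")
    purchaser_country: str
    amount_spent_usd: float
    run_start: date
    run_end: date
    delivered_by_bot: bool              # automated accounts identify themselves
    bot_operator: Optional[str] = None  # the bot's "paymaster", if applicable

record = PoliticalAdDisclosure(
    ad_id="example-0001",
    platform="ExamplePlatform",
    sponsor="Example PAC",
    purchaser_country="US",
    amount_spent_usd=1500.0,
    run_start=date(2017, 9, 1),
    run_end=date(2017, 9, 30),
    delivered_by_bot=False,
)

# Serialise the filing for submission to an election authority.
print(json.dumps(asdict(record), default=str, indent=2))
```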

Most people, most of the time, don’t use social media for politics. But in the days before a major election or referendum, social media platforms provide the most important source of information in most democracies. How they design for deliberation is now crucial to the success of democracy.