
Major Social Media Platforms Failed to Fight Hate and the Pressure Is Huge!
20 Dec, 2017 / 07:41 pm / OMNES News

Source: https://www.theguardian.com


Committee says there has been a shift in attitudes at Facebook, YouTube and Twitter but ‘we need you to do more’.

Facebook, YouTube and Twitter have been confronted by MPs with evidence that they failed to take down violent, extremist and political hate speech material despite repeated complaints.

Yvette Cooper, the chair of the home affairs select committee, produced examples including a propaganda video from the banned far-right group National Action, which could still be found on Twitter and Facebook despite eight months of complaints.

During the nearly three-hour questioning of the companies’ representatives, she also highlighted a series of tweets, including violent threats against Theresa May and racist abuse of the shadow home secretary, Diane Abbott, which had not been removed despite repeated complaints.

Cooper said the length of the evidence session reflected the MPs’ “immense frustration” at the scale of the problem. She said she recognised that the companies had undergone “a welcome shift in attitudes” since earlier this year and increased their staff and resources devoted to tackling the issue, but said: “We need you to accelerate and need you to do more.”

On Monday, Twitter suspended the accounts of far-right extremists in the US and Britain for violating new rules, including that of Britain First’s deputy leader, Jayda Fransen, three of whose Islamophobic videos were retweeted by Donald Trump.

But the representatives of YouTube, Twitter and Facebook said the removal of terrorist – specifically Isis or al-Qaida – and child abuse material was given a higher priority than far-right extremism and political or religious hate speech.

Cases involving political abuse such as “#kickaTory” or even “#killaTory” were much more difficult and depended on context, the MPs were told on Tuesday. “It would be a big step to take down such material in an automated way without anyone seeing it,” said Simon Milner of Facebook.

The companies stressed that progress had been made in the past six months: the number of content reviewers was rising rapidly – Facebook has more than 7,500 and plans to double that number by the end of 2018 – and they were increasingly relying on automated algorithmic monitoring, rather than user reports, to tackle the challenge.

Sinead McSweeney, a Twitter vice-president, said the company was suspending 10 times the number of accounts it had in the past. She highlighted the policy that came into effect on Monday dealing with violent extremist groups and hateful imagery and symbols.

In the face of repeated questioning by Cooper as to why specific complaints made by her office in the summer about violent threats against May and Abbott had not been removed, McSweeney said “bystander reports” had in the past been treated differently from complaints made by the targets of abuse themselves.

She said bystander reports had previously not been acted upon, but that this had changed: such reports were now being taken more seriously, particularly when they concerned political figures, because of growing hostility towards them.

But her response did not satisfy Cooper, who asked about specific antisemitic and racist tweets directed at MPs that had been raised the last time the social media companies had appeared before the committee. “What do we have to do to get you to take it down?” she asked.

Cooper moved on to the case of a National Action propaganda video that featured footage of a march in Darlington in 2016. She said it had taken eight months to get the video removed from YouTube despite her repeatedly raising the matter, including with the company’s chief executive.

Nicklas Lundblad, a Google vice-president, apologised for the delay and said there had been a problem with the way the video was renamed. He said 135 National Action videos had been removed, some within a few hours of appearing and with as few as five views: “We will be closing the gap with the help of machines,” he said.

Cooper also complained that the companies’ algorithms were promoting violent extremism through their recommendations to users based on their past searches. She said she had been recommended other white supremacist videos to watch after viewing the National Action video. On Twitter, she had been recommended to follow Britain First’s accounts before they were suspended.

Lundblad told the MPs that YouTube recognised there was a problem of people ending up in “a bubble of hate”, and said it was changing its technology to limit such recommendations.

Facebook and YouTube, pressed on why they had not yet followed Twitter’s lead and banned Britain First from their platforms, both responded that they were considering such a move. But Facebook’s Simon Milner said Britain First was a registered political party and the company had to be “very cautious” when it came to banning political speech.