Social media firms must face heavy fines over extremist content – MPs

An inquiry by the Commons home affairs committee condemns technology companies for failing to tackle hate speech

YouTube, which is owned by Google, has come under fire for failing to prevent paid adverts from appearing next to extremist videos.
Photograph: Richard Vogel/AP

Social media companies are putting profit before safety and should face fines of tens of millions of pounds for failing to remove extremist and hate crime material promptly from their websites, MPs have said.

The largest and richest technology firms are “shamefully far” from taking action to tackle illegal and dangerous content, according to a report by the Commons home affairs committee.

The inquiry, launched last year following the murder of the Labour MP Jo Cox by a far-right gunman, concludes that social media multinationals are more concerned with commercial risks than public protection. Swift action is taken to remove content found to infringe copyright rules, the MPs note, but a “laissez-faire” approach is adopted when it involves hateful or illegal content.

Referring to Google’s failure to prevent paid advertising from reputable companies appearing next to YouTube videos posted by extremists, the committee’s report said: “One of the world’s largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue.”

In Germany, the report points out, the justice ministry has proposed imposing financial penalties of up to €50m on social media companies that are slow to remove illegal content.

“Social media companies currently face almost no penalties for failing to remove illegal content,” the MPs conclude. “We recommend that the government consult on a system of escalating sanctions, to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”

During its investigation, the committee found instances of terror recruitment videos for banned jihadi and neo-Nazi groups remaining accessible online even after MPs had complained about them.

Some of the material included antisemitic hate-crime attacks on MPs that had been the subject of a previous committee report. Material encouraging child abuse and sexual images of children were also not removed, despite being reported by journalists.

Social media companies that fail to proactively search for and remove illegal content should pay towards the costs of the police doing so, the report recommends, just as football clubs are obliged to pay for policing in their stadiums and surrounding areas on match days.

The government, the report says, should consider whether failure to remove illegal material is in itself a crime and, if not, how the law should be strengthened. The thrust of the committee’s arguments suggests social media companies need to be treated as though they are traditional publishers.

Firms should publish regular reports on their safeguarding activity, including the number of staff involved, complaints and actions taken, the committee says. It is “completely irresponsible” that social media companies are failing to tackle illegal and dangerous content and to implement even their own community standards, the report adds.

A thorough review is required of the legal framework controlling online hate speech, abuse and extremism to ensure that the law is up to date, the MPs conclude. “What is illegal offline should be illegal – and enforced – online.”

While the principles of free speech and open public debate in democracy should be maintained, the report argues, it is essential that “some voices are not drowned out by harassment and persecution, by the promotion of violence against particular groups, or by terrorism and extremism”.

Yvette Cooper, the Labour MP who chairs the home affairs committee, said: “Social media companies’ failure to deal with illegal and dangerous material online is a disgrace.

“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.

“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe …

“It is blindingly obvious that they have a responsibility to proactively search their platforms for illegal content, particularly when it comes to terrorist organisations.”

Google, the parent company of YouTube, told the inquiry that it planned to extend its “trusted flagger” programme to identify terrorist propaganda and would invest in improving its alert procedures. It said that it had “no interest” in making money from extremist material.

Facebook also told MPs that it is reviewing how it handles violent videos and other objectionable material after a video of a murder in the United States remained on its service for more than two hours.

Google, Facebook and Twitter all refused to tell the committee how many staff they employ to monitor and remove inappropriate content.
