The European Commission is demanding social media platforms share illegal content with police amid broader threats of imposing EU-wide legislation to enforce the takedown of such material.
In a closed-door meeting in Brussels on Tuesday (9 January) between several EU commissioners and some 20 firms, the commission also demanded swifter removals. EU home affairs commissioner Dimitris Avramopoulos said removals should not take more than two hours.
“Information on removed content should be shared with law enforcement so it can be used as evidence in investigations,” he said, alongside commissioners Andrus Ansip, Elzbieta Bienkowska, Vera Jourova, Julian King and Mariya Gabriel. Their demand is part of a broader effort by the EU to press the firms into taking swifter action against online content deemed to incite hatred, violence and terrorism.
But what constitutes such an incitement to hatred, violence and terrorism is often open to interpretation. The EU commission, in a paper last September, said “what is illegal offline is also illegal online”, although national rules on similar content may differ. Internet platforms will have to look at individual member state rules, understand and apply their respective case law on things like hate speech, and then decide if the content should be removed.
Europol, the EU police agency, already has a special unit designed to refer such content to internet service providers. But how many referrals have led to investigations is unknown, as the agency does not keep such data. Critics say US firms at risk of fines are instead more likely to remove questionable content regardless of whether it is actually illegal.
“Facebook and Twitter will censor legal material because they are scared of fines,” the London-based NGO, Open Rights Group, told the BBC in December. The firms, which broadly oppose having to police the web, instead want clear regulations, according to EU justice commissioner Jourova.
“Many of them [Silicon Valley] told me that we do not feel ourselves comfortable by being those who decide on this,” she told reporters in September.
8,000 tweets per second
The task is large. Every second, around 8,000 messages are posted on Twitter, over 72,000 people view a YouTube video, and another 820 images are uploaded to Instagram. Earlier this month, Titanic, a German satirical magazine, had its Twitter account banned after poking fun at a right-wing AfD member, raising questions about freedom of expression.
Twitter, along with other social media platforms, has faced a potential €50 million fine in Germany since the start of the year if content that violates German hate speech laws is not removed within 24 hours. Last September, the French ministry of interior had also ordered two Indymedia websites to remove content deemed a “provocation to terrorism”. The same content had been published in more mainstream media outlets but without recrimination from the ministry.
But EU security commissioner King, in a tweet, said more action is needed by the companies. “Today we discussed, with industry, the need for faster action. If possible on a voluntary basis – but, if necessary we’ll look at further steps,” he said.
The commission’s paper on illegal content last year, known as a communication, was widely criticised by MEPs, who said its ideas on automatic detection of bad content risked undermining the rule of law. The paper also followed the adoption of an EU terrorism directive, which leaves concepts like the definition of indirect incitement to terrorism open to national interpretation.
“The respect for human rights is not an obstacle to security, it is a route towards stronger and better security,” Michael O’Flaherty, the director of the EU Fundamental Rights Agency, told MEPs on Monday.