Online terrorist content removal rules agreed

The European Parliament has approved measures compelling online content hosts to remove terrorist content within one hour of receiving a removal order.

The proposal, aimed at preventing radicalisation and boosting public security, would see internet platforms which “systematically and persistently” fail to remove terrorist content face sanctions of up to four per cent of their annual global turnover. Content hosts operating in the EU, such as Facebook or YouTube, will receive removal orders from the relevant national authority when user-uploaded content identified as terrorist has been detected; they will then have one hour from receipt of the order to either remove the content or disable access to it for EU-based users.

In a measure aimed at helping smaller hosting platforms, competent authorities will be encouraged to contact companies which have never previously received a removal order at least 12 hours before issuing their first one, providing them with information on best practice procedures and removal deadlines. MEPs agreed that content shared for educational, research and reporting purposes could be exempt from the restrictions, and sought to distinguish between content glorifying or promoting terrorism and content expressing “polemic or controversial” opinions. Platforms will not be expected to systematically monitor the content shared on their services, nor to actively search for or investigate offending content.

Rapporteur Daniel Dalton said: “There is clearly a problem with terrorist material circulating unchecked on the internet for too long. This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively. Any new legislation must be practical and proportionate if we are to safeguard free speech. Without a fair process, there is a risk that too much content would be removed, as businesses would understandably take a ‘safety first’ approach to defend themselves. It also absolutely cannot lead to a general monitoring of content by the back door.”
