The EU will propose a law forcing social media platforms to remove terrorist content within 1 hour or face fines
The Financial Times reported on Sunday that the EU is preparing a new regulation that will obligate social media websites to remove extremist and other terror-related content from their platforms within 60 minutes or pay fines.
Similar rules were discussed back in March, but the terms proposed then were far more flexible and took the form of a recommendation. Julian King, the EU’s commissioner for security, explained that the new rules will make content removal more efficient and will apply to all websites regardless of their size and number of users.
Last year’s report by the UK policy think tank Policy Exchange, titled “The New Netwar,” was probably a wake-up call for EU politicians. It explained how, and with what intensity, terrorists are using social media platforms.
“While Telegram exists as a ‘safe haven’ for jihadists, they have not abandoned other platforms such as Twitter, Facebook and YouTube. Twitter accounts for 40% of the identifiable traffic to jihadist content online.”

Most of these platforms have been trying to use AI as a removal tool, but, as reported earlier, that has not delivered the expected results, and once the 60-minute rule is in place the big social media websites will be in more trouble than they would like. On the other hand, the added pressure will probably speed up development, and with the help of third-party platforms they just might meet the deadline every time.
A senior EU official told the Financial Times that while the new regulations are still only a draft, they will most likely include the one-hour deadline. User reports won’t count, but if content flagged by a law enforcement agency is not removed within 60 minutes, fines will have to be paid.
EU officials are clearly unhappy with how the tech giants are handling extremist content. “We cannot afford to relax or become complacent in the face of such a shadowy and destructive phenomenon. The difference in size and resources means platforms have differing capabilities to act against terrorist content and their policies for doing so are not always transparent. All this leads to such content continuing to proliferate across the internet, reappearing once deleted and spreading from platform to platform,” King told the Times. The pressure is piling up, and social networks will have to work towards a common goal now more than ever. Facebook recently admitted that a lot of content posted long ago is only now being removed retroactively.
“We remove not just new content but old material too. We have built specialized techniques to surface and remove older content. Of the terrorism-related content we removed in Q1 2018, more than 600,000 pieces were identified through these mechanisms. We intend to do this for more content in the future. In Q1 2018 our historically focused technology found content that had been on Facebook for a median time of 970 days. (From a measurement perspective, we do not think a combined measure from both the contemporary tools and those designed to find old content is particularly useful for understanding the situation today, but for curious readers it’s about 52 hours.)”

These numbers will have to come down dramatically if the social media giant is to meet the EU’s deadline of just one hour.