The roles and responsibilities of online intermediaries
Why focus on the roles and responsibilities of online intermediaries?
Because in many Western countries, lawmakers are constantly trying to regulate the activities of these actors. For example, they are asked to police the content they host: fake news, harassment, glorification of terrorism, and many other types of messages have drawn attention to the messengers and highlighted the interest some might have in conveying certain content rather than other content.
Prosecuting the actual wrongdoer may be entirely illusory or ineffective, either because it is impossible to identify or apprehend him in a distant country, or because he is plainly insolvent. It is in this context that legal actions have been brought against the "intermediaries", who are generally closer to the plaintiff and more solvent.
Who are these intermediaries?
In general, intermediaries can be considered as services that host, transmit, index or provide access to content, products and services created by third parties on the Internet. For example, this may include social media platforms such as Facebook and Twitter, blogging platforms and discussion forums, but also potentially search engines such as Google.
What is intermediary liability?
Intermediaries’ liability refers to an approach to Internet regulation in which online intermediaries are held legally responsible for the illegal or harmful content created by their users.
France has opted for quasi-immunity for these intermediaries
Article 6-I-2 of the Law for Trust in the Digital Economy (LCEN) provides that hosts "cannot be held civilly liable for activities or information stored at the request of a recipient of those services if they did not have actual knowledge of their unlawful character, or of facts and circumstances making that character apparent, or if, from the moment they had such knowledge, they acted promptly to remove the data or to make access to it impossible". In other words, a host, once alerted to illegal content, must act promptly to remove it or to block access to it. The GAFA can be sued, but a plaintiff must first overcome a discouraging procedural complexity, and these companies continue to benefit from a lightened liability regime.
Germany’s choice for effective liability
A German law, in force since January 1, 2018, has taken an extreme step by requiring social networks to remove "manifestly illegal" content within 24 hours, on pain of fines of up to 50 million euros. The first enforcement immediately illustrated the difficulties raised by such a system: Twitter deleted a post by a member of the far-right party (AfD) and temporarily suspended his account, which provoked a debate on whether the message was actually illicit and on the legitimacy of the censorship.
The risk of infringing freedom of expression is a major issue and has drawn many criticisms. Many warned that forcing intermediaries to filter their content would inevitably lead to a situation in which platforms become unofficial arbiters of online speech, determining the limits of acceptable expression and removing any content they deem offensive. Overzealous filtering can lead to censorship and the deletion of perfectly legal content. A stricter liability regime would potentially lead platforms and services, fearing for their own legal exposure, to suppress a growing amount of legitimate content.
The European Commission plans to reform the 2000 e-commerce Directive. It is asking whether to switch from a "notice and take down" system to a more demanding "notice and stay down" system, which requires filtering to ensure that content already removed does not reappear online. The stakes are fundamental, if only in terms of the costs such filtering imposes on hosts, and therefore the cost of market entry for small players who cannot always afford filtering tools, and of whether such obligations are democratically justified.
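To make the "notice and stay down" idea concrete: platforms typically implement it by fingerprinting content that was removed after a notice, so that identical re-uploads can be refused automatically. The sketch below is a minimal toy illustration of that mechanism, not any platform's actual system; the class and method names are hypothetical.

```python
import hashlib


class StayDownFilter:
    """Toy 'notice and stay down' mechanism: once content is removed
    after a notice, its fingerprint is stored so that identical
    re-uploads are rejected automatically."""

    def __init__(self):
        self.blocked_fingerprints = set()

    def _fingerprint(self, content: bytes) -> str:
        # Real systems use perceptual hashes for images and video;
        # a cryptographic hash only catches byte-identical re-uploads.
        return hashlib.sha256(content).hexdigest()

    def take_down(self, content: bytes) -> None:
        # Called when a valid notice leads to removal of the content.
        self.blocked_fingerprints.add(self._fingerprint(content))

    def allow_upload(self, content: bytes) -> bool:
        # 'Stay down': re-uploads of removed content are refused.
        return self._fingerprint(content) not in self.blocked_fingerprints


f = StayDownFilter()
post = b"previously removed illegal content"
f.take_down(post)
print(f.allow_upload(post))         # False: re-upload is blocked
print(f.allow_upload(b"new post"))  # True: unrelated content passes
```

The sketch also shows why critics fear over-blocking: a fingerprint match says nothing about context, so lawful quotation or journalistic reporting of the same material would be blocked just as automatically as the original.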