On November 6, a Paris court ordered Google to filter out hyperlinks to images of former F1 boss Max Mosley at an allegedly Nazi-themed sadomasochistic orgy. (TGI Paris, 17e ch., 6 novembre 2013, RG 11/07970, Max Mosley c. Google Inc et Google France)
This judgment sits at the crossroads of privacy rights, freedom of information, and the cooperation duties of internet intermediaries. A forthcoming European Court of Justice judgment in a similar case may bring more clarity on striking the right balance between these rights and values.
THE SADOMASOCHISTIC ORGY
Several years ago, former F1 boss Max Mosley was filmed during a sadomasochistic orgy. In the legal proceedings that followed the publication of this film on the “News of the World” (NoTW) website, Mr. Mosley successfully demanded that the newspaper remove the film on privacy grounds.
However, several images of the event (copied from the video) spread all across the Internet and could, for example, still be found through Google, the world’s biggest search engine. Mr. Mosley therefore sued the search engine in several countries, seeking orders to make these pictures inaccessible.
In a ruling this week, the “Tribunal de grande instance” of Paris indeed ordered Google France to make the photographs concerned inaccessible. The court based its decision on the right to privacy and the right to be forgotten.
Regardless of whether the Streisand effect (trying to ban content from the Internet may make people more likely to share it) will play a role in this matter, the judgment invites several legal and political discussions.
This decision follows an ever-growing European trend of requiring internet intermediaries (ISPs, search engines, domain name registries, hosting providers, etc.) to make certain information inaccessible. Plaintiffs find it more attractive to target Internet intermediaries directly, as bringing an action against the actual distributors of the content runs into barriers stemming from the cross-border nature of the Internet (practical difficulties in tracing the persons responsible for Internet content, legal and practical obstacles to bringing such persons before a court or to enforcing a resulting ruling, lack of harmonisation in legislation, etc.).
This judgment is interesting from several points of view. First, it touches upon the balance between the right to privacy, on the one hand, and the freedom of expression, on the other. Second, it deals with the obligations that can be imposed on Internet intermediaries, including Internet search engines, and whether, and if so to what extent, these should filter information and block content.
PRIVACY IMPACT
The right to privacy (and the right to be forgotten that flows from it) is regulated by European Directive 95/46 (which may be replaced in the coming years by a Regulation), resulting in similar privacy legislation across the EU Member States. This legislation stipulates that personal data which is no longer relevant should be erased or anonymized (the “right to be forgotten”, which the new Regulation, when adopted, will likely detail further). This case is interesting in that Mr. Mosley had already successfully demanded removal of the video from the NoTW website, hence had it “forgotten”, but is now confronted with the fact that images from that video have gone viral. In other words: “the Internet never forgets!”
From a privacy law point of view, Google is not a controller of personal data but merely a processor acting on the instructions of other entities. Since only controllers must erase or anonymize personal data once it is no longer relevant, Google could claim that, under privacy law, it does not even have the right to erase or block these photos, and that Mr. Mosley should direct his claim at the actual distributors of the photographs (which, as we have seen above, poses several legal and practical barriers).
According to the French court, the right of the individual prevails and the Internet intermediary must take appropriate action. Striking the right balance between the right to be forgotten, on the one hand, and an obligation for search engines to erase or block certain information, on the other, is difficult. It raises the question of where we, as a society, want to take this, and many more court cases on this subject are likely to follow in the coming years.
A decision in one of those cases, which may turn out to become the landmark case in this matter, is expected from the European Court of Justice (ECJ) in the coming weeks. That case involves Google Spain in a similar matter, and its outcome will likely set the benchmark for subsequent national and European rulings. Although the ECJ has not yet rendered its decision, its Advocate-General (AG) has issued an opinion according to which “The rights to erasure and blocking of data, provided for in Directive 95/46, do not confer on the data subject a right to address himself to a search engine service provider in order to prevent indexing of the information relating to him personally, published legally on third parties’ web pages, invoking his wish that such information should not be known to internet users when he considers that it might be prejudicial to him or he wishes it to be consigned to oblivion”. The AG’s conclusion is thus the exact opposite of the recent French ruling. Although it is uncertain whether the ECJ will follow the AG’s conclusion, this illustrates the difficulty of the discussion.
This discussion is also closely connected to the question of the extent to which Internet intermediaries can be obliged to proactively search for and block illegal content. European rules exist on this topic as well, more particularly in the legislation on electronic commerce.
FREEDOM OF INFORMATION AND EXPRESSION: LIABILITY AND MONITORING
As with privacy regulations, several rules on electronic commerce are laid down in a Directive (2000/31) in the European Union, so all EU Member States have similar legal provisions in this matter. One of these provisions specifically concerns the liability of Internet intermediaries (mere conduit, caching and hosting providers) and the extent to which they must proactively monitor for illegal or harmful content. These provisions were introduced out of the concern that Internet intermediaries operate at the level of access to information, not at the level of the content of the information as such.
There is no doubt that Google is an Internet intermediary within the meaning of the legislation on electronic commerce. Google therefore falls under the provisions stating that such a service provider cannot be held liable for illegal content transmitted or temporarily stored through its service, as long as its activity is of a “mere technical, automatic and passive” nature, implying that it has neither knowledge of nor control over the information transmitted or stored. Consequently, Google cannot be held liable for illegal content unless it fails to promptly erase or disable access to such content after gaining knowledge of its illegal nature. It remains very unclear, however, to what extent Google must comply with a “notice and takedown” obligation, as a mere notification does not in itself establish that the content is actually illegal.
Additionally, Internet intermediaries are under no general monitoring obligation: there is no general obligation to monitor the information they transmit or store, nor a general obligation to actively seek facts or circumstances indicating illegal activity. A temporary monitoring obligation may, however, be imposed in specific circumstances, where the applicable law provides for such a possibility. The European legislator expressly limited monitoring to this narrow possibility in order to safeguard freedom of information and expression, fearing that national legislators would otherwise impose all kinds of monitoring obligations on Internet intermediaries.
In practice, the line between a general monitoring obligation, which is prohibited, and a specific monitoring obligation (allowed under certain circumstances) is not easy to draw. In any event, the ECJ has ruled on several occasions (the Scarlet and Netlog cases) that national law may not allow Internet intermediaries to be ordered to install a filter that proactively, and without limitation in time, searches for possibly illegal content.
In the aftermath of the French ruling, the question thus arises whether the court’s order is sufficiently specific and, additionally, whether the applicable law allows such a measure to be imposed.
WHAT’S NEXT?
The above discussion shows how difficult it is to erase information once it has gone viral. Even when the source that originally published the information no longer does so, the content may still turn up elsewhere.
Next, the French decision calls for a public debate on the extent to which we allow search engine results to be filtered, and on whether such engines should not simply be neutral conduits of Internet information.
The forthcoming ECJ decision in the Google Spain case may bring further clarity on how to strike this difficult balance.