
AN ACCURATE THUMBNAIL OF EUROPEAN DATA PROTECTION AND SEARCH ENGINE INDEXING?

Published online by Cambridge University Press:  21 September 2023

Type: Case and Comment
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
© The Author(s), 2023. Published by Cambridge University Press on behalf of The Faculty of Law, University of Cambridge

ON 8 December 2022 the Court of Justice of the EU handed down TU, RE v Google LLC (C-460/20, EU:C:2022:962), a judgment exploring the critical issue of what data protection obligations search engines have when indexing potentially inaccurate data and, additionally, their obligations under the same law as regards thumbnail images. This case was the fourth in a series of Grand Chamber interventions on search engine indexing and data protection, all of which have taken as their starting point the analysis found in the seminal Google Spain judgment (Judgment of 13 May 2014, Google Spain, C-131/12, EU:C:2014:317), which to date has unlocked the possibility of data protection remedies in this context for many hundreds of thousands of individuals. TU, RE was especially careful to build on the understandings set out in GC et al. v CNIL (Judgment of 24 September 2019, GC and Others v Commission nationale de l'informatique et des libertés, C-136/17, EU:C:2019:773), which examined search engine indexing as regards the sensitive data rules, as well as related duties where such data could reasonably be considered out of date. Nevertheless, as the significantly different approaches found in the judgment itself and the Advocate General’s Opinion highlight, the application of such an analysis and understandings is by no means straightforward. Developments to date should raise legitimate concerns not only that the judgments here have amounted to setting out a lex specialis from the bench, but also that this exercise has been insufficiently tethered to a detailed exploration of the structure and norms found in European data protection instruments.

TU, RE arose from claims lodged with Google in 2015 by two business associates seeking delisting from name-based search of three articles published by G-LLC which criticised the investment model implemented by their group of companies, as well as of thumbnail images showing them living a luxury lifestyle which had been derived from G-LLC’s site (at [17]). Each individual alleged that the articles contained inaccurate claims and defamatory opinions and, furthermore, that they had been victims of “blackmail” by G-LLC (at [20]). G-LLC had been publicly accused of a general practice of publishing negative reviews of companies and then offering to remove them in exchange for a sum of money (at [18]). Google refused delisting, citing the professional context of the material and arguing that it was unaware of the alleged inaccuracy. The applicants took Google to court but were unsuccessful before the Cologne Regional Court (Landgericht Köln) on 22 November 2017 and in the appeal judgment of the Cologne Higher Regional Court (Oberlandesgericht Köln) of 8 November 2018 (at [22]–[23]). By this time all relevant material had (at least temporarily) ceased to be available on the Internet (at [19]). Nevertheless, the applicants appealed on a point of law before the German Federal Court of Justice (Bundesgerichtshof), which decided on 27 July 2020 to make a reference to the Court of Justice, a reference which Google sought to have declared inadmissible.

Advocate General Pitruzzella’s Opinion (EU:C:2022:271), handed down on 7 April 2022, urged dismissal of Google’s admissibility challenges on the grounds that it had not been shown that the questions referred were merely hypothetical (at [22], [52]). Turning to the delisting of the articles, Pitruzzella adopted Google Spain’s crucial limitations of a search engine’s direct duties to situations where it significantly and additionally affects fundamental rights, and even then to acting only within the framework of its “responsibilities, powers and capabilities” (at [10]), as well as GC et al.’s specification of the latter as regards the sensitive data rules as action “via a verification, under the supervision of the competent national authorities, on the basis of a request by the data subject” (at [15]). No mention was made of GC et al.’s wider construction of duties as regards the data protection principles themselves as requiring concrete action “at the latest on the occasion of the request for de-referencing” (EU:C:2019:773, at [78]). Stress was placed on the need to ensure a balance between conflicting rights and, in particular, on Article 17(3)(a) of the General Data Protection Regulation (GDPR) 2016/679 (OJ 2016 L 119/1), which disables the right to erasure “to the extent that processing is necessary for exercising the right to freedom of expression and information”. Pitruzzella also found that, given their “professional context” and “the importance of information to investors given the high-risk sector in which the applicants operate”, delisting would only be justified if the contested information were false (EU:C:2022:271, at [23]), and that it would be unacceptable, as the applicants proposed, for this to be presumed on the basis of a mere unilateral claim (without evidence) by the data subject (at [38]). Nevertheless, he rejected both Google’s contention that inaccuracy claims were a matter for original publishers alone (at [39]) and a purely reactive understanding of a search engine’s obligations. Instead, although Pitruzzella did find that a data subject would need to provide prima facie evidence of falsity where this was not manifestly impossible or excessively difficult, a search engine in receipt of a valid but inconclusive claim was held to need to carry out checks including, where possible, initiating “rapidly an adversarial debate with the web publisher who initially disseminated the information” (at [45]). A final decision needed to be accompanied by brief reasons, and a refusal to delist would only be possible where either “substantial doubts” remained as to falsity or “the weight of the false information in the context of the publication in question [was] manifestly insignificant and that information [was] not of a sensitive nature” (at [46]). Lastly, Pitruzzella also found that, where indicated to “avoid irreparable harm”, a search engine would need as an interim measure to suspend referencing or to indicate that the truth of some of the information was contested (at [48]). Turning to the thumbnails, Pitruzzella argued that, even when accompanied by a link to the original source, such images had been stripped from their original context and that the search engine “appears to act not only as an intermediary but rather as a content creator” (at [55]). He therefore argued that a delisting decision should take no account of the original context (at [60]) and furthermore stressed the particular substantive importance of the right to the protection of privacy as regards images (at [95]).

The court’s judgment (EU:C:2022:962) agreed with the Advocate General on admissibility (at [47], [87]) and with his answer and substantive analysis regarding thumbnails (albeit noting that any text directly accompanying an image should be taken into consideration) (at [108]). As regards the articles, the court similarly adopted the limitations and specification laid out in Google Spain and GC et al. (at [53]) and accepted that a data subject claiming inaccuracy could only be obliged to produce evidence “reasonably required of him or her”, which could not extend to requiring (even an interim) judicial decision against the original publisher (at [68]). It nevertheless disputed that a search engine could be obliged to carry out any active investigation. It followed that a search engine would only be obliged to delist for inaccuracy where the evidence provided made this obvious (at [73]) and concerned information “not minor in relation to that content as a whole”, or where a judicial decision made against the original publisher held such information to be “at least prima facie, inaccurate” (at [72]). In addition, where a search engine had been made aware that administrative or judicial proceedings concerning alleged inaccuracy had been initiated, it would be obliged to add a “warning concerning the existence of such proceedings” to the results (at [76]).

This case vividly highlights how the case law here is in effect judicially establishing a lex specialis, and how contestable and generally uncertain its contours are. Google Spain itself failed to recognise that its crucial limitations, namely of significant and additional effect and, even then, of action limited by responsibilities, powers and capabilities, are in fundamental conflict with European data protection law’s broad definition of controller and the peremptory duties which thereby follow. The GDPR’s wording, especially as regards data protection by design (art. 25) and prior impact assessment (art. 35), has exacerbated these conflicts. Whilst it is imperative that law is construed in a way which ensures a balance between conflicting rights, legislative provisions should presumptively be applied as written. In contrast, the court’s judgment does great violence to the GDPR scheme as, even in relation to ex post data subject rights, it cites provisions only selectively and without detailed analysis. Thus, whereas emphasis is given to the broadly worded exemption from the right to erasure (art. 17(3)(a)), it is not acknowledged that the right to rectification of inaccurate data (art. 16) and the right to object (art. 21) apply irrespective of this, or that the right to restrict processing is explicitly triggered wherever “the accuracy of personal data is contested by the data subject, for a period enabling the controller to verify the accuracy of the personal data” (art. 18(1)(a)). A hugely resource-rich entity like Google should in any case not find investigative duties an excessive “burden” (at [71]), and it is unclear why, when accuracy is reasonably contested, the “right to have incomplete personal data completed” should apply only when there are ongoing proceedings. In the latter context, when significant inaccuracy is proved, and more generally as regards thumbnail images, the judgment at least preserves the essence of the data protection principles. Future case law should build on this by carefully applying the substantive requirements set out in the GDPR’s restrictions clause (art. 23) when establishing any normative derogations. The judgment’s reach should in any case not be exaggerated, as it applies only where inaccuracy would be the sole basis for delisting (and even then it is acknowledged that judicial or administrative proceedings may order a different result (at [75])). A more exacting approach remains necessary where questionable accuracy is but one element favouring delisting alongside other factors such as the presence of sensitive data, a non-public figure and the passage of time.