Governing Fake News: The Regulation of Social Media and the Right to Freedom of Expression in the Era of Emergency
Published online by Cambridge University Press: 11 October 2021
Abstract
Governments around the world are strictly regulating information on social media in the interest of addressing fake news. There is, admittedly, a risk that the uncontrolled spread of false and misleading news could aggravate the adverse effects of the COVID-19 health emergency. Yet governments may well use health emergency regulation as a pretext for imposing draconian restrictions on the right to freedom of expression and for increasing social media censorship (ie chilling effects). This article examines the stringent legislative and administrative measures that governments have recently put in place, analyses their negative implications for the right to freedom of expression and suggests alternative regulatory approaches in the context of public law. These controversial government policies are discussed in order to clarify why freedom of expression cannot be allowed to be jeopardised in the attempt to manage fake news.

Firstly, the article analyses the legal definition of fake news in academia in order to establish the essential characteristics of the phenomenon (Section II). Secondly, it assesses the legislative and administrative measures implemented by governments at both the international (Section III) and the European Union (EU) level (Section IV), showing how they may undermine a core human right by curtailing freedom of expression. Then, starting from the premise of social media as a “watchdog” of democracy and moving on to the contention that fake news is a phenomenon of “mature” democracy, the article argues that public law already protects freedom of expression and ensures its effectiveness at the international and EU levels through some fundamental rules (Section V). There follows a discussion of the key regulatory approaches; as alternatives to government intervention, self-regulation and, especially, empowering users are proposed as strategies for managing fake news effectively while mitigating the risks of undue interference by regulators in the right to freedom of expression (Section VI). The article concludes by offering some remarks on the proposed solution, in particular by recommending the implementation of reliability ratings on social media platforms (Section VII).
Type: Articles
Copyright: © The Author(s), 2021. Published by Cambridge University Press
Footnotes
I wish to acknowledge the anonymous reviewers for their valuable comments and suggestions on an earlier draft of this article. This article presents the results of the year-long research I carried out under a research fellowship from the University of Turin – Department of Law (supervisor: Dr Valeria Ferraris). The first draft of the article was inspired by the international conference “Covid 19, pandémies et crises sanitaires, quels apports des sciences humaines et sociales?” [COVID-19, pandemics and health crises: what can the humanities and social sciences contribute?] held on 16–17 December 2020 at the École Supérieure de Technologie d’Essaouira. I wish to thank the coordinator of the conference, Professor Mohamed Boukherouk, for inviting me to that important event, and I am also grateful for the suggestions and comments of colleagues at the conference: Mélissa Moriceau, Hilaire Akerekoro, Abdelkader Behtane, Miterand Lienou and Yanick Hypolitte Zambo.
References
1 U Eco, The Prague Cemetery, tr. R Dixon (London, Harvill Secker/New York, Houghton Mifflin 2011).
2 See, eg, A Renda, “The legal framework to address ‘fake news’: possible policy actions at the EU level” (2018) European Parliament, 10–18, also available online at <https://www.europarl.europa.eu/RegData/etudes/IDAN/2018/619013/IPOL_IDA(2018)619013_EN.pdf>.
3 The term “fake news” may be widely recognised in public debate, but some academic and above all policy sources generally advise against using it, recommending “disinformation” instead. While “misinformation” refers to material that is simply erroneous (eg due to error or ignorance), “disinformation” implies an intentional, malicious attempt to mislead.
4 DO Klein and JR Wueller, “Fake news: a legal perspective” (2017) 20(10) Journal of Internet Law 6, also available at SSRN: <https://ssrn.com/abstract=2958790>.
5 M Verstraete et al, “Identifying and countering fake news” (2017) Arizona Legal Studies Discussion Paper no. 17-15, 5–9.
6 B Baade, “Fake news and international law” (2018) 29(4) European Journal of International Law 1358 <https://doi.org/10.1093/ejil/chy071>.
7 C Calvert, S McNeff, A Vining and S Zarate, “Fake news and the First Amendment: reconciling a disconnect between theory and doctrine” (2018) 86(1) University of Cincinnati Law Review 99, at 103 <https://scholarship.law.uc.edu/uclr/vol86/iss1/3>.
8 A Park and KH Youm, “Fake news from a legal perspective: the United States and South Korea compared” (2019) 25 Southwestern Journal of International Law 100–19, at 102.
9 T McGonagle, “Fake news: false fears or real concerns?” (2017) 35(4) Netherlands Quarterly of Human Rights 203–09 <https://doi.org/10.1177/0924051917738685>.
10 H Allcott and M Gentzkow, “Social media and fake news in the 2016 election” (2017) 31(2) Journal of Economic Perspectives 213. However, in defining this concept, they exclude satirical websites such as The Onion (<https://www.theonion.com>), which uses humour and exaggeration to criticise social and political issues. On this point, for a different opinion, see Klein and Wueller, supra, note 4, 6. See also A Bovet and HA Makse, “Influence of fake news in Twitter during the 2016 U.S. presidential election” (2019) 10 Nature Communications 7 <https://doi.org/10.1038/s41467-018-07761-2>.
11 See Allcott and Gentzkow, supra, note 10, 5.
12 N Levy, “The bad news about fake news” (2017) 6(8) Social Epistemology Review and Reply Collective 20 <http://wp.me/p1Bfg0-3GV>.
13 R Rini, “Fake news and partisan epistemology” (2017) 27(2) Kennedy Institute of Ethics Journal E43–E64 <https://doi.org/10.1353/ken.2017.0025>.
14 A Gelfert, “Fake news: a definition” (2018) 38(1) Informal Logic 85 and 108 <https://doi.org/10.22329/il.v38i1.5068>, where “the phrase ‘by design’ is intended to reflect that what is novel about fake news – not only, but especially on electronic social media – is its systemic dimension”.
15 EC Tandoc Jr, WL Zheng and R Ling, “Defining ‘fake news’: a typology of scholarly definitions” (2018) 6(2) Digital Journalism 2.
16 “Fighting Fake News Workshop Report”, 3–4 <https://law.yale.edu/isp/initiatives/floyd-abrams-institute-freedom-expression/practitioner-scholar-conferences-first-amendment-topics/fighting-fake-news-workshop> (last accessed 9 November 2020).
17 First of all, the participants determined that the most salient danger associated with fake news is the fact that it devalues and delegitimises the voices of experts, authoritative institutions and the concept of objective data, all of which undermines society’s ability to engage in rational discourse based upon shared facts. Secondly, some argued that the difficulty of defining fake news raised the attendant risk of overly broad government regulation, while others worried that opening the door to permitting government sanctions against certain kinds of public discourse would grant the government too much power to control speech in areas of public concern.
18 European Commission, “Public Consultation on Fake News and Online Disinformation” (26 April 2018) <https://ec.europa.eu/digital-single-market/en/news/public-consultation-fake-news-and-online-disinformation> (last accessed 10 November 2020). The consultation collected information on: (1) the definition of fake information and its spread online; (2) the assessment of measures already taken by platforms, news media companies and civil society organisations to counter the spread of fake information online; and (3) the scope for future actions to strengthen the quality of information and to prevent the spread of disinformation online.
19 European Commission, “Synopsis Report of the Public Consultation on Fake News and Online Disinformation” (26 April 2018), 6 <https://ec.europa.eu/digital-single-market/en/news/synopsis-report-public-consultation-fake-news-and-online-disinformation> (last accessed 10 November 2020).
20 ibid.
21 House of Commons Digital, Culture, Media and Sport Committee, “Disinformation and ‘Fake News’: Government response to the committee’s fifth report of session 2017–19”, HC 1630 (2018), 2 <https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1630/1630.pdf>, archived at <https://perma.cc/e92s-4ggc> (last accessed 11 November 2020). It should be noted that, to help provide clarity and consistency, the Digital, Culture, Media and Sport Committee recommended that the government not use the term “fake news” and instead use, and define, the words “misinformation” and “disinformation”. On this point, see C Feikert-Ahalt, “Initiatives to counter fake news in selected countries – United Kingdom” 100–08 <https://www.loc.gov/law/help/fake-news/index.php> (last accessed 11 November 2020).
22 See, eg, MR Leiser, “Regulating fake news” (2017), available at <https://openaccess.leidenuniv.nl/handle/1887/72154>.
23 For a definition of “chilling effects”, see L Pech, “The concept of chilling effect: its untapped potential to better protect democracy, the rule of law, and fundamental rights in the EU” opensocietyfoundations.org (4 March 2021) <https://www.opensocietyfoundations.org/uploads/c8c58ad3-fd6e-4b2d-99fa-d8864355b638/the-concept-of-chilling-effect-20210322.pdf> (last accessed 2 September 2021). According to Pech: “From a legal point of view, chilling effect may be defined as the negative effect any state action has on natural and/or legal persons, and which results in pre-emptively dissuading them from exercising their rights or fulfilling their professional obligations, for fear of being subject to formal state proceedings which could lead to sanctions or informal consequences such as threats, attacks or smear campaigns”. Furthermore, he said that “State action is understood in this context as any measure, practice or omission by public authorities which may deter natural and/or legal persons from exercising any of the rights provided to them under national, European and/or international law, or may discourage the potential fulfilment of one’s professional obligations (as in the case of judges, prosecutors and lawyers, for instance)”.
24 Amnesty International, report 2020/21, “The state of the world’s human rights” <https://www.amnesty.org/en/documents/pol10/3202/2021/en/> (last accessed 13 June 2021).
25 ibid, 16.
26 See Mandates of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression and the Special Rapporteur for freedom of expression of the Inter-American Commission on Human Rights, OL BRA 6/2020, 3 July 2020 <https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=25417> (last accessed 16 June 2021).
27 See UN Special Rapporteur, Report on disinformation, A/HRC/47/25, 13 April 2021, 12 <https://undocs.org/A/HRC/47/25> (last accessed 15 June 2021).
28 ibid.
29 ibid.
30 See Twitter, COVID-19 misleading information policy <https://help.twitter.com/en/rules-and-policies/medical-misinformation-policy> (last accessed 16 June 2021).
31 See Facebook, Advertising policies related to coronavirus (COVID-19) <https://www.facebook.com/business/help/1123969894625935> (last accessed 16 June 2021).
32 See Facebook, Facebook’s Third-Party Fact-Checking Program <https://www.facebook.com/journalismproject/programs/third-party-fact-checking> (last accessed 16 June 2021).
33 See Twitter, Introducing Birdwatch, a community-based approach to misinformation <https://blog.twitter.com/en_us/topics/product/2021/introducing-birdwatch-a-community-based-approach-to-misinformation> (last accessed 16 June 2021).
34 For a statistical analysis also extending to criminal measures to combat fake news during the COVID-19 pandemic, see International Press Institute, COVID-19: Number of Media Freedom Violations by Region <https://ipi.media/covid19-media-freedom-monitoring/> (last accessed 14 June 2021).
35 See, eg, B Qin, D Strömberg and Y Wu, “Why does China allow freer social media? Protests versus surveillance and propaganda” (2017) 31(1) Journal of Economic Perspectives 117–40 <https://doi.org/10.1257/jep.31.1.117>; L Guo, “China’s ‘fake news’ problem: exploring the spread of online rumors in the government-controlled news media” (2020) 8(8) Digital Journalism 992–1010 <https://doi.org/10.1080/21670811.2020.1766986>.
36 See Standing Committee of the National People’s Congress, Cybersecurity Law of the People’s Republic of China, Order No. 53 of the President, 11 July 2016 <http://www.lawinfochina.com/display.aspx?lib=law&id=22826#> (last accessed 11 November 2020). See S Reeves, R Alcala and E Gregory, “Fake news! China is a rule-of-law nation and respects international law” (2018) 39(4) Harvard International Review 42–46.
37 See, eg, L Khalil, “Digital authoritarianism, China and COVID”, Lowy Institute <https://www.lowyinstitute.org/publications/digital-authoritarianism-china-and-covid#_edn42> (last accessed 11 November 2020).
38 Forbes, “Report: China Delayed Releasing Vital Coronavirus Information, Despite Frustration from WHO” (2 June 2020) <https://www.forbes.com/sites/isabeltogoh/2020/06/02/report-china-delayed-releasing-vital-coronavirus-information-despite-frustration-from-who/> (last accessed 12 November 2020).
39 <https://www.hubei.gov.cn/zxjy/rdhy/202001/t20200101_1862068.shtml> (last accessed 10 November 2020).
40 <https://mp.weixin.qq.com/s/YhjV75NwJZO4CyPdepArQw> (last accessed 10 November 2020).
41 Cf. China: Protect Human Rights While Combatting Coronavirus Outbreak <https://mp.weixin.qq.com/s/3dYMFTlvXS-WuEJcZDytWw> (last accessed 12 November 2020).
42 Cf. S.1989 – Honest Ads Act, 19 October 2017 <https://www.congress.gov/bill/115th-congress/senate-bill/1989/text> (last accessed 11 November 2020).
43 R Kraski, “Combating fake news in social media: U.S. and German legal approaches” (2017) 91(4) St. John’s Law Review 923–55; see also Policy Report by G Haciyakupoglu, J Yang Hui, VS Suguna, D Leong and M Rahman, “Countering fake news: a survey of recent global initiatives” (2018) S Rajaratnam School of International Studies 3–22, available at <https://www.rsis.edu.sg/wp-content/uploads/2018/03/PR180416_Countering-Fake-News.pdf>.
44 “Global Engagement Center”, US Department of State: Diplomacy in Action <https://www.state.gov/r/gec/> (last accessed 11 November 2020).
45 See S.2943 – National Defense Authorization Act for Fiscal Year 2017, Congress.Gov <https://www.congress.gov/bill/114th-congress/senatebill/2943/text> (last accessed 11 November 2020).
46 See Senate of the State of California, SB-1424 Internet: social media: advisory group <https://leginfo.legislature.ca.gov/faces/billStatusClient.xhtml?bill_id=201720180SB1424> (last accessed 11 November 2020).
47 Cf. Stigler Committee on Digital Platforms Final Report, 2019 <https://www.publicknowledge.org/wp-content/uploads/2019/09/Stigler-Committee-on-Digital-Platforms-Final-Report.pdf> (last accessed 26 August 2021).
48 Cf. 2020 Economic Report of the President, 20 February 2020 <https://trumpwhitehouse.archives.gov/wp-content/uploads/2020/02/2020-Economic-Report-of-the-President-WHCEA.pdf> (last accessed 26 August 2021).
49 Cf. L Zingales and FM Lancieri, “The Trump Administration attacks the Stigler Report on Digital Platforms” <https://promarket.org/2020/02/21/the-trump-administration-attacks-the-stigler-report-on-digital-platforms/> (last accessed 26 August 2021).
50 Cf. N Chilson, “Creating a new federal agency to regulate Big Tech would be a disaster” <https://www.washingtonpost.com/outlook/2019/10/30/creating-new-federal-agency-regulate-big-tech-would-be-disaster/> (last accessed 26 August 2021).
51 Cf. L Zingales and F Scott Morton, “Why a new digital authority is necessary” <https://promarket.org/2019/11/08/why-a-new-digital-authority-is-necessary/> (last accessed 26 August 2021).
52 New York Times, “Pence will control all coronavirus messaging from health officials” (27 February 2020) <https://www.nytimes.com/2020/02/27/us/politics/us-coronavirus-pence.html> (last accessed 10 November 2020).
53 On Russian fake news legislation, see, eg, O Pollicino, “Fundamental rights as bycatch – Russia’s anti-fake news legislation” (VerfBlog, 28 March 2019) <https://verfassungsblog.de/fundamental-rights-as-bycatch-russias-anti-fake-news-legislation/> DOI: 10.17176/20190517-144352-0 (last accessed 12 November 2020).
54 See the Criminal Code of the Russian Federation, Art 207.1 “Public dissemination of knowingly false information about circumstances that pose a threat to the life and safety of citizens” <https://rulaws.ru/uk/Razdel-IX/Glava-24/Statya-207.1/>.
55 ibid, Art 207.2 “Public dissemination of knowingly false socially significant information, which entailed grave consequences”.
56 See the UK House of Commons Digital, Culture, Media and Sport Committee, “Fake news” inquiry, 30 January 2017 <https://old.parliament.uk/business/committees/committees-a-z/commons-select/culture-media-and-sport-committee/news-parliament-2015/fake-news-launch-16-17/> (last accessed 12 November 2020).
57 See the House of Commons, Digital, Culture, Media and Sport Committee, Disinformation and “fake news”: Interim Report Fifth Report of Session 2017-19, HC 363, 29 July 2018 <https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/363.pdf> (last accessed 13 November 2020).
58 See the House of Commons, Digital, Culture, Media and Sport International Grand Committee Oral evidence: Disinformation and “fake news”, HC 363, 27 November 2018 <http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/disinformation-and-fake-news/oral/92923.pdf> (last accessed 13 November 2020).
59 See Members of the “International Grand Committee” on Disinformation and “Fake News”, the Declaration on the “Principles of the law governing the Internet”, 27 November 2018 <https://old.parliament.uk/business/committees/committees-a-z/commons-select/digital-culture-media-and-sport-committee/news/declaration-internet-17-19/> (last accessed 13 November 2020).
60 See the Parliament of Australia, Australian Electoral Commission, Striving to safeguard election, 2 March 2019 <https://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22media%2Fpressclp%2F6529492%22> (last accessed 18 November 2020).
61 See Australian Electoral Commission (AEC), Electoral Integrity Assurance Taskforce, (last update) 10 July 2020 <https://www.aec.gov.au/elections/electoral-advertising/electoral-integrity.htm> (last accessed 18 November 2020).
62 See the Parliament of Australia, Department of Home Affairs, ASIO, AFP call-up for by-elections, 9 June 2018 <https://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;query=Id%3A%22media%2Fpressclp%2F6016585%22> (last accessed 18 November 2020).
63 See the Australian Government, National Security Legislation Amendment (Espionage and Foreign Interference) Act 2018 No. 67, 2018, 10 December 2018 <https://www.legislation.gov.au/Details/C2018C00506> (last accessed 18 November 2020).
64 See Australian Electoral Commission (AEC), Encouraging voters to “stop and consider” this federal election, (last update) 15 April 2019 <https://www.aec.gov.au/media/media-releases/2019/04-15.htm> (last accessed 18 November 2020).
65 See the Australian Competition & Consumer Commission (ACCC), Digital platforms inquiry, 4 December 2017 <https://www.accc.gov.au/focus-areas/inquiries-ongoing/digital-platforms-inquiry> (last accessed 18 November 2020).
66 See the Australian Competition & Consumer Commission (ACCC), Digital platforms inquiry, Preliminary Report, 10 December 2018 <https://www.accc.gov.au/focus-areas/inquiries-ongoing/digital-platforms-inquiry/preliminary-report> (last accessed 18 November 2020).
67 Cf. Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Act 2021, No 21/2021 <https://www.legislation.gov.au/Details/C2021A00021> (last accessed 27 August 2021).
68 See Standing Committee on Access to Information, Privacy and Ethics, Breach of personal information involving Cambridge Analytica and Facebook, Reports and Government Responses. On this point, two reports can be consulted: (1) Report 16: Addressing Digital Privacy Vulnerabilities and Potential Threats to Canada’s Democratic Electoral Process, 14 June 2018; and (2) Report 17, Democracy under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly, 6 December 2018 <https://www.ourcommons.ca/Committees/en/ETHI/StudyActivity?studyActivityId=10044891> (last accessed 23 November 2020).
69 See Committee on Access to Information, Privacy and Ethics, Report 16: Addressing digital privacy vulnerabilities and potential threats to Canada’s democratic electoral process, 14 June 2018 <https://www.ourcommons.ca/DocumentViewer/en/42-1/ETHI/report-16/> (last accessed 24 November 2020).
70 See Committee on Access to Information, Privacy and Ethics, Report 17, Democracy under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly, 6 December 2018 <https://www.ourcommons.ca/DocumentViewer/en/42-1/ETHI/report-17> (last accessed 25 November 2020).
71 See Assemblée Nationale, Modification de la loi du 31 mai 2018 No. 25 (Code pénal) [Amendment of the Law of 31 May 2018, No. 25 (Penal Code)] <https://perma.cc/LQH4-72W4> (last accessed 28 November 2020).
72 See Republic of Singapore, Protection from Online Falsehoods and Manipulation Act 2019, 25 June 2019 <https://sso.agc.gov.sg/Acts-Supp/18-2019/Published/20190625?DocDate=20190625> (last accessed 28 November 2020). See M Zastrow, “Singapore passes ‘fake news’ law following researcher outcry” (Nature, 15 May 2019) <https://doi.org/10.1038/d41586-019-01542-7>. See also, for a comparative law perspective, K Han, “Big Brother’s regional ripple effect: Singapore’s recent ‘fake news’ law which gives ministers the right to ban content they do not like, may encourage other regimes in south-east Asia to follow suit” (2019) 48(2) Index on Censorship 67–69.
73 Amnesty International, supra, note 24, 184.
74 See J Kalbhenn and M Hemmert-Halswick, “EU-weite Vorgaben zur Content-Moderation auf sozialen Netzwerken” [EU-wide guidelines on content moderation on social networks] (2021) 3 ZUM – Zeitschrift für Urheber- und Medienrecht 184–94. See also N Gielen and S Uphues, “Regulierung von Markt- und Meinungsmacht durch die Europäische Union” [Regulation of market and opinion power by the European Union] (2021) 14 Europäische Zeitschrift für Wirtschaftsrecht 627–37; J Kühling “Fake news and hate speech – Die Verantwortung von Medienintermediären zwischen neuen NetzDG, MStV und DSA” [Fake news and hate speech – the responsibility of media intermediaries between new NetzDG, MStV and DSA] (2021) 6 ZUM – Zeitschrift für Urheber- und Medienrecht 461–72.
75 Cf. Medienstaatsvertrag (MStV), official German version, especially § 18 Abs. 3; § 19; § 84; § 93/94 <https://www.rlp.de/fileadmin/rlp-stk/pdf-Dateien/Medienpolitik/Medienstaatsvertrag.pdf> (last accessed 28 August 2021).
76 Cf. Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (last accessed 28 August 2021).
77 Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018, amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities <https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32018L1808&from=EN> (last accessed 28 August 2021).
78 Cf. Interstate Treaty on Broadcasting and Telemedia (Interstate Broadcasting Treaty) <https://www.die-medienanstalten.de/fileadmin/user_upload/Rechtsgrundlagen/Gesetze_Staatsvertraege/RStV_22_english_version_clean.pdf> (last accessed 28 August 2021).
79 In German media law scholarship, see the recent report by B Holznagel and JC Kalbhenn, “Monitoring media pluralism in the digital era: application of the media pluralism monitor in the European Union, Albania, Montenegro, the Republic of North Macedonia, Serbia & Turkey in the year 2020 – country report: Germany”, Robert Schuman Centre, Issue 2021.2823, July 2021, available online at <https://cadmus.eui.eu/bitstream/handle/1814/71947/germany_results_mpm_2021_cmpf.pdf?sequence=1> (last accessed 28 August 2021).
80 For Germany and disinformation, cf. the judgment of the Federal Constitutional Court (Bundesverfassungsgericht) of 18 July 2018 strengthening the online activities of public service media, 79–81 <https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/2018/07/rs20180718_1bvr167516en.html> (last accessed 28 August 2021).
81 Act to Improve the Enforcement of Rights on Social Networks, 1 September 2017 <http://www.gesetze-im-internet.de/netzdg/NetzDG.pdf>, English version archived at <https://perma.cc/BAE2-KAJX> (last accessed 29 November 2020). On the NetzDG, see M Eifert, “Evaluation des NetzDG im Auftrag des BMJV” [Evaluation of the NetzDG commissioned by the BMJV] <https://www.bmjv.de/SharedDocs/Downloads/DE/News/PM/090920_Juristisches_Gutachten_Netz.pdf?__blob=publicationFile&v=3> (last accessed 28 August 2021). See also the report by H Tworek and P Leerssen, “An analysis of Germany’s NetzDG law” (2019) Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, available at <https://dare.uva.nl/search?identifier=3dc07e3e-a988-4f61-bb8c-388d903504a7>. See also the legal report by J Gesley, “Initiatives to counter fake news: Germany”, available at <https://www.loc.gov/law/help/fake-news/germany.php#_ftnref42> (last accessed 2 December 2020). In academia, see B Holznagel, “Das Compliance-System des Entwurfs des Netzwerkdurchsetzungsgesetzes – Eine kritische Bestandsaufnahme aus internationaler Sicht” [The compliance system of the draft Network Enforcement Act – a critical review from an international perspective] (2017) 8/9 Zeitschrift für Urheber- und Medienrecht 615–24.
82 The amendment intends to enhance the information content and comparability of social media transparency reports and to increase the user-friendliness of reporting channels for complaints about illegal content. In addition, the amendment introduces an appeal procedure for measures taken by the social media company. The powers of the Federal Office of Justice are extended to encompass supervisory powers. Lastly, due to the new requirements of the EU Audiovisual Media Services Directive, the services of video-sharing platforms are included in the scope of the Network Enforcement Act. See Gesetz zur Änderung des Netzwerkdurchsetzungsgesetzes <https://perma.cc/9W8E-GSWM> (last accessed 11 September 2021).
83 See JC Kalbhenn and M Hemmert-Halswick, “Der Referentenentwurf zum NetzDG – Vom Compliance Ansatz zu Designvorgaben” [The draft bill on the NetzDG – from compliance approach to design requirements] (2020) 8 MMR – Zeitschrift für IT-Recht und Digitalisierung 518–22. See also S Niggemann, “Die NetzDG-Novelle – Eine Kritik mit Blick auf die Rechte der Nutzerinnen und Nutzer” [The NetzDG amendment – a critique with a view to users’ rights] (2021) 5 Computer und Recht 326–31; M Cornils, “Präzisierung, Vervollständigung und Erweiterung: Die Änderungen des Netzwerkdurchsetzungsgesetzes” [Clarification, completion and expansion: the amendments to the Network Enforcement Act] (2021) 34 Neue Juristische Wochenschrift 2465–71.
84 See, eg, V Claussen, “Fighting hate speech and fake news. The Network Enforcement Act (NetzDG) in Germany in the context of European legislation” (2018) 3 Media Laws 113–15, available online at <https://www.medialaws.eu/wp-content/uploads/2019/05/6.-Claussen.pdf> (last accessed 28 August 2021).
85 See Mandate of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, 1 June 2017, OL DEU 1/2017, 4 <https://www.ohchr.org/Documents/Issues/Opinion/Legislation/OL-DEU-1-2017.pdf> (last accessed 16 June 2021).
86 It should be noted that the NetzDG did not introduce new obligations but tightened the time frame within which criminal content must be deleted.
87 Regarding other States, on 30 March 2020 Hungary adopted a law making the spread of false or distorted information punishable with up to five years’ imprisonment and, on 4 May 2020, the government issued a decree (the Decree of 4 May 2020 <https://magyarkozlony.hu/dokumentumok/008772a9660e8ff51e7dd1f3d39ec056853ab26c/megtekintes>) introducing derogations from the right to information. The decree limits the application of data subjects’ rights safeguarded under Arts 15–22 of the GDPR in relation to the processing of data conducted by both public and private entities for the purpose of fighting the COVID-19 crisis. On this point, see A Gergely and V Gulyas, “Orban uses crisis powers for detentions under fake news law” (Bloomberg.com, May 2020) <http://search.ebscohost.com/login.aspx?direct=true&db=bsu&AN=143195291&site=ehost-live> (last accessed 14 December 2020); see also The Guardian, “Hungary passes law that will let Orbán rule by decree” (30 March 2020) <https://www.theguardian.com/world/2020/mar/30/hungary-jail-for-coronavirus-misinformation-viktor-orban> (last accessed 14 December 2020). Moreover, Bulgaria’s government used its state of emergency decree to try to amend the criminal code and introduce sanctions for spreading what it deems fake news about the outbreak, punishable by up to three years in prison or a fine of up to €5,000. Furthermore, another bill submitted to Parliament by a ruling coalition party on 19 April 2020 would, if approved, give the Bulgarian authorities more powers to suspend websites for spreading Internet misinformation beyond the immediate health emergency.
88 ibid, 4.
89 See, eg, G Nolte, “Hate-Speech, Fake-News, das ‘Netzwerkdurchsetzungsgesetz’ und Vielfaltsicherung durch Suchmaschinen” [Hate speech, fake news, the “Network Enforcement Act” and assuring diversity through search engines] (2017) 61 ZUM – Zeitschrift für Urheber- und Medienrecht 552–54. For the proposals of the parties, see, eg, the draft act submitted by the Green Party, BT-Drs. 19/5950 <http://dipbt.bundestag.de/dip21/btd/19/059/1905950.pdf>, archived at <http://perma.cc/FBW8-FJDP>.
90 See “Law against the Manipulation of Information”, which also introduced three new articles (L. 112, L. 163-1 and L. 163-2) to the French Electoral Code <https://www.legifrance.gouv.fr/loda/id/JORFTEXT000037847559/2020-12-15/> (last accessed 3 December 2020).
91 R Craufurd Smith, “Fake news, French law and democratic legitimacy: lessons for the United Kingdom?” (2019) 11(1) Journal of Media Law 52–81.
92 French government “Information Coronavirus” <https://www.gouvernement.fr/info-coronavirus> (last accessed 4 December 2020).
93 Libération, “‘Désinfox coronavirus’: l’Etat n’est pas l’arbitre de l’information” [“Désinfox coronavirus”: the State is not the arbiter of information] (3 May 2020) <https://www.liberation.fr/debats/2020/05/03/desinfox-coronavirus-l-etat-n-est-pas-l-arbitrede-l-information_1787221> (last accessed 4 December 2020).
94 On the academic debate see, more generally, Craufurd Smith, supra, note 91.
95 The National Assembly, Bill to fight against hateful content on the internet, 13 May 2020 <https://perma.cc/C7FD-J62S> (last accessed 5 December 2020).
96 FC Bremner, “French fake news law ‘will censor free speech’” (2018) The Times (London, England) 27.
97 Decision No. 2020-801 DC of 18 June 2020, Law to combat hate content on the internet <https://perma.cc/72VE-SMDJ> (last accessed 5 December 2020).
98 See Senate of the Republic, Bill No. 2688 of 7 February 2017 <http://www.senato.it/service/PDF/PDFServer/BGT/01006504.pdf> (last accessed 6 December 2020).
99 See D Kaye, “Mandate of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression” OL ITA 1/2018, 20 March 2018 <https://www.ohchr.org/Documents/Issues/Opinion/Legislation/OL-ITA-1-2018.pdf> (last accessed 7 December 2020).
100 See Senate Act No. 1900 – Establishment of a parliamentary commission of inquiry into the massive dissemination of false information <http://www.senato.it/leg/18/BGT/Schede/Ddliter/53197.htm> (last accessed 7 December 2020).
101 See E Apa and M Bianchini, “Parliament considers establishing an ad-hoc parliamentary committee of inquiry on the massive dissemination of fake news” (2020) 10 Iris 1–3 <https://merlin.obs.coe.int/article/9004>.
102 The Ministry of Health “Beware of Hoaxes” <http://www.salute.gov.it/portale/nuovocoronavirus/dettaglioContenutiNuovoCoronavirus.jsp?lingua=italiano&id=5387&area=nuovoCoronavirus&menu=vuoto> (last accessed 7 December 2020).
103 AGCOM, “Covid-19 for users” <https://www.agcom.it/covid-19-per-gli-utenti> (last accessed 7 December 2020).
104 See, eg, B Ponti, “The asymmetries of the monitoring unit to combat fake news on COVID-19” (La Costituzione.info, 7 April 2020) <https://www.lacostituzione.info/index.php/2020/04/07/le-asimmetrie-dellunita-di-monitoraggio-per-il-contrasto-alle-fake-news-sul-covid-19/#more-6961> (last accessed 7 December 2020).
105 European Commission, “Final report of the High Level Expert Group on Fake News and Online Disinformation” (12 March 2018) <https://ec.europa.eu/digital-single-market/en/news/final-report-high-level-expert-group-fake-news-and-online-disinformation> (last accessed 12 November 2020).
106 For academic debate, see, eg, J Muñoz-Machado Cañas, “Noticias falsas. Confianza y configuración de la opinión pública en los tiempos de internet” [Fake news. Confidence and the configuration of public opinion in the Internet age] (2020) 86–87 El Cronista del Estado Social y Democrático de Derecho 122–39.
107 Order PCM/1030/2020 of 30 October, publishing the Procedure for Action against Disinformation approved by the National Security Council <https://www.boe.es/boe/dias/2020/11/05/pdfs/BOE-A-2020-13663.pdf> (last accessed 13 December 2020).
108 El País, “Spain to monitor online fake news and give a ‘political response’ to disinformation campaigns” (9 November 2020) <https://english.elpais.com/politics/2020-11-09/spain-to-monitor-online-fake-news-and-give-a-political-response-to-disinformation-campaigns.html> (last accessed 14 December 2020).
109 See R Radu, “Fighting the ‘infodemic’: legal responses to COVID-19 disinformation” (2020) 6(3) Social Media + Society 1–4 <https://doi.org/10.1177%2F2056305120948190>. For a non-legal discussion, see S Laato, AN Islam, MN Islam and E Whelan, “What drives unverified information sharing and cyberchondria during the COVID-19 pandemic?” (2020) 29 European Journal of Information Systems 288–305 <https://doi.org/10.1080/0960085X.2020.1770632>.
110 Baade, supra, note 6, 1358.
111 See, eg, the recent contribution by C Marsden, T Meyer and I Brown, “Platform values and democratic elections: how can the law regulate digital disinformation?” (2020) 36 Computer Law & Security Review 1–18 <https://doi.org/10.1016/j.clsr.2019.105373>.
112 See, eg, E Howie, “Protecting the human right to freedom of expression in international law” (2018) 20(1) International Journal of Speech Language Pathology 12–15 <https://doi.org/10.1080/17549507.2018.1392612>.
113 See, eg, E Barendt, Freedom of Speech (2nd edn, Oxford, Oxford University Press 2005); see also K Greenawalt, Fighting Words: Individuals, Communities, and Liberties of Speech (Princeton, NJ, Princeton University Press 1995).
114 See Howie, supra, note 112, 13. See also Human Rights Committee, communication No. 1173/2003, Benhadj v. Algeria, Views adopted on 20 July 2007 <http://www.worldcourts.com/hrc/eng/decisions/2007.07.20_Benhadj_v_Algeria.htm>; No. 628/1995, Park v. Republic of Korea, views adopted on 5 July 1996 <http://www.worldcourts.com/hrc/eng/decisions/1996.07.05_Park_v_Republic_of_Korea.htm> (last accessed 8 January 2021).
115 See United Nations General Assembly, Universal Declaration of Human Rights (UDHR), 10 December 1948, Paris (General Assembly resolution 217 A) <https://www.un.org/en/universal-declaration-human-rights/> (last accessed 8 January 2021).
116 See Human Rights Committee, communication No. 1128/2002, Marques v. Angola, Views adopted on 29 March 2005 <http://hrlibrary.umn.edu/undocs/1128-2002.html> (last accessed 9 January 2021).
117 The ICCPR has been widely ratified throughout the world, with 168 States Parties. Notably, the following States have not signed the ICCPR: Myanmar (Burma), Malaysia, Oman, Qatar, Saudi Arabia, Singapore, South Sudan and the United Arab Emirates.
118 See, eg, M O’Flaherty, “Freedom of expression: Article 19 of the International Covenant on Civil and Political Rights and the Human Rights Committee’s General Comment No 34” (2012) 12(4) Human Rights Law Review 627–54 <https://doi.org/10.1093/hrlr/ngs030>.
119 It should be noted that Art 19 is also limited by another article, Art 20, which prohibits any propaganda of war or any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence.
120 See the UN Human Rights Committee (HRC), CCPR General Comment No. 24: Issues Relating to Reservations Made upon Ratification or Accession to the Covenant or the Optional Protocols thereto, or in Relation to Declarations under Article 41 of the Covenant, 4 November 1994, CCPR/C/21/Rev.1/Add.6 <https://www.refworld.org/docid/453883fc11.html> (last accessed 10 January 2021).
121 UN Human Rights Committee, 102nd session, Geneva, 11–29 July 2011, General Comment No. 34, Art 19: Freedoms of opinion and expression, at para 5 <https://www2.ohchr.org/english/bodies/hrc/docs/gc34.pdf> (last accessed 8 January 2021). General Comment No. 34 is a document adopted by the UN Human Rights Committee in July 2011 that gives States more specific guidance on the proper interpretation of Art 19 of the ICCPR.
122 J Farkas and J Schou, Post-Truth, Fake News and Democracy: Mapping the Politics of Falsehood (New York/London, Routledge 2020).
123 UN Human Rights Committee (HRC), CCPR General Comment No. 25: Article 25 (Participation in Public Affairs and the Right to Vote), The Right to Participate in Public Affairs, Voting Rights and the Right of Equal Access to Public Service, 12 July 1996, CCPR/C/21/Rev.1/Add.7 <https://www.refworld.org/docid/453883fc22.html> (last accessed 11 January 2021).
124 See Human Rights Committee, communication No. 1334/2004, Mavlonov and Sa’di v. Uzbekistan <http://www.worldcourts.com/hrc/eng/decisions/2009.03.19_Mavlonov_v_Uzbekistan.htm> (last accessed 11 January 2021).
125 See supra, note 121, at para 43 <https://www2.ohchr.org/english/bodies/hrc/docs/gc34.pdf> (last accessed 11 January 2021).
126 ibid.
127 International Covenant on Economic, Social and Cultural Rights adopted and opened for signature, ratification and accession by General Assembly Resolution 2200A (XXI) of 16 December 1966, entered into force on 3 January 1976, in accordance with Art 27 <https://www.ohchr.org/en/professionalinterest/pages/cescr.aspx> (last accessed 11 January 2021). The ICESCR has been signed and ratified by 163 States Parties.
128 Art 15(3) of the ICESCR: “The States Parties to the present Covenant undertake to respect the freedom indispensable for scientific research and creative activity”.
129 International Convention on the Elimination of All Forms of Racial Discrimination, adopted and opened for signature and ratification by General Assembly Resolution 2106 (XX) of 21 December 1965, entered into force on 4 January 1969, in accordance with Art 19 <https://www.ohchr.org/en/professionalinterest/pages/cerd.aspx> (last accessed 11 January 2021). The ICERD has been ratified by 177 States.
130 See Art 49 of the TEU <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:12012M049> (last accessed 12 January 2021).
131 The European Convention on Human Rights was opened for signature in Rome on 4 November 1950 and came into force in 1953 <https://www.echr.coe.int/Pages/home.aspx?p=basictexts&c> (last accessed 12 January 2021).
132 Yet we should consider some of the “duties and responsibilities” set forth in Art 10(2) of the ECHR. See the seminal work by D Voorhoof, “The European Convention on Human Rights: The Right to Freedom of Expression and Information restricted by Duties and Responsibilities in a Democratic Society” (2015) 7(2) Human Rights 1–40.
133 The CFR was proclaimed in 2000 and came into force in December 2009. For the official version of the Charter, see <https://eur-lex.europa.eu/eli/treaty/char_2012/oj> (last accessed 13 January 2021).
134 ECtHR, Lingens v. Austria, 8 July 1986; Şener v. Turkey, 18 July 2000; Thoma v. Luxembourg, 29 March 2001; Marônek v Slovakia, 19 April 2001; Dichand and Others v. Austria, 26 February 2002. See JF Flauss, “The European Court of Human Rights and the freedom of expression” (2009) 84(3) Indiana Law Journal 809. See also G Ilić, “Conception, standards and limitations of the right to freedom of expression with a special review on the practice of the European Court of Human Rights” (2018) 1 Godišnjak Fakulteta Bezbednosti 29–40.
135 On this point, see M Eliantonio, F Galli and M Schaper, “A balanced data protection in the EU: conflicts and possible solutions” (2016) 23(3) Maastricht Journal of European and Comparative Law 391–403 <https://doi.org/10.1177/1023263X1602300301>.
136 Bédat v. Switzerland App no 56925/08 (ECtHR GC 29 March 2016) para 48 <http://hudoc.echr.coe.int/eng?i=001-161898>; see also Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland App no 931/13 (ECtHR GC 27 June 2017) <http://hudoc.echr.coe.int/eng?i=002-11555>.
137 Bédat v. Switzerland, supra, note 136, para 48.
138 See Morice v. France App no 29369/10 (ECtHR GC 23 April 2015) para 125; see also Sürek v. Turkey App no 26682/95 (ECtHR GC 8 July 1999) para 61; Lindon, Otchakovsky-Laurens and July v. France App no 21279/02 and 36448/02 (ECtHR GC 22 October 2007) para 46; Axel Springer AG v. Germany App no 39954/08 (ECtHR 7 February 2012) para 90.
139 See, mutatis mutandis, Roland Dumas v. France App no 34875/07 (ECtHR 15 July 2010) para 43; Gouveia Gomes Fernandes and Freitas e Costa v. Portugal App no 1529/08 (ECtHR 29 March 2011) para 47.
140 See E.K. v. Turkey, App no 28496/95 (ECtHR 7 February 2002) paras 79 and 80.
141 See Thoma v. Luxembourg App no 38432/97 (ECtHR 29 March 2001) para 57.
142 See Paturel v. France App no 54968/00 (ECtHR 22 December 2005) para 42.
143 See MM Madra-Sawicka, JH Nord, J Paliszkiewicz and T-R Lee, “Digital media: empowerment and equality” (2020) 11(4) Information 225. The authors argue that empowerment is a process by which powerless people become conscious of their situation, organise collectively to improve it and access opportunities, as an outcome of which they take control over their own lives, gain skills and solve problems. See also N Kabeer, “Resources, agency, achievements: reflections on the measurement of women’s empowerment” (1999) 30 Development and Change 435–64. For this author, empowerment means expanding people’s ability to make strategic life choices, particularly in the context in which this ability had been denied to them. From another point of view, according to N Wallerstein and E Bernstein, “Empowerment education: Freire’s ideas adapted to health education” (1988) 15 Health Education and Behavior 379–94, empowerment is a process that supports the participation of people, organisations and communities in gaining control over their lives in their community and society.
144 Recently, eg, see EK Vraga et al, “Empowering users to respond to misinformation about Covid-19” (2020) 8(2) Media and Communication (Lisboa) 475–79. These authors claim that “if much of the misinformation circulating on social media is shared unwittingly, news and scientific literacy that helps people distinguish between good and bad information on Covid-19 could reduce the amount of misinformation shared”.
145 See, eg, AL Wintersieck, “Debating the truth: the impact of fact-checking during electoral debates” (2017) 45(2) American Politics Research 304–31.
146 For a list of fact-checking websites, cf. “Fake News & Misinformation: How to Spot and Verify”, available at <https://guides.stlcc.edu/fakenews/factchecking>. For an extended analysis of the methodologies of three major fact-checking organisations in the USA, cf. <https://ballotpedia.org/The_methodologies_of_fact-checking>.
147 To take just one example, the French newspaper Le Monde identified and corrected nineteen misleading statements made by Marine Le Pen, the right-wing candidate who reached the runoff of the 2017 French presidential election, during her televised debate against Emmanuel Macron; the article is available at <http://www.lemonde.fr/les-decodeurs/article/2017/05/03/des-intox-du-debat-entre-emmanuel-macron-et-marine-le-pen-verifiees_5121846_4355770.html> (last accessed 26 January 2021).
148 D Lazer et al, “The science of fake news: addressing fake news requires a multidisciplinary effort” (2018) 359(6380) Science 1094–96.
149 D Ariely, Predictably Irrational: The Hidden Forces that Shape our Decisions (New York, Harper Collins 2009).
150 See RS Nickerson, “Confirmation bias: a ubiquitous phenomenon in many guises” (1998) 2(2) Review of General Psychology 175–220.
151 Lazer et al, supra, note 148, 1095.
152 CR Sunstein, Republic.com 2.0 (Princeton, NJ, Princeton University Press 2009) pp 57–58. According to Sunstein’s point of view, the polarisation of a group is a phenomenon whereby individuals, after deliberation with likeminded individuals, are likely to adopt a more extreme position than the one they originally held. From another standpoint, Sunstein also considers fragmentation, arguing that the Internet’s ability to reinforce narrow interests encourages self-isolation, which, in turn, leads to group polarisation. For a discussion of fragmentation and the Internet, see CR Sunstein, “Deliberative trouble? Why groups go to extremes” (2000) 110 Yale Law Journal 71.
153 See, eg, RK Garrett, “Echo chambers online? Politically motivated selective exposure among Internet news users” (2009) 14 Journal of Computer-Mediated Communication 265–85. See also S Flaxman, S Goel and JM Rao, “Filter bubbles, echo chambers, and online news consumption” (2016) 80(S1) Public Opinion Quarterly 298–320.
154 See Z Li, “Psychological empowerment on social media: who are the empowered users?” (2016) 42(1) Public Relations Review 50–52 and the review of the literature described therein. In academia, according to the leading scholarship, empowerment is a multi-level, open-ended construct that includes the individual level (see A Schneider, G Von Krogh and P Jäger, “What’s coming next? Epistemic curiosity and lurking behavior in online communities” (2013) 29(1) Computers in Human Behavior 293–303), the organisational level (see NA Peterson and MA Zimmerman, “Beyond the individual: toward a nomological network of organizational empowerment” (2004) 34(1–2) American Journal of Community Psychology 129–45) and the community level (see MA Zimmerman, “Empowerment theory: psychological, organizational and community levels of analysis” in J Rappaport and E Seidman (eds), Handbook of Community Psychology (New York, Plenum Press 2000) pp 43–63).
155 See European Commission, Report of the independent High Level Group on Fake News and Online Disinformation, “A multi-dimensional approach to disinformation”. According to this Report, platforms should develop built-in tools, plug-ins and applications for browsers and smartphones that empower users to better control their access to digital information. In particular, platforms should consider ways to strengthen users’ control over the selection of the content displayed in search results and/or news feeds; such a system should give users the opportunity to have content displayed according to quality signals. Moreover, content recommendation systems that expose different sources and different viewpoints around trending topics should be made available to users on online platforms, giving them a certain degree of control.
156 However, in German case law, boosting the visibility of government sources has been found to be unlawful under competition law. Specifically, Google had favoured information from the German Ministry of Health by placing it “at the top” of its search engine results. In this regard, the Munich Regional Court (Landgericht München I) provisionally prohibited a cooperation between the Federal Government and the Internet company Google concerning a health portal. As the Regional Court announced, the judges essentially granted two applications for interim injunctions against the Federal Republic, represented by the Federal Ministry of Health, and against the US company Google. The judgments are not yet final, and the Federal Government and Google will first examine the decision. For more details, see the German press release at <https://rsw.beck.de/aktuell/daily/meldung/detail/kooperation-zwischen-gesundheitsministerium-und-google-untersagt> (last accessed 10 September 2021).
157 The EU Code of Practice on Disinformation <https://ec.europa.eu/digital-single-market/en/code-practice-disinformation>.
158 See, eg, M Dando and J Kennedy, “Combating fake news: The EU Code of Practice on Disinformation” (2019) 30(2) Entertainment Law Review 44–42.
159 The European Commission published “First results of the EU Code of Practice against disinformation” (29 January 2019) available at <https://ec.europa.eu/digital-single-market/en/news/first-results-eu-code-practice-against-disinformation> (last accessed 7 March 2021). For more details, see RÓ Fathaigh, “European Union. European Commission: Reports on the Code of Practice on Disinformation” (2019) 3(3) Iris 1–5.
160 See I Plasilova et al, “Study for the assessment of the implementation of the Code of Practice on Disinformation – Final Report”, Publications Office, 2020, available at <https://op.europa.eu/en/publication-detail/-/publication/37112cb8-e80e-11ea-ad25-01aa75ed71a1/language-en>.
161 First baseline reports – Fighting COVID-19 Disinformation Monitoring Programme <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
162 Cf. Google COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
163 Cf. Mozilla COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
164 Cf. Microsoft–LinkedIn COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
165 Cf. Facebook COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
166 Cf. Twitter COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
167 Cf. TikTok COVID-19 report – August 2020 <https://digital-strategy.ec.europa.eu/en/library/first-baseline-reports-fighting-covid-19-disinformation-monitoring-programme> (last accessed 19 June 2021).
168 The Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, “Tackling Online disinformation: A European Approach” COM/2018/236 final <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52018DC0236>.
169 Arguably, from this perspective, it should be noted that German higher courts are beginning to issue the first rulings on fact checking, holding that social media platforms such as Facebook must take the guarantee of freedom of expression into account when applying fact checks to users’ content. In essence, this case law sets a higher threshold for deleting fake news. In this regard, see Der 6. Zivilsenat des Oberlandesgerichts Karlsruhe [The 6th Civil Senate of the Higher Regional Court of Karlsruhe], which is responsible, inter alia, for disputes concerning unfair competition and which issued an urgent decision on 27 May 2020 on the requirements for the presentation of a fact check on Facebook. In particular, the 6th Civil Senate granted the emergency application for an injunction against the defendant’s specific fact-check entry attached to the plaintiff’s post, which was based on an infringement of competition law, and accordingly amended the judgment of the Mannheim Regional Court, which had reached the opposite conclusion. The decisive factor was that the concrete design of the fact-check entry was, in the opinion of the Senate, misleading for the average Facebook user. Specifically, the way the entries were linked on Facebook could be misunderstood to mean that the check and the objections referred to the plaintiff’s reporting, instead of – as was actually the case for the most part in the opinion of the Senate – to the “open letter” on which the plaintiff had merely reported. It should be noted that the 6th Civil Senate did not decide on the legality of fact checks on Facebook in general in these proceedings. A press release on the court’s fact-checking case law is available online in German at <https://oberlandesgericht-karlsruhe.justiz-bw.de/pb/,Lde/6315824/?LISTPAGE=1149727> (last accessed 10 September 2021). See also the German Federal Constitutional Court’s Order of 22 May 2019, 1 BvQ 42/19 on account deletion <https://www.bundesverfassungsgericht.de/SharedDocs/Entscheidungen/EN/2019/05/qk20190522_1bvq004219en.html> (last accessed 30 August 2021). Although the 2019 Federal Constitutional Court decision mainly concerns hate speech, it can also be considered relevant to fact-checking cases, as it addresses the third-party (horizontal) effect of the right to freedom of expression on online platforms holding monopoly-like positions.
170 For an in-depth analysis, see K Klonick, “The Facebook Oversight Board: creating an independent institution to adjudicate online free expression” (2020) 129(8) Yale Law Journal 2418–99.
171 ibid, 2464.
172 See E Douek, “Facebook’s ‘Oversight Board’: move fast with stable infrastructure and humility” (2019) 21(1) North Carolina Journal of Law and Technology 76, available at <https://scholarship.law.unc.edu/ncjolt/vol21/iss1/2>.
173 Klonick, supra, note 170, at 2499.
174 The approach that describes the media as a watchdog is beginning to be seen as insufficient. On this point, see the recent case law of the German Federal Constitutional Court analysing the role of public service media as a counterweight to disinformation.
175 Recently, see I Katsirea, “‘Fake news’: reconsidering the value of untruthful expression in the face of regulatory uncertainty” (2019) 10(2) Journal of Media Law 159–88, which argues against restricting “fake news” in the absence of a “pressing social need”.
176 Recently, to fight disinformation, the Commission has also proposed the Artificial Intelligence (AI) Act, which lays down labelling rules for chatbots and deep fakes and interacts with the DSA (eg Art 52 of the proposed AI Act). On this AI Act, see J Kalbhenn, “Designvorgaben für Chatbots, Deep fakes und Emotionserkennungssysteme: Der Vorschlag der Europäischen Kommission zu einer KI-VO als Erweiterung der medienrechtlichen Plattformregulierung” [Design specifications for chatbots, deep fakes and emotion recognition systems: the European Commission’s proposal for an AI regulation as an extension of media law platform regulation] (2021) 8–9 ZUM – Zeitschrift für Urheber- und Medienrecht 663–74. See also M Veale and FJ Zuiderveen Borgesius, “Demystifying the draft EU Artificial Intelligence Act” (2021) 4 Computer Law Review International 97–112.
177 Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM/2020/825 final, Brussels, 15 December 2020, 2020/0361(COD) <https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM:2020:825:FIN> (last accessed 20 June 2021).
178 EC Communication, “Shaping Europe’s Digital Future” (19 February 2020) <https://ec.europa.eu/info/publications/communication-shaping-europes-digital-future_it> (last accessed 20 June 2020).
179 In this regard, it should be mentioned that the DSA proposal promotes empowering users by providing that recommender systems should offer at least one option not based on profiling (eg Art 29 of the DSA proposal).
180 Recently, see RK Helm and H Nasu, “Regulatory responses to ‘fake news’ and freedom of expression: normative and empirical evaluation” (2021) 21(2) Human Rights Law Review 302–28 <https://doi.org/10.1093/hrlr/ngaa060>. I do not doubt that there are well-founded reasons why the authors argue for implementing criminal sanctions in order to address fake news. However, I am not persuaded by their presentation of “criminal sanctions as an effective regulatory response due to their deterrent effect, based on the conventional wisdom of criminal law, against the creation and distribution of such news in the first place” (cf. p 323, and more generally p 303, where it is said that “this article identifies, albeit counter-intuitively, criminal sanction as an effective regulatory response”). Based on what I have argued in this article, the goal should not be to “worship freedom of expression” (p 326), but to protect it as a fundamental human right. Basically, the use of criminal sanctions might undermine people’s rights more than is necessary to protect other interests, namely national security, public order and public health, and it should therefore be avoided whenever other legal instruments that are less invasive of human rights can be used. Thus, in my opinion, the implementation of criminal sanctions ought to represent an “extrema ratio” for government policies, precisely in order not to jeopardise fundamental human rights.
181 See R Greifeneder, ME Jaffé, EJ Newman and N Schwarz, The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (London, Routledge 2021), especially Part II, pp 73–90. See also Li, supra, note 154, at 51, Section 2.3, “Social media empowerment”. See also JR Anderson, Cognitive Science Series. The Architecture of Cognition (Hillsdale, NJ, Erlbaum 1983).
182 See CR Sunstein, Behavioral Science and Public Policy (Cambridge, Cambridge University Press 2020). See also the seminal book RH Thaler and CR Sunstein, Nudge: Improving Decisions about Health, Wealth and Happiness (London, Penguin 2009).
183 See R Maertens, J Roozenbeek, M Basol and S van der Linden, “Long-term effectiveness of inoculation against misinformation: three longitudinal experiments” (2021) 27(1) Journal of Experimental Psychology 1–16 <https://doi.org/10.1037/xap0000315>. See also S van der Linden and J Roozenbeek, “Psychological inoculation against fake news” in R Greifeneder, M Jaffé, EJ Newman and N Schwarz (eds), The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (London, Routledge 2020) <https://doi.org/10.4324/9780429295379-11>. In addition, see J Cook, S Lewandowsky and UKH Ecker, “Neutralizing misinformation through inoculation: exposing misleading argumentation techniques reduces their influence” (2017) 12 PLoS ONE e0175799 <https://doi.org/10.1371/journal.pone.0175799>.
184 Recently, with respect to reliability ratings in social media, see A Kim, P Moravec and AR Dennis, “Combating fake news on social media with source ratings: the effects of user and expert reputation ratings” (2019) 36(3) Journal of Management Information Systems 931–68, also available at SSRN <https://ssrn.com/abstract=3090355> or <https://doi.org/10.2139/ssrn.3090355>. The authors define and explain some reputation ratings, namely: (1) the expert rating, where expert fact checkers rate articles and these ratings are aggregated to provide an overall source rating related to new articles as they are published; (2) the user article rating, where users rate articles and these ratings are aggregated to provide the source rating on new articles; and (3) the user source rating, where users directly rate the sources without considering any specific articles from the source. On credibility evaluation online, see MJ Metzger, AJ Flanagin and RB Medders, “Social and heuristic approaches to credibility evaluation online” (2010) 60 Journal of Communication 413–39 <https://doi.org/10.1111/j.1460-2466.2010.01488.x>. On the same point, see also BK Kaye and TJ Johnson, “Strengthening the core: examining interactivity, credibility, and reliance as measures of social media use” (2016) 11(3) Electronic News 145–65 <https://doi.org/10.1177/1931243116672262>.
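By way of illustration only, the following minimal Python sketch shows how the three rating schemes summarised above (expert ratings of articles, user ratings of articles and direct user ratings of sources) could be aggregated into a single source-level reliability score. The data, function names and weighting choices are hypothetical assumptions made for this example; they are not drawn from Kim, Moravec and Dennis or from any platform’s actual implementation.

```python
from collections import defaultdict
from statistics import mean

# Purely illustrative sketch: all data, names and weights are hypothetical assumptions.
# Ratings run from 0.0 (unreliable) to 1.0 (reliable). Each tuple: (source, article_id, rating).
expert_article_ratings = [
    ("example-news.org", "a1", 0.9),
    ("example-news.org", "a2", 0.8),
    ("rumor-site.example", "b1", 0.2),
]
user_article_ratings = [
    ("example-news.org", "a1", 0.7),
    ("rumor-site.example", "b1", 0.4),
    ("rumor-site.example", "b2", 0.3),
]
# Direct user ratings of sources, independent of any specific article.
user_source_ratings = {
    "example-news.org": [0.8, 0.9],
    "rumor-site.example": [0.3],
}

def aggregate_by_source(article_ratings):
    """Average article-level ratings per source (schemes 1 and 2)."""
    by_source = defaultdict(list)
    for source, _article_id, rating in article_ratings:
        by_source[source].append(rating)
    return {source: mean(ratings) for source, ratings in by_source.items()}

def combined_source_score(source, weights=(0.5, 0.3, 0.2)):
    """Blend expert, user-article and user-source ratings with arbitrary example weights."""
    expert = aggregate_by_source(expert_article_ratings).get(source)
    user_article = aggregate_by_source(user_article_ratings).get(source)
    user_source = mean(user_source_ratings[source]) if source in user_source_ratings else None
    # Ignore missing components and renormalise the remaining weights.
    available = [(w, c) for w, c in zip(weights, (expert, user_article, user_source)) if c is not None]
    if not available:
        return None
    total_weight = sum(w for w, _ in available)
    return sum(w * c for w, c in available) / total_weight

if __name__ == "__main__":
    for src in ("example-news.org", "rumor-site.example"):
        print(src, round(combined_source_score(src), 2))
```

A real rating system would additionally need to address rater credibility, manipulation and cold-start problems, which the literature cited above discusses.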