
Exploring the Throughput Legitimacy of European Union Policy Evaluation: Challenges to Transparency and Inclusiveness in the European Commission’s Consultation Procedures and the Implications for Risk Regulation

Published online by Cambridge University Press:  03 May 2023

Paul Stephenson*
Affiliation:
Department of Political Science, Faculty of Arts and Social Sciences, Maastricht University, PO Box 616, 6200 MD Maastricht, The Netherlands

Abstract

In its evaluation cycle, the European Commission emphasises the importance of good data and the systematic involvement of a plurality of policy stakeholders, including citizens. Findings from European Union policy evaluation should inform further law-making, encourage learning and provide accountability. Transparent and inclusive formal procedures and tools are seen as essential for securing citizen participation in risk regulation; however, the Commission faces numerous challenges in securing engagement, particularly concerning the complexity of policy issues and the formal procedures for institutionalised consultations. Considering the Commission’s work from a proceduralist perspective, the article engages with Vivien Schmidt’s notion of “throughput legitimacy” to explore recent procedural innovations emerging since the Better Regulation agenda that have sought to enhance accountability, transparency, inclusiveness and openness, ensuring fairer and more balanced input on EU policy performance. The article argues in favour of greater throughput legitimacy in ex post policy evaluation but recognises challenges to the promotion of evaluation tools and their use by citizens.

Type
Articles
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

I. Introduction

What is the value of examining transparency and the participation of citizens in formal processes at the evaluation stage of the policy cycle? Moreover, what can we realistically expect of citizen engagement in the European Commission’s consultation exercises as part of its own “evaluation cycle”? What does this mean for risk regulation? Furthermore, what are the implications of recent developments in Better Regulation for how risks are regulated, particularly when seeking stakeholder input during ex post evaluation?

Evaluation can be defined as “a critical, evidence-based judgement of whether an intervention has met the needs it aimed to satisfy and actually achieved its expected effects. It goes beyond an assessment of whether something happened or not, looking at causality – whether the action taken by a given party altered behaviours and led to the expected changes and/or any other unintended or unexpected changes.” Footnote 1 The Commission has sought to close the policy cycle; that is, to join up ex post evaluation with ex ante impact assessment to ensure that considerations of and findings on earlier policy performance are taken on board to improve future policy.

Transparency and communication are crucial if the citizen is to play an active role in shaping policy and holding the European Union (EU) to account. This is not necessarily easy given the need to explain complex issues related to policy performance in an effective way. In this sense, both the input and output legitimacy of EU policy depend on citizen engagement. Footnote 2 Effective citizen engagement and participation in evaluation exercises are possible only where transparency and access to information are ensured for the public, through both access to documents upon request and proactive publication of information. Footnote 3 Although openness and its corollaries of transparency and participation are foundational values of the EU, theoretical and practical challenges persist and hamper the effective fulfilment of their potential. Footnote 4 Consultation mechanisms should not be captured by any particular group, otherwise participation is biased, as are the results, which subsequently become the inputs to further policymaking.

Transparency and participation go somewhat hand in hand, together increasing the legitimacy and acceptability of decision-making since they allow for the inclusion of a wider variety of values and perspectives. They favour compliance and the implementation of decisions that have been collectively discussed, but securing participation is easier said than done. Footnote 5 Is there an inherent trade-off between transparency and participation in evaluation linked to complex scientific areas of risk regulation? Unless “carefully balanced and built-for-purpose”, engagement mechanisms can become “conduits for de-legitimation”, which can be explained by three factors: contextual determinants, the institutional design of specific engagement arrangements, and organisational rationales and individual preferences.Footnote 6

If evaluations, and therein the constitutive consultation exercises, fail to capture citizen and civil society input, the risk is that the picture painted of a policy or programme in practice will be inaccurate or warped, representing the experience and opinion of only a limited set of stakeholders. This has important implications for participatory democracy and, moreover, for the input, throughput and output legitimacy of the EU. As Schmidt asserts, while output legitimacy is a “performance criterion encompassing policy effectiveness and outcomes”, throughput legitimacy is a “procedural criterion concerned with the quality of governance processes, as judged by the accountability of the policy makers and the transparency, inclusiveness and openness of governance processes”. Schmidt and Wood underline that “good policy results can offset a lack of citizen participation, or vice versa, … bad results matter little if citizens have approved the policy” (ie if policy has been approved through citizen participation). Footnote 7 For throughput legitimacy, however, high-quality governance processes cannot make up for low citizen participation; arguably, high-quality governance processes ensure the inclusiveness of civil society in institutionalised procedures – in this case, evaluation – such as through stakeholder consultations.

If evaluation exercises can be considered “throughput” (here, formal and institutionalised processes of data gathering through consultation), then any means of determining the value and effectiveness of the output depends on ensuring the input of citizens at the evaluation stage; it is citizens, after all, who experience the effects and impacts of policy and for whom policy is designed. With specific reference to risk regulation policy, let us recall Weimer, whose work on agricultural biotechnology argued that the EU’s exclusion of broader societal concerns related to environmental and social sustainability undermined the legitimacy and effectiveness of EU regulation in this area, but also that resistance triggered legal innovations prompting us to rethink EU internal market law. Footnote 8 In short, for effective policy instruments (including legislation, but also non-legislative and soft tools), we need comprehensive and pluralistic evaluations of previous methods and approaches.

Given the limited work on the impact of Better Regulation innovations on risk regulation, this article seeks to contribute to the literature by looking specifically at the implications for participation and transparency in Better Regulation and discussing the impacts. While in the EU law literature extensive attention has been paid to scientific uncertainty (eg from the perspective of judicial review or the governance model, Footnote 9 or the participation of experts in decision-making Footnote 10 ), due reflection on the role of transparency and the participation of civil society is still underdeveloped. Footnote 11 Stakeholder participation is an important tenet for EU policymaking, but the legal literature tends to refer to participation as “a formal or consultative opportunity in regulatory processes, resulting in rather homogeneous institutional arrangements for participation across policy fields and different sets of problems”. Footnote 12

This article thus explores procedural change in the Commission’s evolving approach to policy evaluation and citizen engagement. It considers institutional learning and adaptation as the result of practice, and in so doing identifies the challenges and risks of securing citizen input vis-à-vis industry stakeholders at the evaluation stage. The article asks: how has the European Commission sought to improve transparency and participation at the evaluation stage? What are the tensions and limitations, especially in the area of risk regulation? By extension, it considers the extent to which recent changes to managing policy evaluation have bolstered the “throughput legitimacy” of EU governance and policymaking. Footnote 13 It argues that procedural and institutional innovations have enhanced transparency and participation in EU ex post evaluation – thus reinforcing the throughput legitimacy of the Commission’s governance procedures – but it also argues that challenges remain, particularly regarding the inclusion of citizens and the quality of input in open consultation processes.

Section II provides a political context to – and rationale for – ex post policy evaluation in the EU. Section III discusses how to conceive of evaluation from a governance perspective and explores the notion of throughput legitimacy as a conceptual approach and tool for analysing change in the Commission’s evaluation practice. Section IV analyses recent procedural innovations to critically assess improvements to accountability, transparency and the inclusiveness/openness of processes for citizen engagement. Section V considers the implications of these changes for risk regulation, before Section VI concludes.

II. Background and context

1. Ex post policy evaluation and Better Regulation

As the last stage of the policy cycle, evaluation is intended to assess the effectiveness of policy and related legislation and to determine the value of what has been delivered. Evaluation is seen to have two main purposes: accountability and/or learning.Footnote 14 Accountability can be both political and administrative, holding to account decision-makers, policymakers and those engaged in implementation based on what policy has delivered, while learning can apply to a range of stakeholders – with programme users and desk officers more likely to learn from evaluation findings than political leaders.Footnote 15 “Responsive evaluation” is not only about assessing policy interventions, but is also concerned with formal engagement with all stakeholders about the value and meaning of evaluation practice. Footnote 16 It acknowledges the uneven character of relationships when it comes to inclusion and dialogue and requires all stakeholders to be given a fair chance to engage in evaluation processes. Challenges include asymmetric power relationships between institutions and citizens, the sensitivity (and, arguably, complexity) of issues and the strategic behaviour of certain stakeholders. Footnote 17

Facing the reluctance of the Member States to increase the size of the EU budget, the Commission has, under President Barroso’s “Europe of Results” agenda from 2006, sought to demonstrate the effective delivery of EU policies and legislation, committing itself to “do more with less”. Footnote 18 This partly explains the greater commitment to ex post evaluation to gauge how policy has fared on the ground through better quantitative data, including using more case studies and “success stories” that can be communicated easily to the media and citizens. Arguably, the Commission has realised the need to become more self-reflexive and to devote more resources to assessing the contributions of supranational policymaking to stakeholders at the regional and local level. But how responsive is the Commission’s evaluation practice?

When it comes to organising and conducting the evaluation of legislation, the Commission has evolved from taking an ad hoc and fragmented approach to one that is more broadly harmonised and institutionalised. By the 2010s, the Commission was promoting evidence-based policymaking and transparency with regard to both processes and outputs within an “evaluation system” (ie where evaluation was now a permanent feature with practices institutionalised and findings actually used). Today, the executive talks in terms of openness in policymaking being linked to economic prosperity: “to foster Europe’s recovery, it is of key importance to legislate transparently and as efficiently as possible”. Footnote 19 As explored in Section IV, the Commission has made improvements in several ways Footnote 20 : removing obstacles and red tape by working more closely with local and regional stakeholders; simplifying public consultation by introducing a “call for evidence” in an online portal (“Have Your Say”); using the Better Regulation agenda to assist with the digital transformation; and improving strategic foresight to create future-proof policies in all sectors, with a particular focus on the green, digital, geopolitical and socioeconomic areas. Footnote 21

For too long, ex ante impact assessments were the basis for policymaking; ex post evaluations were often ignored because there was no obligation to conduct them and because they were not statutory parts of the legislative cycle. The Commission now evaluates the impacts of existing laws and policies to ensure that results feed into the design of new legislation. The 2021 Better Regulation communication Footnote 22 reasserts a commitment to evaluation and evidence-based policymaking, emphasising the role of the European Parliament (EP) and Council in this process:

A cornerstone of our better regulation approach is to learn from the past by evaluating existing legislation. Monitoring is crucial in the policy cycle and requires systematic collection of data. Monitoring and review clauses in legislation ensure that the necessary data is collected and evaluated. It is the joint responsibility of the co-legislators to see to it that these provisions are of high quality, so that the effectiveness of EU legislation in the Member States can be properly assessed.

2. Consultation and participation

Public consultation and participation have been explored by a host of scholars, centring on the Commission’s engagement with interest groups and various stakeholder groups and concerned with normative questions of regulatory legitimacy. Footnote 23 Research has explored whether the design of stakeholder consultations actually reinforces or alleviates bias in European governance and decision-making, Footnote 24 as well as what the Commission has done to try to counter the dominant influence of certain groups. It has also explored the nature of stakeholder engagement and citizen participation in rule-making, particularly in drawing up the Better Regulation Agenda, Footnote 25 and how EU advisory bodies and civil society organisations might play an enhanced role. Footnote 26 The European Court of Auditors has explored the effectiveness of consultation instruments and has criticised the Commission’s outreach. Footnote 27

Changes in stakeholder engagement practices raise key questions regarding the implications for regulatory legitimacy. Mechanisms meant to encourage engagement (including at the evaluation stage) are not by default legitimising.Footnote 28 When it comes to “citizen engagement”, we might define “citizen” as the EU citizen who does not have vested commercial or political interests in the policy or issue in question, while “engagement” refers to their mobilisation around common goals and the offering of meaningful opportunities to participate in change. There is a huge difference between “engagement” and “consultation”. The language of consultation is itself political – in trilogues on some technical files, when discussing the obligation to ensure stakeholder consultation, Member States have objected to the use of the word “engagement” (which means two-way communication), while “consultation” implies just listening to stakeholders without the obligation to respond to what they say.

We can understand citizen engagement as instrumental to building deliberative democracy in the EU.Footnote 29 For example, the Water Framework Directive (2000/60) saw strong stakeholder engagement, with well over 250,000 responses being received during the open public consultation (OPC). Notte and Salles argue that such consultative proceedings could “empower citizens as part of water policy making and assessment and could weaken the corporatist and statist forms of management responsible for the deficient implementation of previous environmental laws”.Footnote 30 Two decades later, we have the high-profile example of the Conference on the Future of Europe Footnote 31 – an experiment in multilingual participatory democracy to supplement regular forms of representative democracy, with citizen input collected and policy priorities identified.

While the literature has explored the design and use of consultation instruments, there has been no systematic exploration of how the Commission processes the “conflicting input from various stakeholders”. Footnote 32 Drawing on the literature, Radulova et al assert that while online consultations have managed to attract “a larger and more varied pool of participants”, in practice they are not truly inclusive, instead being dominated by older Member States and industry and business representatives. In complex areas of risk regulation such as genetically modified organisms (GMOs) Footnote 33 and EU chemicals policy, Footnote 34 it has been difficult to engage citizens. Dąbrowska noted that, despite a shift towards more participatory policymaking and a preference for institutionalised forms of cooperation with civil society, there was a lack of methods in place to evaluate citizen involvement in GMO approvals.

III. Analytical framework: throughput legitimacy in theory and practice

Evaluation is an institutionalised practice meant to determine the output (performance) legitimacy of a course of policy action. It is difficult, however, to separate output legitimacy from questions regarding throughput (procedural) legitimacy. Footnote 35 The complex processes of good governance have been analysed by European studies scholars in recent years, who have explored the legitimacy and legitimation of international organisations including the EU. Footnote 36 The concept of throughput legitimacy has been applied in research on the EU administration, democratic processes and institutions such as the Commission and European Parliament. As Schmidt and Wood assert, it can be considered an “umbrella concept” for appraising the “legitimacy of complex processes and procedures occurring within the ‘black box’ of multi-level governance”.Footnote 37 Throughput legitimacy can thus be defined as

the myriad ways in which the policy-making processes work both institutionally and constructively to ensure the efficacy of [multi-level] governance, the accountability of those engaged in making the decisions, the transparency of the information and the inclusiveness and openness to “civil society”.Footnote 38

How are the three types of legitimacy related and how might they themselves be explored? Schmidt and Wood Footnote 39 question these normative foundations and place a focus on “procedure”. Indeed, “proceduralism” can be defined as a belief in the overriding importance of explicitly codified procedures in government, public administration and jurisprudence. Footnote 40 Related conceptions of legitimacy emphasise the need to closely follow those procedures:

In the proceduralist perspective, it is not the substantive content of decisions that renders them legitimate but how they are made. Therefore, when discussing the quality of government and public administration, proceduralists tend to emphasize process features such as participation, deliberation, transparency and accountability of decision-makers. In the field of legal theory, proceduralists such as Jeremy Waldron highlight the procedural elements of law rather than its substantive “virtues”. Footnote 41

In keeping with this, Schmidt and Wood seek to “operationalise” the concept of throughput legitimacy by conceiving of the term through a series of normative building blocks of accountability, transparency, inclusiveness and openness, thereby providing a framework that can be used to analyse institutional reforms but may also be useful for analysing risk regulation.

First, accountability means that an actor or institution provides the information needed to properly discuss and deliberate over a policy in action, and that sanctions are in place to penalise misconduct. Footnote 42 Accountability can be considered an “evaluative political or administrative mechanism”, including “a willingness to act in a transparent, fair, compliant and equitable way”.Footnote 43 However, in supranational governance, accountability forums mainly consist of experts who have the knowledge to assess policy (output) performance; legitimacy is derived through the building of networks of technical actors, although civil society actors are often excluded. The “real world” of accountability involves technical accountability processes that result in data collection and evidence gathering on performance and the production of reports, thus feeding into political accountability processes. The paradox is that the Commission’s increasing attention to internal accountability has not necessarily done much to improve visibility or help it connect psychologically with the public.Footnote 44

Second, while closely related to accountability, transparency means citizens and political representatives have access to information about governance processes, as well as access to the resulting outputs such as decisions or findings. In the EU, this means that institutions make available information about their internal processes. This might refer to the production and publication of documents concerning effectiveness, or even risk, enabling citizens and the media to glean insights into policy implementation and performance. It also refers to the criteria used in internal processes “to structure the decision-making procedures”. Often, increased access to information can lead to less transparency “because citizens find it difficult to navigate the mountains of data available via the internet”. Footnote 45

Third, as Schmidt asserts, inclusiveness and openness can be found in “the intermediation processes through which citizens organised in interest groups have a direct influence on policy making”.Footnote 46 Openness here implies that the public has access to policymakers regarding EU policies with which they are concerned. Practically speaking, this can be considered as the number and type of opportunities individuals and non-state actors have at their disposal to become involved in consultation. Inclusiveness means policymakers being open to all groups in order to ensure balanced representation. It concerns breadth of representation and refers to “procedural inclusiveness”, most often in policy shaping and implementation, but arguably also – and for the purposes of this article – in evaluation.Footnote 47 The danger facing administrative bodies such as the Commission is that particular interests (those with the most resources to mobilise and exert their preferences) “capture” the process and exert more influence.

All of the above normative concerns have implications for participatory and deliberative democracy. Footnote 48 In short, there is no output legitimacy without deliberation over those outputs. That is to say, we can only arrive at a fair, balanced and representative impression of whether the outputs are legitimate (ie how policy performed and its value) based on effectiveness, efficiency and economy if there is fair, balanced and representative deliberation over the said policy – which is to say, if the input provided from those affected by the policy reflects a broad range of stakeholders, including citizens and civil society groups. The procedures and mechanisms in place to capture this feedback on policy implementation must be robust and demonstrate processes of good governance. By contrast, limited or biased inputs from a more limited set of stakeholders will lead to a different analysis of how policy performed, and thus a different conclusion as to the output legitimacy of the policy. In short, output legitimacy is dependent on – and a “deliberative construct” of – throughput legitimacy.

How are we to measure throughput legitimacy in risk regulation? Coming up with a precise set of indicators is difficult, but one might analyse the types of participatory decision-making activities and strategies to support effective collaboration between stakeholders.Footnote 49 There are nonetheless criticisms of “throughput legitimacy” as a term: Steffek Footnote 50 argues that it needs to be seen in the context of an increasing proceduralism in political science and public administration: “Throughput legitimacy attracted so much attention because it is the perfect normative companion to the analytical concept of governance. Governance is procedure, and throughput legitimacy tells us what good procedures are.” Footnote 51 Steffek examines the analytical value of the concept as well as its normative and practical implications, arguing that it may enrich existing typologies of legitimacy but at the same time suffers from “fuzzy borders”. He asserts that “politically, throughput legitimacy lends itself to apologetic uses when it is applied as a tailor-made normative standard for technocratic, non-majoritarian institutions.”

The remainder of this article seeks to employ this framework as a means to analyse recent developments in the Commission’s evaluation cycle, with an emphasis on open public stakeholder consultation, to determine the extent of advances made in transparency and participation. It considers recent developments, improvements and lessons from the evaluation cycle according to Schmidt and Wood’s four features, with a focus on Commission tools and procedures. It does not employ strict indicators but pays heed to the approach taken by Petropoulou and Eliantonio in this special issue, who consider transparency in practical terms of document accessibility and participation in terms of the openness and inclusiveness of consultation.

IV. Analysing throughput legitimacy: procedural developments in the European Commission’s evaluation practice

1. How accountable is the policy evaluation process to citizens?

a. “Evaluate First”

While “smart regulation” was about the whole policy cycle, from the design of a piece of legislation to its implementation, enforcement, evaluation and revision, the Commission has recognised the need to build on the strengths of the impact assessment system for new legislation. With the “Evaluate First” principle has come a legal commitment to evaluate, but also a rationale to do so in order to make policy more efficient and effective, systematically ensuring that all significant proposals are backed up by a robust evaluation of existing EU action; evaluation is included as a core element of all processes such as programming and policy formulation.

The Commission has reinforced its focus on learning by joining up the stages of the policy cycle so that lessons from policy analysis and evaluation feed into policy reformulation. There should be no new legislation – or legislative proposals – until an evaluation of previous legislation has been conducted and the findings considered and used to inform the drafting of a new legal act.Footnote 52 Properly assessing how policy has fared in practice is deemed politically essential. The institutionalisation of an “evaluation cycle” means that there are now several moments foreseen for stakeholder consultation (lasting four to eight weeks). Moreover, the Regulatory Scrutiny Board, which has extensive experience in overseeing impact assessments, now also has competences for legislative evaluation. As a result, the quality of evaluation reports and new legislative proposals is systematically – though not exhaustively – monitored.

b. Introduction of Staff Working Documents

Rather than the Commission tendering out an entire policy evaluation – ex post (retrospective), ex ante (prospective) or interim (ongoing) – it may now only tender out part of it, conducting some evaluation work in house. It is often well positioned to explore questions regarding certain criteria itself (eg “relevance” and “EU added value”). The standard use of Staff Working Documents (SWDs) is arguably a key innovation, incorporating the outsourced ex post evaluation within the final evaluation product. Thus, the SWD is the Commission’s evaluation, usually based on different sources, including work delivered under external contracts. Taking greater control of the process, the Commission also makes strategic use of evaluation findings.

However, one challenge to participation is the barrier to entry for new evaluators with limited EU experience; that is to say, often the same private-sector consultancies (“the usual suspects”) bid for and win the evaluation tenders. The Commission’s use of various types of framework contract helps to reduce the bureaucracy of multiple calls for tender and to speed up delivery. A second challenge, linked to the first, is that the quality of what is delivered has often varied greatly, with the Commission often paying for substandard work and with evaluators taking tried-and-tested approaches to evaluation design without necessarily innovating. Tenders for evaluation studies from all EU institutions are advertised on the official website. Footnote 53 This website also includes tenders for other forms of research: recent examples in the field of risk regulation include a tender from the Joint Research Centre (Seville) on Mapping the Risks and Vulnerabilities in the EU Food Supply Chain Footnote 54 and one from the European Innovation Council and SMEs Executive Agency (EISMEA) on research around “Preventing cyber-theft of trade secrets: awareness toolkit for SMEs”. Footnote 55

Commissioning evaluation is expensive: each question asked comes at a considerable cost given the research and data collection required. The Commission must be sure to ask the right questions. In most cases, the terms of reference (ToR) contain questions for each of the core evaluation criteria. The identification of questions is the task of the Commission’s Interservice Steering Group (ISG), and the explicit use of questions arguably enhances transparency. Without such preliminary work, it is difficult to ensure clear, robust procurement. Commission services know that leaving the identification of questions open to contractors brings uncertainty; the evaluation risks not focusing on the key issues to be addressed. However, the ToR often require the contractor to develop sub-questions that operationalise the core questions. For example, a core question on efficient policy delivery might be broken down into sub-questions concerning costs, benefits, differences across the Member States and explanations for variation. The tender often asks for an evaluation matrix including research questions, evaluation criteria, indicators and data sources. Evaluators are often asked to follow the structure of the SWD in their final reports so that much of the final report can be copied into the relevant sections of the Commission’s official document.Footnote 56

What about the transparency of the outsourcing? Ideally, an SWD should make explicit reference to the outsourced ex post evaluation that informed it – the Commission may draw on the work, using some but not necessarily all of it. Moreover, since the external evaluation is paid for with public money, it should also be stored and made retrievable alongside the accompanying SWD so that we can see the original work and determine which parts draw on Commission work and which on external evaluation. The evaluation is always published by the EU Publications Office. Footnote 57 The link between an SWD and an external report might not be obvious, but the SWD should always refer to the external report in such a way that it can be identified and traced back.

Prior to publication, draft evaluation reports from private contractors (submitted to the Commission) are works in progress, so they are not subject to any public access to documents regulation; that is, the citizen cannot request the text from the Commission – draft reports may be sensitive and, in any case, this would be too much to expect of transparency. By contrast, final evaluation reports are published and are public documents; they rarely remain unpublished unless there is a particular sensitivity to the subject matter. Most often, evaluation reports can be retrieved from the contracting Directorate-General’s (DG) website and the EU Publication Office website. Footnote 58 The question of ownership of the SWD is clear – it is officially a Commission document – but it always contains an annex on process and methodology. Data and methods are key sections in ex post evaluations (and their annexes), but in SWDs this process-orientated information is not always provided. Instead, there tends to be a focus on findings and thus potentially a loss of transparency.

Even in those policy areas where there is greater potential for citizen participation, it is methodologically and procedurally expensive to engage stakeholders through focus groups and interviews. While surveys and questionnaires might be less costly, they often lead to low rates of participation. To speed up the process and give due guidance to service providers, many evaluation tenders published by the Commission are prescriptive in providing a list of relevant policy stakeholders to be contacted during evaluation research. They can also indicate the questions to be asked, although this is not systematic and varies across DGs.

The Commission regularly asks for a thorough stakeholder mapping to be carried out as part of the inception phase of the evaluation. This is documented in a consultation strategy (the mapping justifies the selection of consultation tools and which questions are asked of which stakeholder group). Stakeholders might include known policymakers, citizens and civil society groups, although the evaluator may have some room for manoeuvre to identify other interviewees; however, the evaluator must seek the Commission’s approval – usually as part of the inception report that presents the final approach and methodology for conducting the evaluation. Only when the Commission signs off on the inception report delivered by the contracted evaluator can data gathering and analysis proceed.

2. How transparent are policy evaluation procedures for citizens?

a. Harmonisation of evaluation practice and criteria

The Commission’s efforts to improve legislative quality and transparency began with the Smart Agenda in 2005. Footnote 59 By 2015, the Commission’s Better Regulation policy had developed into a set of tools and guidelines intended to guide, improve, control and harmonise the process of EU policymaking. Better Regulation Footnote 60 created a narrative for good governance and open, inclusive and pluralistic policymaking; implicitly, influence is possible for all stakeholders and citizens. It emphasised the importance of good data, and therein the systematic involvement of a plurality of policy stakeholders to ensure quality input in order to capture the experience of policy on the ground.

The Commission’s Secretariat General has steered the approach to evaluation, now considered to be based on five criteria (effectiveness, efficiency, relevance, coherence and EU added value) and understood as “an evidence-based judgement of the extent to which an existing intervention is effective, efficient, relevant given the current needs, coherent both internally and with other EU intervention, and has achieved EU added value”. Footnote 61 Using the same criteria across policy domains has helped make evaluation more transparent and clearer for evaluating parties, decision-makers, MEPs and citizens.

Previously, “effectiveness” and “efficiency” commonly featured as core evaluation criteria, but across DGs the evaluation design differed. The analysis of the “relevance” of the intervention now stresses future needs that may arise from upcoming changes (technological, social, environmental and economic) or those identified in any strategic foresight analysis. An evaluation should refer to the contribution of policy to the relevant UN Sustainable Development Goals and use their datasets and indicators to assess the performance of the intervention. It should also assess whether the intervention is coherent with the objectives of the European Green Deal and other policies targeting the environment. Footnote 62

Increasing transparency and promoting shared ownership among Commission services have been central to the Better Regulation agenda. Evaluation needs to be open with regard to its design, its intervention logic and the evidence collected to inform any judgement, all of which should be properly described in the evaluation. The evaluation tenders usually come from the thematic DGs. Each has an evaluation/Better Regulation function dealing with the evaluation requirements. The Secretariat-General of the Commission now takes a leading role in this process, offering support, guidance and training to the DGs, who in turn plan and manage the evaluations and implement the EU’s evaluation policy; thereafter, the Secretariat-General provides feedback to the DGs on their draft evaluations (SWDs). An Interservice Group (ISG) is set up immediately after a legislative initiative is validated, providing an opportunity for Commission services to influence decision-making as early as the design phase of the initiative. Footnote 63

b. From “roadmaps” to “calls for evidence”

The Commission has recently changed the documents for consulting the public. Footnote 64 “Roadmaps” (looking ahead and introducing a calendar for the evaluation cycle) and “inception impact assessments” are no longer produced. Instead, all details of the upcoming initiative and its accompanying public consultation are set out in a “call for evidence”. An example from summer 2022 (public consultation in the third quarter of 2022) came with a view to proposing a regulation to establish rules on the marketing and use of high-risk chemicals in order to increase security in the EU by reducing the risk of dangerous chemicals being acquired by terrorists or other criminals to carry out attacks. Footnote 65 More transparency about future Commission evaluations means stakeholders have more foresight and greater opportunity to feed into the evaluation process. However, citizens and stakeholders across all policy areas remain more interested in ex ante impact assessment than ex post evaluation.

On the Have Your Say website,Footnote 66 former roadmaps and inception impact assessments can now be found by filtering by the relevant “stage” (the drop-down list distinguishes between in preparation, call for evidence, public consultation, draft act, Commission adoption, suspended/abandoned) and the relevant “document category” (the drop-down list distinguishes between: call for evidence for an evaluation, call for evidence for an evaluation and impact assessment, call for evidence for an impact assessment, evaluation, opinion on evaluation, opinion on fitness check, opinion on impact assessment, etc.). The aim is to provide more flexibility for Commission services when organising their consultation activities and to reduce consultation fatigue, as revealed in a recent stocktaking exercise. Footnote 67 Recent calls for evidence concerning risk regulation in 2022 include those on cancer screening for high-risk groups, Footnote 68 on ship recycling facilities Footnote 69 and on new cybersecurity rules for digital products and ancillary services. Footnote 70

There have been gradual improvements since the introduction of “Have Your Say”, with the launch of a single entry point in 2017 deemed a “significant milestone”. Footnote 71 July 2020 saw the launch of a newer version with a more intuitive user experience and a markedly improved search function. In line with its Strategy for the Rights of Persons with Disabilities, the Commission has been making the portal more accessible to people with disabilities. Footnote 72 As the Commission asserts, accessibility to virtual environments and to information and communication technologies is an enabler of rights and a prerequisite for the full participation of persons with disabilities on an equal basis with others. Article 10 TFEU underlines that the Union should combat discrimination, including that based on disability, when defining and implementing its policies. Effective policymaking implies “consultation and participation of persons with disabilities and their representative organisations throughout the process and the provision of information about relevant policy initiatives and consultations in accessible formats”. Footnote 73

c. Knowledge storage, data access and evaluation management

Transparency and participation depend on easy access to documents. For draft legislative proposals this would seem to be well covered by the EP’s legislative train website, Footnote 74 while for access to SWDs and ex post evaluations outsourced to third parties this is less straightforward. While there is a Transparency Register Footnote 75 – a centralised portal for collecting input, positions and preferences relevant to the policy formulation stage of the policy cycle – there is to date no centralised “Evaluations Register” (ie a single repository where all publicly funded evaluations can be easily found). Rather, the Commission’s documents and evaluations can be found online via a portal for all general EU publications. Footnote 76 Better knowledge management would arguably ensure greater transparency and encourage citizen engagement in risk regulation.

Finding evaluations already requires a citizen to know how to navigate EU websites. Traditionally, citizens (and professional evaluators) have used the search function of the EU Publication Office website or the section on “evaluations and studies” on individual DG websites. “Have Your Say” is now used to store all planned and completed evaluations, although this has not necessarily been implemented in a systematic way, hindering visibility and limiting readership. The search function is rudimentary, which contrasts with the ease with which special reports (performance audits) of the European Court of Auditors can be accessed and downloaded. Footnote 77 All publications (including evaluations) can at least be filtered according to six themes: Environment, Food and Natural Resources; EU in the World; Functioning of the EU; Health, Wellbeing and Consumer Protection; Infrastructure, Research and Innovation; and Media, Culture and Languages in the EU. Arguably, if the Commission wishes an evaluation not to attract too much attention, it can be stored in a way that requires “drilling down” through several pages in order to locate the original document. Indeed, the 2021 communication on the Better Regulation Agenda called for action in this area. Footnote 78

Another issue at stake when it comes to transparency and the participation of the private sector (commercial stakeholders or natural and legal persons) is that it is difficult for citizens to get an overview of which evaluating bodies (consultants and consortia) are being – often continually – awarded ex post evaluation contracts. This is a delicate issue given the confidential nature of commercial contracts and revenues. Nonetheless, DG Budget manages a website known as the Financial Transparency System, where framework contracts and other awards are listed with information about financial beneficiaries. Footnote 79 Moreover, the Commission’s “Funding and Tender Opportunities Portal” brings together information on all funding opportunities. Footnote 80 Informed citizens can look up evaluations on the EU Publication Office website, where authorship information is listed – not only the name of the consortium that conducted an evaluation, but also usually a list of consortium members. One can see on the websites of the various Commission DGs that each favours certain companies over others, and indeed some may focus on specialised evaluation areas such as transport or environment; their evaluations are often accessible. There are nonetheless twenty to thirty regular consultancies specialised in EU evaluation as a part or core aspect of their work. Some highly reputable international evaluation firms such as Ecorys, Ramboll, Tetra Tech and Technopolis mainly perform EU policy evaluations, while management consultancies and accountancy firms such as Ernst & Young and Deloitte also bid to perform evaluation work alongside their tax and accountancy work. It is important to acknowledge that EU-level contracts require firms that can operate across all twenty-seven EU Member States, work in all official EU languages and understand the national legal frameworks; therefore, the system tends inherently to favour the “big players”.

3. How inclusive and open is the policy evaluation process to citizens?

a. Open public consultation

Within the evaluation process, the Commission has regularly had three main tools at its disposal: OPCs, targeted consultations and consultation for feedback on roadmaps. There is a difference in the number of consultations carried out by the various DGs, which can be explained by the policy area and the Commission’s right to act/legislate. DGs such as Employment (EMPL), European Neighbourhood and Enlargement (NEAR) or Education, Youth, Sport and Culture (EAC) manage a very limited acquis (and have little right to act beyond the Open Method of Coordination) and focus primarily on spending measures. Servicing DGs like Economic and Financial Affairs (ECFIN) will run fewer OPCs than DGs with a heavy legislative agenda. This variation in number and frequency affects the potential for public participation across policy fields. Nonetheless, outside the formal consultation streams, the Commission also receives input through numerous and various channels, including working groups, formal/informal events, voluntary submissions, court cases and infringements, among others.

When the Commission launches an open consultation as part of the evaluation cycle, it announces the evaluation standards, the original legal basis of the policy intervention set to be evaluated and the intervention logic of the course of action, and makes clear the dates for consultation. It does this through the dedicated “Have Your Say” portal, where citizens and businesses are invited to share their views on policies and existing laws. Footnote 81 Thus, the Commission seeks views on laws and policies currently in development, after which it may decide to modify or abandon the initiative. It makes the sharing of feedback easy by encouraging respondents to answer a questionnaire or comment on a legal draft. Besides business interest groups and technical experts, citizens can contribute in any one of the twenty-four official EU languages, signing in using an EU login or social media account. Citizen feedback is published instantly and thus must adhere to feedback rules. This is not the same, however, as responses to OPCs, which are not published straight away (ie the processes behind the consultation mechanisms differ).

The Commission has committed to protecting personal data, analysing and summing up the feedback and contributions received and publishing reports under some initiatives. It seeks to provide transparency regarding the opinions collected and how they should help fine-tune the initiatives. Organisations are encouraged to first register in the EU Transparency Register before sending contributions. As of 10 May 2022, there were some 2,445 legislative initiatives listed on the website, with four stages of the cycle evident in the types of consultation depending on the legislative train: “call for evidence: open”, “public consultation: open”, “draft act: open” and “Commission adoption: open”, each clearly indicating a time period for opinion gathering, the duration of which differs depending on the type of consultation. Multiple channels of stakeholder consultation ensure that all types of stakeholders, from those closely linked to policy and well informed, to those more distant from policymaking, have a chance to participate. Footnote 82 Using multiple consultation mechanisms enables triangulation, ensuring the scientific rigour of evaluation.

b. The procedural limits of public consultation

Traditionally, there have been low response rates to public consultation, with only a few examples of citizens engaging with the exercise in great numbers. The Commission has critically considered whether utilising OPCs has been an efficient use of financial and human resources: OPCs usually require developing a questionnaire, which needs to be translated into all twenty-four official EU languages, and the replies, also potentially in all such languages, must then be uploaded and analysed. Recognising their costly nature and the fact that evaluations often attract little attention, the Commission, in the latest changes to the Better Regulation Agenda, has now removed the obligation for evaluations to involve an OPC.

An example from the field of risk regulation with low turnout but constructive citizen engagement was the 2018 public consultation on the transparency and sustainability of EU risk assessment in the food chain. The political, economic and societal context of food security had evolved, affecting consumers’ perceptions and expectations. The Commission thus sought citizen views and experiences on: the transparency and independence of the EU risk assessment system with respect to the underlying industry studies and information on which the European Food Safety Authority’s (EFSA) risk assessment/scientific advice is based; risk communication; and the governance of EFSA, in particular the involvement of Member States in the EU risk assessment system.

Despite seemingly successful consultation exercises, responses are not necessarily representative of the opinions of the broader range of stakeholders. Footnote 83 OPCs are often used by lobbyists as a further moment in the policy cycle to express their interests (“dump their positions”) on the issue or to sell preferred policy solutions to shape future policy, rather than to engage with questions related to past policy performance. Often, the information given does not answer the questions asked in the consultation. In practice, this can mean huge amounts of text are pasted into the consultation portal, which then creates a significant administrative burden for the Commission. Although responses are acknowledged through an automated system, they are not responded to individually; the responses are analysed and their use is reported on in a synopsis report integrated into the evaluations and impact assessments.

To an extent, openness brings costs, since all text needs to be processed; ideas and opinions might have little policy value or be biased. The Commission can only treat the input as “ideas” and not consider it as serious data or evidence. Nevertheless, the Commission has reasserted its commitment to involve a wider range of EU citizens in policymaking, publicising public consultations to attract more participants and higher-quality contributions. It plans to work more closely with the Committee of the Regions, the European Economic and Social Committee, national authorities, social partners and other representative associations in order to raise awareness of the opportunities to contribute to the Commission’s policymaking, even asking them to help disseminate “calls for evidence” at national and regional levels. The Commission’s Representations in the Member States and the EU Delegations are also expected to support such efforts.

V. The implications of the Commission’s procedural innovations for participation in risk regulation evaluation

The Organisation for Economic Co-operation and Development (OECD) rated the Commission as a top performer on consultation in 2018 and 2021 among a range of global international organisations. Footnote 84 In fact, the EU overall scores higher than any individual OECD country in the composite indicators of stakeholder engagement and ex post evaluation, and it also performs in the top five in terms of regulatory impact assessments. Footnote 85 It would appear to be at the vanguard when it comes to consultation practice. Given improved tools for participation, one might argue, therefore, in favour of increased input legitimacy to the policymaking process – in this case, the process fosters both learning and accountability: the input (stakeholder and citizen opinions and feedback) is focused on the output (policy evaluation) in order to shape the subsequent input (draft legislative proposals).

The Commission has arguably learnt from past experience. The 2021 communication on Better Regulation recognised that contributing to public consultations requires time and resources from those participating. Footnote 86 However, public consultations are often still seen as the only instrument for collecting feedback from the general public (as opposed to targeted instruments for stakeholder groups). The Commission has committed to facilitating input from stakeholders as much as possible and to making sure that the public is only consulted when necessary – although how are we to determine this? The Commission now consults the public only once when revising existing legislation and evaluating spending programmes at mid-term, instead of having separate consultations for the evaluation and the impact assessment. It avoids public consultations on very technical issues of little interest to the general public, where a targeted consultation of stakeholders is arguably a better means of collecting the necessary evidence. Footnote 87

The Better Regulation stocktaking exercise in 2019 showed that public consultation questionnaires were too long and too technical and did not strike the right balance between open questions allowing for substantive replies and closed questions requiring yes/no answers. Footnote 88 The Commission has committed to improving the structure, content and language of these questionnaires. According to the 2019 stocktaking exercise, “nearly 40% of the respondents to the public consultation were (very) dissatisfied with the way the Commission reports on the result of its public consultations and feedback, and what it does with this information”. Footnote 89

The REFIT Platform Footnote 90 also asked for more transparency in the feedback provided, a request that was supported by the European Court of Auditors. The auditors recommended that the Commission improve the way it reacts to the evidence it has collected through its consultation activities. Footnote 91 Overall, in its examination of the way the Commission prepared and conducted a selection of public consultations and how it made use of the consultation work, the European Court of Auditors found public consultation to be of a high standard but indicated that improvements could be made in terms of outreach activities. It also recognised weaknesses in data processing and shortcomings in data analysis. Footnote 92 More and more, we see the burden shifting from the Commission to the contractors conducting the evaluation in the framework of which the public consultation takes place. Increasingly, evaluators are asked to include a separate section analysing the contributions to the public consultation in the synopsis report – a section that the Commission can then use to show how the contributions have been processed and used (or not). It must publish the synopsis report as an annex to evaluations and impact assessments.

The Commission has realised the need to boost awareness of “Have Your Say” and to encourage more people, including citizens without in-depth knowledge of EU policymaking or of the relevant scientific disciplines, to contribute to “calls for evidence” (the replacement for roadmaps). This would seem to have implications for risk regulation, since it actively encourages non-expert input. Footnote 93 Open calls in September 2022 included several issues relevant to risk regulation: energy labelling of mobile phones and their environmental impacts; measures to quantify transport emissions; train driver certification schemes; and amending legislation to include new precursors used for illicit drug manufacturing. How can we expect to obtain meaningful evidence-based input from citizens when most will have little knowledge of such highly technical issues?

Nonetheless, the Commission has sought to better engage the scientific research community, encouraging it to submit relevant scientific research at the beginning of the process. It has also looked to the Conference on the Future of Europe as “an excellent opportunity to debate with citizens how to address Europe’s challenges and priorities. The Conference’s online deliberative platform is a new approach to engage with people on issues that they care about.” Footnote 94

The reality of open consultation raises questions about its costs and benefits and about why it is so widely used as a policymaking tool: while it might signal openness and contribute to accountability, the potential for constructive input and policy learning may be limited in risk regulation. Another challenge is “consultation fatigue”, whereby there are so many moments for participation in the policymaking process that stakeholders become overwhelmed or lack the expertise or capacity to respond. As argued in the introduction to this special issue, Footnote 95 civil society is not immune to the influence of post-factual narratives, nor to the increasing contestation of expertise as an impartial source of knowledge, which participatory mechanisms can further amplify, especially when the consultation process is captured by a particular group. Complex decisions involving a degree of scientific uncertainty, such as the authorisation of vaccines or of pesticides, have often been arenas for controversy and contestation of EU decision-making.

For example, Directive 2009/128/EC establishes a framework to achieve the sustainable use of pesticides that are plant protection products by reducing the risks and impacts of pesticide use on human health and the environment and by promoting the use of integrated pest management and of alternative approaches or techniques, such as non-chemical alternatives to pesticides. Footnote 96 However, a recent ex post evaluation by the Commission shows that the Directive’s implementation has only been “moderately effective” overall in achieving this objective. Footnote 97 Nonetheless, feedback from the online public consultation, Footnote 98 which was open to the general public in January–April 2021, indicated that the Directive might have improved pesticide users’ behaviour when, for example, disposing of empty pesticide containers (by rinsing and sending them to a collection centre for empty pesticide packaging) and in their wearing of gloves and/or facemasks when handling pesticides. Feedback received via targeted surveys was mixed: pesticide users agreed that measures to ensure the appropriate storage, handling, dilution and disposal of pesticides have been implemented at both EU and national levels, but non-governmental organisations, consumer organisations and civil society groups believed that the implementation of this requirement was either limited or absent.

The above example raises the question of potential bias in public responses. In short, the issue of “narratives” is particularly relevant to ex post evaluation, which effectively tells a story of policy performance, drawing on qualitative and quantitative data. It attempts to generalise from limited research (selective data) to suggest how the policy overall has fared in practice. Of course, this is also relevant to ex ante evaluations and impact assessments (ie the construction of policy options and their assessment against criteria such as the economy and the environment).

VI. Conclusions

From a normative perspective, recent procedural developments in the Commission’s evaluation practice have enhanced the throughput legitimacy of EU governance by bringing improvements to accountability, participation, inclusiveness and openness. There have been a number of trends in the institutionalised tools and processes supporting evaluation: digitisation, harmonisation, professionalisation and institutionalisation. Footnote 99 By extension, evaluation findings and any notion of what the EU has delivered (output legitimacy) can be considered more reliable and accurate owing to the broader and more inclusive nature of participation in the evaluation cycle.

The Commission has learnt from its own consultation exercises, realising the challenges and risks of open consultation. While favouring more targeted consultation, it has also taken measures to broaden participation and to improve access for all citizens. Securing the participation of a broad range of stakeholders in the evaluation cycle and being obliged to “evaluate first” have arguably reinforced the efficiency and effectiveness of EU legislation (and its policies); as such, the Commission’s procedural innovations have resulted in more responsive evaluation. In risk regulation, there are particular challenges given the technical nature and complexity of policy and legislation in areas where it is difficult to secure citizen participation in consultation exercises.

The Commission would seem to be procedurally transparent, but the inclusive practices that support participatory democracy need further promotion. Challenges remain when it comes to ensuring the inclusiveness and openness of evaluation and consultation processes. Arguably, there is a greater need for proactive transparency: promoting and educating citizens on the tools and procedures in place and providing foresight on upcoming evaluation exercises so as to encourage participation. This is particularly the case in areas such as financial and capital markets, the environment and climate change, and public health, where citizens need to be better informed about complex policy issues and about the means by which they can play an active role in EU governance.

In the field of risk regulation there may, however, be inherent limits to throughput legitimacy, partly due to procedural constraints but more owing to issues of complexity and the highly technical nature of EU policies. There may also be more fundamental limits on (or objections to) throughput legitimacy in risk regulation, particularly in scientific and technological areas, insofar as it may be debatable whether citizens’ views should be listened to over scientific expertise. While the Commission has made considerable efforts to engage citizens in aspects of its “evaluation cycle” through various administrative reforms and the introduction of formalised consultation processes, levels of active engagement often remain low. These limitations on information gathering and feedback mean that the picture painted of policy performance rests disproportionately on the data and opinions of industry and better-resourced stakeholders.

Acknowledgments

My thanks to the special issue editors for their guidance, the article reviewers and workshop participants, as well as Anne-Claire Marangoni (Tetra Tech) and Miroslava Janda (European Commission) and faculty colleagues Francesca Colli and Elissaveta Radulova for their valuable comments and suggestions.

References

1 European Commission, “Communication from the Commission to the European Parliament and Council, the European Economic and Social Committee and the Committee of the Regions: Strengthening the foundations of Smart Regulation – improving evaluation”, 2 October 2013, COM(2013) 686.

2 VE Schmidt, “Democracy and Legitimacy in the European Union Revisited: Input, Output and ‘Throughput’” (2013) 61(1) Political Studies 2–22.

3 On the interconnections between and definitions of openness, transparency and participation, see A Alemanno, “Unpacking the Principle of Openness in EU Law: Transparency, Participation and Democracy” (2014) 39(1) European Law Review 72–90. See also D Curtin and J Mendes, “Transparence et participation: des principes démocratiques pour l’administration de l’Union Européenne” (2011) 137–38 Revue française d’administration publique 101–21; J Mendes, Participation in EU Rule-Making: A Rights-Based Approach (Oxford, Oxford University Press 2011).

4 See, inter alia, H Hofmann and P Leino-Sandberg, “An Agenda for Transparency in the EU” (European Law Blog, 2019) <https://europeanlawblog.eu/2019/10/23/an-agenda-for-transparency-in-the-eu/>.

5 See A Volpato, M Eliantonio and K Wright, introduction to this special issue.

6 C Braun and M Busuioc, “Stakeholder engagement as a conduit for regulatory legitimacy?” (2020) 27(11) Journal of European Public Policy 1599–611.

7 V Schmidt and M Wood, “Conceptualizing throughput legitimacy: Procedural mechanisms of accountability, transparency, inclusiveness and openness in EU governance” (2019) 97 Public Administration 727–40, 728.

8 M Weimer, Risk Regulation in the Internal Market: Lessons from Agricultural Biotechnology (Oxford, Oxford University Press 2019).

9 See, inter alia, ibid; T Palonitty and M Eliantonio, “Scientific knowledge in environmental judicial review: Safeguarding effective judicial protection in the EU Member States” (2018) 27(4) European Energy and Environmental Law Review 108–14; A Alemanno, The Shaping of European Risk Regulation by Community Courts (New York, New York University School of Law 2008).

10 See, inter alia, C Joerges, K Ladeur and E Vos, Integrating Scientific Expertise into Regulatory Decision-Making: National Traditions and European Innovations (Baden-Baden, Nomos 1997).

11 Different, for instance, from US scholarship; see E Hammond, “Public Participation in Risk Regulation: The Flaws of Formality” (2016) 1 Utah Law Review 169–92; TO McGarity, “Public Participation in Risk Regulation” (1990) 1 RISK 103–30.

12 A Offermans and A Volpato, “Lessons for Participation from an Interdisciplinary Law and Sustainability Science Approach: The Reform of the Sustainable Use of Pesticides Directive” (2023) European Journal of Risk Regulation, this special issue.

13 Supra, note 11.

14 M Alkin and J King, “The Historical Development of Evaluation Use” (2016) 37(4) Evaluation 568–79.

15 S Borrás and S Højlund, “Evaluation and policy learning: the learner’s perspective” (2015) 54 European Journal of Political Research 99–120.

16 TA Abma, “The Practice and Politics of Responsive Evaluation” (2006) 27(1) American Journal of Evaluation 31–43.

17 ibid, p 32.

18 A Teasdale and T Bainbridge, “Europe of Results” (The Penguin Companion to the European Union, 2012) <https://penguincompaniontoeu.com/additional_entries/europe-of-results/>.

19 Schmidt, supra, note 2.

20 ibid.

21 ibid.

22 European Commission, “Better Regulation: Joining forces to make better laws” (European Commission Website, 2022) <https://ec.europa.eu/info/sites/default/files/better_regulation_joining_forces_to_make_better_laws_en_0.pdf>.

23 J Beyers and S Arras, “Stakeholder consultations and the legitimacy of regulatory decision-making: A survey experiment in Belgium” (2021) 15(3) Regulation & Governance 877–93; A Binderkrantz, J Blom-Hansen, M Baekgaard and S Serritzlew, “Stakeholder consultations in the EU Commission: instruments of involvement or legitimacy?” (2022) Journal of European Public Policy, DOI: 10.1080/13501763.2022.2058066; Braun and Busuioc, supra, note 6.

24 A Bunea, “Designing stakeholder consultations: Reinforcing or alleviating bias in the European Union system of governance?” (2017) 56(1) European Journal of Political Research 46–69; AS Binderkrantz, J Blom-Hansen and R Senninger, “Countering Bias? The EU Commission’s Consultation with Interest Groups” (2021) 28(4) Journal of European Public Policy 469–88.

25 A Deligiaouri and J Suiter, “Evaluation of public consultations and citizens’ participation in 2015 Better Regulation Agenda of the EU and the need for a deliberative e-rulemaking initiative in the EU” (2021) 22(1) European Politics and Society 69–87.

26 E Lironi and D Peta, “EU public consultations in the digital age: Enhancing the role of the EESC and civil society organisations” (2021) Study prepared for the European Economic and Social Committee (EESC) <https://www.eesc.europa.eu/sites/default/files/files/qe-07-17-001-en-n.pdf>.

27 European Court of Auditors, “‘Have Your Say!’: Commission’s Public Consultations engage Citizens, but fall short on Outreach Activities.” Special Report of the European Court of Auditors (Luxembourg, Publications Office of the European Union 2019) <https://www.eca.europa.eu/Lists/ECADocuments/SR19_14/SR_Public_participation_EN.pdf>.

28 Braun and Busuioc, supra, note 6.

29 S Blockmans and S Russack (eds), Deliberative Democracy in the EU: Countering Populism with Participation and Debate (Brussels, CEPS Rowman & Littlefield International 2020).

30 O Notte and D Salles, “Involving Citizens as Water Policy Monitors: The European Water Framework Directive Consultation Process in Adour-Garonne” (2011) 33 Politique Européenne 37–62.

31 Conference on the Future of Europe, “About the Conference” (FutureEU, 2022) <https://futureu.europa.eu/?locale=en>.

32 E Radulova, A Nastase and J Juntson, “Interest Aggregation in the Policy-Shaping Stage of EU Decision-Making: An Exploration into the Commission’s Proclivity to Outsource the Analysis of the Collected Input from Public Consultations.” Conference paper presented at the annual European Consortium for Political Research conference, Wroclaw, Poland, September 2019.

33 P Dąbrowska, “Civil Society Involvement in the EU Regulations on GMOs: From the Design of a Participatory Garden to Growing Trees of European Public Debate” (2007) 3(3) Journal of Civil Society 287–304.

34 T Persson, “Democratizing European Chemicals Policy: Do Consultations Favour Civil Society Participation?” (2007) 3(3) Journal of Civil Society 223–38.

35 Schmidt, supra, note 2; Schmidt and Wood, supra, note 7.

36 J Tallberg and M Zürn, “The legitimacy and legitimation of international organizations: introduction and framework” (2019) 14(4) Review of International Organizations 581–606.

37 Schmidt and Wood, supra, note 7, 729.

38 Schmidt, supra, note 2.

39 Schmidt and Wood, supra, note 7.

40 J Steffek, “The limits of proceduralism: Critical remarks on the rise of ‘throughput legitimacy’” (2018) 97 Public Administration 785.

41 J Waldron, “The concept and the rule of law” (2008) 43 Georgia Law Review 1–61, cited in Steffek, supra, note 40.

42 M Bovens, T Schillemans and P ’t Hart, “Does public accountability work? An assessment tool” (2008) 86 Public Administration 225–42.

43 M Bovens, T Schillemans and R Goodin, “Public Accountability” in M Bovens, R Goodin and T Schillemans (eds), The Oxford Handbook of Public Accountability (Oxford, Oxford University Press 2014) pp 1–26; Schmidt and Wood, supra, note 7, 731.

44 Schmidt and Wood, supra, note 7, 732; A Wille, “The European Commission’s accountability paradox” in M Bovens, D Curtin and P ’t Hart (eds), The Real World of EU Accountability (Oxford, Oxford University Press 2010) pp 63–86.

45 Schmidt and Wood, supra, note 7.

46 Schmidt, supra, note 2.

47 A Heritier, “Composite democracy in Europe. The role of transparency and access to information” (2003) 10 Journal of European Public Policy 814–33.

48 C Pateman, “Participatory democracy revisited” (2012) 10 Perspectives on Politics 7–19.

49 V Caby and L Frehen, “How to Produce and Measure Throughput Legitimacy? Lessons from a Systematic Literature Review” (2021) 9(1) Politics and Governance 226–36.

50 Steffek, supra, note 40.

51 ibid, 784.

52 European Commission, supra, note 22.

53 Ted eTendering, “eTendering Home” (Ted eTendering Website, 2022) <https://etendering.ted.europa.eu/general/page.html?name=home>.

54 ibid.

55 Publications Office of the European Union, “Home” (Publication Office of the European Union Website, 2022) <https://op.europa.eu/en/home>.

56 Interview with evaluator, July 2022.

57 ibid.

58 Publications Office of the European Union, supra, note 55.

59 S Smismans, “Policy Evaluation in the EU: The Challenges of Linking Ex Ante and Ex Post Appraisal” (2015) 6(1) European Journal of Risk Regulation 6–26.

60 European Commission, “Better Regulation: why and how” (European Commission Website, 2022), <https://ec.europa.eu/info/law/law-making-process/planning-and-proposing-law/better-regulation-why-and-how_en>.

61 European Commission, “Commission Staff Working Document: Better Regulation Guidelines”, 7 July 2017, SWD(2017) 350 final.

62 European Institute of Public Administration (EIPA), “Revision of Better Regulation Guidelines & Toolbox: What’s Changed?” (EIPA Website, 2022) <https://www.eipa.eu/blog/revision-of-better-regulation-guidelines-toolbox-whats-changed/>.

63 ibid; European Commission, “Consultations” (European Commission Website, 2021) <https://ec.europa.eu/info/consultations>.

64 European Commission, “Planned Evaluations” (European Commission Website, 2022) <https://ec.europa.eu/info/planned-evaluations_en>; European Commission, “Better Regulation: Roadmap/inception impact assessment” (European Commission Website, 2017) <http://ec.europa.eu/smart-regulation/roadmaps/index_en.htm>.

65 European Commission, “Fifth Progress Report on the Implementation of the EU Security Union Strategy” (European Commission Website, 2022) <https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52022DC0745>.

67 European Commission, “EU Have Your Say: Taking Stock of the Commission’s Better Regulation Agenda” (European Commission Website, 2019) <https://ec.europa.eu/info/sites/default/files/better-regulation-taking-stock-swd_en_0.pdf>.

68 EASL, “EASL’s response to the European Commission call for evidence on cancer screening” (EASL Website, 2022) <https://easl.eu/publication/easls-response-to-the-european-commission-call-for-evidence-on-cancer-screening/>.

69 European Commission, “Ship recycling – European list of ship recycling facilities (10th update)” (European Commission Website, 2022) <https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13475-Ship-recycling-European-list-of-ship-recycling-facilities-10th-update-_en>.

70 European Commission, “Cyber Resilience Act – new cybersecurity rules for digital products and services” (European Commission Website, 2022) <https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13410-Cyber-resilience-act-new-cybersecurity-rules-for-digital-products-and-ancillary-services_en>.

71 EIPA, supra, note 62.

72 European Commission, “Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Union of Equality: Strategy for the Rights of Persons with Disabilities 2021–2030”, 3 March 2021, COM(2021) 101 final.

73 European Commission, “Shaping Europe’s digital future: Audiovisual and Media Services” (European Commission Website, 2022) <https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52021DC0101&from=EN>.

74 European Parliament, “Legislative train schedule” (European Parliament Website, 2022) <https://www.europarl.europa.eu/legislative-train/>.

75 Europa.EU, “Transparency Register: Home” (Europa.EU Website, 2022) <https://ec.europa.eu/budget/financial-transparency-system/analysis.html>.

76 Publications Office of the European Union, “EU Publications” (Publications Office of the European Union, 2022) <https://op.europa.eu/en/web/general-publications>.

77 European Court of Auditors, “Search Publications” (European Court of Auditors Website, 2022) <https://www.eca.europa.eu/en/Pages/PublicationSearch.aspx>.

78 European Commission, supra, note 22. The 2021 communication reads: “The Commission as well as the European Parliament and the Council have various databases in which they collect the evidence used in the course of the legislative process. A joint effort to create a common evidence register, the Joint Legislative Portal, would provide anyone interested in EU policymaking with easy access to all the evidence underpinning a given policy initiative. Improved cooperation on a common register would integrate different efforts and allow more effective communication between policymakers at EU and national level.”

79 European Commission, “Financial Transparency System” (European Commission Website, 2022) <https://ec.europa.eu/budget/financial-transparency-system/>.

80 European Commission, “Funding & Tender Opportunities: Single Electronic Data Interchange Area (SEDIA)” (European Commission Website, 2022) <https://ec.europa.eu/info/funding-tenders/opportunities/portal/screen/home>.

81 European Commission, “Have Your Say” (European Commission Website, 2022) <https://ec.europa.eu/info/law/better-regulation/have-your-say_en>.

82 European Commission, “Chapter VII: Guidelines on Stakeholder Consultation” (European Commission Website, 2022) <https://ec.europa.eu/info/sites/default/files/better-regulation-guidelines-stakeholder-consultation.pdf>.

83 On public participation in the greening of the Farm-to-Fork policy and the case of pesticides reduction, see O Ammann and A Bousat, “The Participation of Civil Society in European Union Environmental Law-Making Processes: A Critical Assessment of the European Commission’s Consultations in Connection with the European Climate Law” (2023) European Journal of Risk Regulation, this special issue.

84 OECD, “Better Regulation Practices across the European Union 2022” (OECD Website, 2022) <https://www.oecd.org/publications/better-regulation-practices-across-the-european-union-2022-6e4b095d-en.htm>.

85 ibid.

86 European Commission, “Questions and Answers on the Better Regulation Communication” (European Commission Website, 2021) <https://ec.europa.eu/commission/presscorner/detail/en/qanda_21_1902>.

87 European Court of Auditors, supra, note 27.

88 ibid.

89 ibid.

91 European Court of Auditors, supra, note 27.

92 ibid.

93 European Commission, supra, note 22; European Commission, “Drafting of Roadmaps, Evaluation Roadmaps and Inception Impact Assessments” (European Commission Website, 2022) <https://ec.europa.eu/info/sites/default/files/file_import/better-regulation-toolbox-7_en_0.pdf>.

94 ibid.

95 Volpato et al, supra, note 5.

96 European Parliament, “Revision of Directive 2009/128/EC on the sustainable use of pesticides” (Briefing: Implementation Appraisal, 2022) <https://www.europarl.europa.eu/RegData/etudes/BRIE/2022/730353/EPRS_BRI(2022)730353_EN.pdf>.

97 European Commission, “Study supporting the evaluation of Directive 2009/128/EC on the sustainable use of pesticides and impact assessment of its possible revision” (Final Evaluation Report, October 2021) <https://food.ec.europa.eu/system/files/2022-06/pesticides_sud_eval_2022_eval_report.pdf>.

98 European Commission, “Pesticides – sustainable use (updated EU rules)” (European Commission Website, 2021) <https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12413-Sustainable-use-of-pesticides-revision-of-the-EU-rules/public-consultation_en>.

99 This article has not addressed a number of other developments, such as the establishment of the Regulatory Scrutiny Board and the introduction of Fitness Checks or REFIT.