
26 - Towards Adaptive Governance in Big Data Health Research

Implementing Regulatory Principles

from Section IIA - Private and Public Dimensions of Health Research Regulation

Published online by Cambridge University Press:  09 June 2021

Graeme Laurie, University of Edinburgh
Edward Dove, University of Edinburgh
Agomoni Ganguli-Mitra, University of Edinburgh
Catriona McMillan, University of Edinburgh
Emily Postan, University of Edinburgh
Nayha Sethi, University of Edinburgh
Annie Sorbie, University of Edinburgh

Summary

In recent times, biomedical research has begun to tap into larger-than-ever collections of different data types. As a consequence, the notion of health data – data that are of relevance for health-related research or for clinical purposes – is expanding to include a variety of non-clinical data, as well as data provided by research participants themselves through commercially available products such as smartphones and fitness bands. To date, most scholarship and policy on these issues has focused on privacy and data protection. Less attention has been paid to addressing other aspects of the wicked challenges posed by Big Data health research and even less work has been geared towards the development of novel governance frameworks. In this chapter, we make the case for adaptive and principle-based governance of Big Data research. We outline six principles of adaptive governance for Big Data research and propose key factors for their implementation into effective governance structures and processes.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2021
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC 4.0 (https://creativecommons.org/cclicenses/).

26.1 Introduction

In recent times, biomedical research has begun to tap into larger-than-ever collections of different data types. Such data include medical history, family history, genetic and epigenetic data, information about lifestyle, dietary habits, shopping habits, data about one’s dwelling environment, socio-economic status, level of education, employment and so on. As a consequence, the notion of health data – data that are of relevance for health-related research or for clinical purposes – is expanding to include a variety of non-clinical data, as well as data provided by research participants themselves through commercially available products such as smartphones and fitness bands.Footnote 1 Precision medicine that pools together genomic, environmental and lifestyle data represents a prominent example of how data integration can drive both fundamental and translational research in important domains such as oncology.Footnote 2 All of this requires the collection, storage, analysis and distribution of massive amounts of personal information, as well as the use of state-of-the-art data analytics tools to uncover new disease-related patterns.

To date, most scholarship and policy on these issues has focused on privacy and data protection. Less attention has been paid to addressing other aspects of the wicked challenges posed by Big Data health research and even less work has been geared towards the development of novel governance frameworks.

In this chapter, we make the case for adaptive and principle-based governance of Big Data research. We outline six principles of adaptive governance for Big Data research and propose key factors for their implementation into effective governance structures and processes.

26.2 The Case for Adaptive Principles of Governance in Big Data Research

For present purposes, the term ‘governance’ alludes to a democratisation of administrative decision-making and policy-making or, to use the words of sociologist Anthony Giddens, to ‘a process of deepening and widening of democracy [in which] government can act in partnership with agencies in civil society to foster community renewal and development.’Footnote 3

Regulatory literature over the last two decades has formalised a number of approaches to governance that seem to address some of the defining characteristics of Big Data health research. In particular, adaptive governance and principles-based regulation appear well-suited to tackle three specific features of Big Data research, namely: (1) the evolving, and thus hardly predictable nature of the data ecosystem in Big Data health research – including the fast-paced development of new data analysis techniques; (2) the polycentric character of the actor network of Big Data and the absence of a single centre of regulation; and (3) the fact that most of these actors do not currently share a common regulatory culture and are driven by unaligned values and visions.Footnote 4

Adaptive governance is based on the idea that – in the presence of uncertainty, lack of evidence and evolving, dynamic phenomena – governance should be able to adapt to the changing conditions of the phenomenon that it seeks to govern. Key attributes of adaptive governance are the inclusion of multiple stakeholders in governance design,Footnote 5 collaboration between regulating and regulated actors,Footnote 6 the incremental and planned incorporation of evidence in governance solutionsFootnote 7 and openness to coping with uncertainties through social learning.Footnote 8 This is attained by planning evidence collection and policy revision rounds in order to refine the fit between governance and public expectations; by distributing regulatory tasks across a variety of actors (polycentricity); by designing partially overlapping competences for different actors (redundancy); and by increasing participation in policy and management decisions by otherwise neglected social groups. Adaptive governance thus seems to adequately reflect the current state of Big Data health research as captured by the three characteristics outlined above. Moreover, social learning – a key feature of adaptive governance – can help explore areas of overlapping consensus even in a fragmented actor network like the one that constitutes Big Data research.

Principles-based regulation (PBR) is a governance approach that emerged in the 1990s to cope with the expansion of the financial services industry. Just as Big Data research is driven by technological innovation, financial technologies (the so-called fintech industry) have played a disruptive role for the entire financial sector.Footnote 9 Unpredictability, the accrual of new stakeholders and a lack of regulatory standards and best practices characterise this phenomenon. To respond to this, regulators such as the UK Financial Services Authority (FSA), backed up by a number of academic supporters of ‘new governance’ approaches,Footnote 10 have proposed principles-based regulation as a viable governance model.Footnote 11 In this model, regulation and oversight rely on broadly stated principles that reflect regulators’ orientations, values and priorities. Moreover, implementation of the principles is not entirely delegated to specified rules and procedures. Rather, PBR relies on regulated actors to set up mechanisms to comply with the principles.Footnote 12 Principles are usually supplemented by guidance, white papers and other policies and processes to channel the compliance efforts of regulated entities. On PBR, see further Sethi, Chapter 17, this volume.

We contend that PBR is helpful for setting up Big Data governance in the research space because it is explicitly focussed on the creation of some form of normative alignment between the regulator and the regulated; it creates conditions that can foster the emergence of shared values among different regulated stakeholders. Since compliance is rooted neither in box-ticking nor in adherence to precisely specified rules, PBR stimulates experimentation with a number of different oversight mechanisms. This bottom-up approach allows stakeholders to explore a wide range of activities and structures to align with regulatory principles, favouring the selection of more cost-efficient and proportionate mechanisms. Big Data health research faces exactly this need to create stakeholder alignment and to cope with the wide latitude of regulatory attitudes that is to be expected in an innovative domain with multiple newcomers.

The governance model that we propose below relies on both adaptive governance – for its capacity to remain flexible in the face of future evolutions of the field – and PBR – for its emphasis on principles as sources of normative guidance for different stakeholders.

26.3 A Framework to Develop Systemic Oversight

The framework we propose below provides guidance to actors that have a role in the shaping and management of research employing Big Data; it draws inspiration from the above-listed features of adaptive governance. Moreover, it aligns with PBR in that it offers guidance to stakeholders and decision-makers engaged at various levels in the governance of Big Data health research. As we have argued elsewhere, our framework will facilitate the emergence of systemic oversight functions for the governance of Big Data health research.Footnote 13 The development of systemic oversight relies on six high-order principles aimed at reducing the effects of a fragmented governance landscape and at channelling governance decisions – through both structures and processes – towards an ethically defensible common ground. These six principles do not predefine which specific governance structures and processes should be put in place – hence the caveat that they represent high-order guides. Rather, they highlight governance features that should be taken into account in the design of structures and processes for Big Data health research. Equally, our framework is not intended as a purpose-neutral approach to governance. Quite the contrary: the six principles we advance do indeed possess a normative character in that they endorse valuable states of affairs that should occur as a result of appropriate and effective governance. By the same token, our framework suggests that action should be taken in order to avoid certain kinds of risks that will most likely occur if left unattended. In this section, we will illustrate the six principles of systemic oversight – adaptivity, flexibility, monitoring, responsiveness, reflexivity and inclusiveness – while the following section deals with the effective interpretation and implementation of such principles in terms of both structures and processes.

Adaptivity: adaptivity is the capacity of governance structures and processes to ensure proper management of new forms of data as they are incorporated into health research practices. Adaptivity, as presented here, has also been discussed as a condition for resilience, that is, for the capacity of any given system to ‘absorb disturbances and reorganize while undergoing change so as to still retain essentially the same function, structure, identity and feedbacks.’Footnote 14 This feature is crucial in the case of a rapidly evolving field – like Big Data research – whose future shape, as a consequence, is hard to anticipate.

Flexibility: flexibility refers to the capacity to treat different data types according to their actual use rather than their source alone. Novel analytic capacities are destabilising existing data taxonomies, rapidly rendering the regulatory categories constructed around them obsolete. Flexibility means, therefore, recognising the impact of technical novelties and, at a minimum, giving due consideration to their potential consequences.

Monitoring: risk minimisation is a crucial aim of research ethics. With the possible exception of highly experimental procedures, the spectrum of physical and psychological harms due to participation in health research is fairly straightforward to anticipate. In the evolving health data ecosystem described so far, however, it is difficult to anticipate upfront what harms and vulnerabilities research subjects may encounter due to their participation in Big Data health research. This therefore requires on-going monitoring.

Responsiveness: despite efforts in monitoring emerging vulnerabilities, risks can always materialise. In Big Data health research, privacy breaches are a case in point. Once personal data are exposed, privacy is lost. No direct remedy exists to re-establish the privacy conditions that were in place before the violation. Responsiveness therefore prescribes that measures are put in place to at least reduce the impact of such violations on the rights, interests and well-being of research participants.

Reflexivity: it is well known that certain health-related characteristics cluster in specific human groups, such as populations, ethnic groups, families and socio-economic strata. Big Data is pushing the classificatory power of research to the next level, with potentially worrisome implications. The classificatory assumptions that drive the use of rapidly evolving data-mining capacities need to be put under careful scrutiny as to their plausibility, appropriateness and consequences. Failing to do so will result in harm to the human groups affected by those assumptions. What is more, public support for, as well as trust in, scientific research may be jeopardised by the reputational effects that can arise if reflexivity and scrutiny are not maintained.

Inclusiveness: the last component of systemic oversight closely resonates with one of the key features of adaptive governance, that is, the need to include all relevant parties in the governance process. As more diverse data sources are aggregated, it becomes increasingly difficult for research participants to exert meaningful control over the expanding cloud of personal data implicated by their participation.Footnote 15 Experimenting with new forms of democratic engagement is therefore imperative for a field that depends on resources provided by participants (i.e. data), but that, at the same time, can no longer anticipate how such resources will be employed, how they will be analysed and with what consequences. See further Burgess, Chapter 25, this volume.

These six principles can be arranged to form the acronym AFIRRM: our model framework for the governance of Big Data health research.

26.4 Big Data Health Research: Implementing Effective Governance

While there is no universal definition of the notion of effective governance, in most cases it alludes to an alignment between purposes and outcomes, reached through processes that fulfil constituents’ expectations and that project legitimacy and trust onto the involved actors.Footnote 16 This understanding of effective governance fits well with our domain of interest: Big Data health research. In the remainder of this chapter, drawing on literature on the implementation of adaptive governance and PBR, we discuss key issues to be taken into account in trying to derive effective governance structures and oversight mechanisms from the AFIRRM principles.

The AFIRRM framework endorses the use of principles as high-level articulations of what is to be expected of regulatory mechanisms for the governance of Big Data health research. Unlike the use of PBR in financial markets, where a single regulator expects compliance, PBR in the Big Data context responds to the reality that governance functions are distributed among a plethora of actors, such as ethics review committees, data controllers, privacy commissioners and access committees. PBR within the AFIRRM framework offers a blueprint for such a diverse array of governance actors to create new structures and processes to cope with the specific ethical and legal issues raised by the use of Big Data. Such principles thus have a generative function within a governance landscape that is still in the process of being created to govern those issues.

The key advantage of principles in this respect is that they require making the reasons behind regulation visible to all interested parties, including publics. This amounts to an exercise of public accountability that can bring about normative coherence among actors with different starting assumptions. The AFIRRM principles stimulate a bottom-up exploration of the values at stake and of how compliance with existing legal requirements will be met. In this sense, the AFIRRM principles perform a formal rather than a substantive function, precisely because we assume that the substantive ethical and legal aims of regulation already developed in health research – such as the protection of research participants from the risk of harm – hold true also for research employing Big Data. What the AFIRRM principles do is provide a starting point for deliberation and action that respects existing ethical standards and complies with pre-existing legal rules.

The AFIRRM principles do not envision that actors in the space of Big Data research will self-regulate, but they do presuppose trust between regulators and regulated entities: regulators need to be confident that regulated entities will do their best to give effect to the principles in good faith. While some of the interests at stake in Big Data health research might be in tension – such as the interest of researchers in accessing and distributing data, and the interest of data donors in controlling what their personal data are used for – it is to the advantage of all interested parties to begin with conversations based on core agreed principles, since this is how governance structures and processes that are efficient and that meet stakeholders’ expectations can be developed. Practically, this requires all relevant stakeholders to have a say in the development and operationalisation of the principles at stake.

Adaptive governance scholarship has identified typical impediments to effective operationalisation of adaptive mechanisms. A 2012 literature review of adaptive governance, network management and institutional analysis identified three key challenges to the effective implementation of adaptive governance: ill-defined purposes and objectives, unclear governance context and lack of evidence in support of blueprint solutions.Footnote 17

Let us briefly illustrate each of these challenges and explain how systemic oversight tries to avoid them. In the shift that has occurred over the last three decades from centralised forms of administration and decision-making to less formalised and more distributed governance networks,Footnote 18 the identification of governance objectives is no longer straightforward. This difficulty may also be due to the potentially conflicting values of different actors in the governance ecosystem. In this respect, systemic oversight has the advantage of not being normatively neutral. The six principles of systemic oversight determinedly aim at fostering an ethical common ground for a variety of governance actors and activities in the space of Big Data research. What underpins the framework, therefore, is a view of what requires ethical attention in this rapidly evolving field, and of how to prioritise actions accordingly. In this way, systemic oversight can provide orientation for a diverse array of governance actors (structures) and mechanisms (processes), all of which are supposed to produce an effective system of safeguards around activities in this domain. Our framework directs attention to critical features of Big Data research and promotes a distributed form of accountability that will, where possible, emerge spontaneously from the different operationalisations of its components. The six components of systemic oversight, therefore, suggest what is important to take into account when considering how to adapt the composition, mandate, operations and scope of oversight bodies in the field of Big Data research.

The second challenge to effective adaptive governance – unclear governance context – refers to the difficulty of mapping the full spectrum of rules, mechanisms, institutions and actors involved in a distributed governance system or systems. Systemic oversight requires mapping the overall governance context in order to understand how best to implement the framework in practice. This amounts to an empirical inquiry into the conditions (structures, mechanisms and rules) in which governance actors currently operate. In a recent study, we showed that current governance mechanisms for research biobanks, for instance, are not aligned with the requirements of systemic oversight.Footnote 19 In particular, we showed that systemic oversight can contribute to improving the accountability of research infrastructures that, like biobanks, collect and distribute an increasing amount of scientific data.

The third and last challenge to effective operationalisation of adaptive mechanisms has to do with the limits of ready-made blueprint solutions to complex governance models. Political economist and Nobel Laureate Elinor Ostrom has written extensively on this. In her work on socio-ecological systems, Ostrom has convincingly shown that policy actors have the tendency to buy into what she calls ‘policy panaceas’,Footnote 20 that is, ready-made solutions to very complex problems. Such policy panaceas are hardly ever supported by solid evidence regarding the effectiveness of their outcomes. One of the most commonly cited reasons for their lack of effectiveness is that complexity entails high degrees of uncertainty as to the very phenomenon that policy makers are trying to govern.

We saw that uncertainty is characteristic of Big Data research too (see Section 26.2). That is why systemic oversight refrains from prescribing any particular governance solution. While not rejecting traditional predict-and-control approaches (such as informed consent, data anonymisation and encryption), systemic oversight does not put all the regulatory weight on any particular instrument or body. The systemic ambition of the framework lies in its pragmatic orientation towards a plurality of tools, mechanisms and structures that could jointly stabilise the responsible use of Big Data for research purposes. In this respect, our framework acknowledges that ‘[a]daptation typically emerges organically among multiple centers of agency and authority in society as a relatively self-organized or autonomous process marked by innovation, social learning and political deliberation’.Footnote 21

Still, a governance framework’s capacity to avoid known bottlenecks to operationalisation is a necessary but not a sufficient condition for its successful implementation. The further question is how the principles of the systemic oversight model can be incorporated into structures and processes in Big Data research governance. By structures we mean actors and networks of actors involved in governance, organised in bodies charged with oversight, organisational or policy-making responsibilities. Processes, by contrast, are the mechanisms, procedures, rules, laws and codes through which actors operate and bring about their governance objectives. Together, structures and processes define the polycentric, redundant and experimental system of governance that an adaptive governance model intends to promote.Footnote 22

26.5 Key Features of Governance Structures and Processes

Here we follow the work of Rijke and colleaguesFootnote 23 in identifying three key properties of adaptive governance structures: centrality, cohesion and density. While it is acknowledged that centralised structures can be effective as a response to crises and emergencies, centralisation is precisely a challenge in Big Data; our normative response is to call for inclusive social learning among the broad array of stakeholders, subject to challenges of incomplete representation of relevant interests (see further below). Still, this commitment can help to promote network cohesion by fostering discussion about how to implement the principles, while also promoting the formation of links between governance actors, as required by density. In addition, this can help to ensure that governance roles are fairly distributed among a sufficiently diverse array of stakeholders and that, as a consequence, decisions are not hijacked by technical experts.

The governance space in Big Data research is already populated by numerous actors, such as IRBs, data access committees and advisory boards. These bodies are not necessarily inclusive of a sufficiently broad array of stakeholders and therefore they may not be very effective at promoting social learning. Their composition could thus be rearranged in order to be more representative of the interests at stake and to promote continued learning. New actors could also enter the governance system. For instance, data could be made available for research by data subjects themselves through data platforms.Footnote 24

Networks of actors (structures) operating in the space of health research do so through mechanisms and procedures (processes) such as informed consent and ethics review, as well as data access review, policies on reporting research findings to participants, public engagement activities and privacy impact assessments.

Processes are crucial to effective governance of health research and are a critical component of the systemic oversight approach as their features can determine the actual impact of its principles. Drawing on scholarship in adaptive governance, we present three such features (components) that are central to the appropriate interpretation of the systemic oversight principles.

Social learning: social learning refers to learning that occurs by observing others.Footnote 25 In governance settings that are open to participation by different stakeholders, social learning can occur across different levels and hierarchies of the governance structures. According to many scholars, including Ostrom,Footnote 26 social learning represents an alternative to policy blueprints (see above) – especially when it is coupled with, and leads to, adaptive management. Planned adaptations – that is, previously scheduled rounds of policy revision in light of new knowledge – can be occasions for governance actors to capitalise on each other’s experience and learn about evolving expectations and risks. Such learning exercises can reduce uncertainty and lead to adjustments in mechanisms and rules. The premise of this approach is the realisation that in complex systems characterised by pronounced uncertainty, ‘no particular epistemic community can possess all the necessary knowledge to form policy’.Footnote 27 Social learning – be it aimed at gathering new evidence, at fostering capacity building or at assessing policy outcomes – is relevant to all six components of systemic oversight. The French law on bioethics, for instance, prescribes periodic rounds of nationwide public consultation – the so-called Estates General on bioethics.Footnote 28 This is an example of how social learning can be fostered. Similar social learning can be triggered even at smaller scales – for instance, in local oversight bodies – in order to explore new solutions and alternative designs.

Complementarity: complementarity is the capacity of governance processes both to be functionally compatible with one another and to ensure procedural correspondence between processes and the phenomena they intend to regulate. Functional complementarity refers to the distribution of regulatory functions across a given set of processes exhibiting partial overlap (see redundancy, above). This feature is crucial for both monitoring and reflexivity. Procedural complementarity, on the other hand, refers to the temporal alignment between governance processes and the activities that depend on such processes. One prominent example, in this respect, is the timing of ethics review processes, or of the processing of data access requests.Footnote 29 For instance, the European General Data Protection Regulation (GDPR) requires that personal data breaches be notified to the supervisory authority, where feasible, within 72 hours of the controller becoming aware of them. This provision is an example of procedural complementarity that would be of the utmost importance for the principle of responsiveness.

Visibility: governance processes need to be visible, that is, procedures and their scope need to be as publicly available as possible to whoever is affected by them or must act according to them. The notion of regulatory visibility has recently been highlighted by Laurie and colleagues, who argue for regulatory stewardship within ecosystems to help researchers clarify values and responsibilities in health research and navigate the complexities.Footnote 30 Recent work also demonstrates that currently it is difficult to access policies and standard operating procedures of prominent research institutions like biobanks. In principle, fair scientific competition may militate against disclosure of technical details about data processing, but it is hard to imagine practical circumstances in which administrators of at least publicly funded datasets would not have incentives to share as much information as possible regarding the way they handle their data. Process visibility goes beyond fulfilling a pre-determined set of criteria (for instance, for auditing purposes). By disclosing governance processes and opportunities for engagement, actors actually offer reasons to be trusted by a variety of stakeholders.Footnote 31 This feature is of particular relevance for the principles of monitoring and reflexivity, as well as for improving the effectiveness of inclusive governance processes.

26.6 Conclusion

In this chapter, we have defended adaptive governance as a suitable regulatory approach for Big Data health research by proposing six governance principles to foster the development of appropriate structures and processes to handle critical aspects of Big Data health research. We have analysed key aspects of implementation and identified a number of important features that can make adaptive regulation operational. However, one might legitimately ask: in the absence of a central regulatory actor endowed with clearly recognised statutory prerogatives, how can it be assumed that the AFIRRM principles will be endorsed by the diverse group of stakeholders operating in the Big Data health research space? Clearly, this question does not have a straightforward answer. However, to increase the likelihood of uptake, we have advanced AFIRRM as a viable and adaptable model for the creation of necessary tools that can deliver on common objectives. Our model is based on a careful analysis of regulatory scholarship vis-à-vis the key attributes of this type of research. We are currently undertaking considerable efforts to introduce AFIRRM to regulators, operators and organisations in the space of research or health policy. We are cognisant of the fact that the implementation of a model like AFIRRM need not be temporally linear. Different actors may take initiative at different points in time. It cannot be expected that a coherent system of governance will emerge in a synchronically orchestrated manner through the uncoordinated action of multiple stakeholders. Such a path could only be imagined if a central regulator had the power and the will to make it happen. Nothing indicates, however, that regulation will assume a centralised character anytime soon. Nevertheless, polycentricity is not in itself a barrier to the emergence of a coherent governance ecosystem. Indeed, the AFIRRM principles – in line with their adaptive orientation – rely precisely on polycentric governance to cope with the uncertainty and complexity of Big Data health research.

Footnotes

1 Fitbit Inc., ‘National Institutes of Health Launches Fitbit Project as First Digital Health Technology Initiative in Landmark All of Us Research Program’ (Press Release) (Fitbit, 2019).

2 D. C. Collins et al., ‘Towards Precision Medicine in the Clinic: From Biomarker Discovery to Novel Therapeutics’, (2017) Trends in Pharmacological Sciences, 38(1), 25–40.

3 A. Giddens, The Third Way: The Renewal of Social Democracy (New York: John Wiley & Sons, 2013), p. 69.

4 E. Vayena and A. Blasimme, ‘Health Research with Big Data: Time for Systemic Oversight’, (2018) The Journal of Law, Medicine & Ethics, 46(1), 119–129.

5 C. Folke et al., ‘Adaptive Governance of Social-Ecological Systems’, (2005) Annual Review of Environment and Resources, 30, 441–473.

6 T. Dietz et al., ‘The Struggle to Govern the Commons’, (2003) Science, 302(5652), 1907–1912.

7 C. Ansell and A. Gash, ‘Collaborative Governance in Theory and Practice’, (2008) Journal of Public Administration Research and Theory, 18(4), 543–571.

8 J. J. Warmink et al., ‘Coping with Uncertainty in River Management: Challenges and Ways Forward’, (2017) Water Resources Management, 31(14), 4587–4600.

9 R. J. McWaters et al., ‘The Future of Financial Services: How Disruptive Innovations Are Reshaping the Way Financial Services Are Structured, Provisioned and Consumed’, (World Economic Forum, 2015).

10 R. A. W. Rhodes, ‘The New Governance: Governing without Government’, (1996) Political Studies, 44(4), 652–667.

11 J. Black, ‘The Rise, Fall and Fate of Principles Based Regulation’, (2010) LSE Legal Studies Working Paper, 17.

13 Vayena and Blasimme, ‘Health Research’.

14 B. Walker et al., ‘Resilience, Adaptability and Transformability in Social–Ecological Systems’, (2004) Ecology and Society, 9(2), 4.

15 E. Vayena and A. Blasimme, ‘Biomedical Big Data: New Models of Control over Access, Use and Governance’, (2017) Journal of Bioethical Inquiry, 14(4), 501–513.

16 See, for example, S. Arjoon, ‘Striking a Balance between Rules and Principles-Based Approaches for Effective Governance: A Risks-Based Approach’, (2006) Journal of Business Ethics, 68(1), 53–82; A. Kezar, ‘What Is More Important to Effective Governance: Relationships, Trust, and Leadership, or Structures and Formal Processes?’, (2004) New Directions for Higher Education, 127, 35–46.

17 J. Rijke et al., ‘Fit-for-Purpose Governance: A Framework to Make Adaptive Governance Operational’, (2012) Environmental Science & Policy, 22, 73–84.

18 R. A. W. Rhodes, Understanding Governance: Policy Networks, Governance, Reflexivity, and Accountability (Buckingham: Open University Press, 1997); R. A. W. Rhodes, ‘Understanding Governance: Ten Years On’, (2007) Organization Studies, 28(8), 1243–1264.

19 F. Gille et al., ‘Future-Proofing Biobanks’ Governance’, (2020) European Journal of Human Genetics, 28, 989–996.

20 E. Ostrom, ‘A Diagnostic Approach for Going beyond Panaceas’, (2007) Proceedings of the National Academy of Sciences, 104(39), 15181–15187.

21 D. A. DeCaro et al., ‘Legal and Institutional Foundations of Adaptive Environmental Governance’, (2017) Ecology and Society, 22(1), 1.

22 B. Chaffin et al., ‘A Decade of Adaptive Governance Scholarship: Synthesis and Future Directions’, (2014) Ecology and Society, 19(3), 56.

23 Rijke et al., ‘Fit-for-Purpose Governance’.

24 A. Blasimme et al., ‘Democratizing Health Research Through Data Cooperatives’, (2018) Philosophy & Technology, 31(3), 473–479.

25 A. Bandura and R. H. Walters, Social Learning Theory, vol. 1 (Englewood Cliffs, NJ: Prentice-Hall, 1977).

26 Ostrom, ‘A Diagnostic Approach’.

27 D. Swanson et al., ‘Seven Tools for Creating Adaptive Policies’, (2010) Technological Forecasting and Social Change, 77(6), 924–939, 925.

28 D. Berthiau, ‘Law, Bioethics and Practice in France: Forging a New Legislative Pact’, (2013) Medicine, Health Care and Philosophy, 16(1), 105–113.

29 G. Silberman and K. L. Kahn, ‘Burdens on Research Imposed by Institutional Review Boards: The State of the Evidence and Its Implications for Regulatory Reform’, (2011) The Milbank Quarterly, 89(4), 599–627.

30 G. T. Laurie et al., ‘Charting Regulatory Stewardship in Health Research: Making the Invisible Visible’, (2018) Cambridge Quarterly of Healthcare Ethics, 27(2), 333–347.

31 O. O’Neill, ‘Trust with Accountability?’, (2003) Journal of Health Services Research & Policy, 8(1), 3–4.
