The Challenge of Regulating Clinical Decision Support Software After 21st Century Cures
Published online by Cambridge University Press: 06 January 2021
American Journal of Law & Medicine, Volume 44, Issue 2-3: Symposium - The 21st Century Cures Act: A Cure for the 21st Century?, May 2018, pp. 237-251
Copyright © American Society of Law, Medicine and Ethics and Boston University 2018
References
1 See Software as a Medical Device (SaMD): Key Definitions, Int'l Med. Device Regulators Forum (Dec. 9, 2013), http://www.imdrf.org/docs/imdrf/final/technical/imdrf-tech-131209-samd-key-definitions-140901.pdf (distinguishing software in a device from software as a device).
2 Id.; see also What are examples of Software as a Medical Device?, U.S. Food & Drug Admin., https://www.fda.gov/medicaldevices/digitalhealth/softwareasamedicaldevice/ucm587924.htm (last updated Dec. 6, 2017) [https://perma.cc/XG9S-8A53].
3 Id.
4 See 21st Century Cures Act, Pub. L. No. 114-255, § 3060(a), 130 Stat. 1033 (2016).
5 See 21 U.S.C. § 321(h) (codifying § 201(h) of the Food, Drug, and Cosmetic Act, which defines the medical devices that Congress has authorized FDA to regulate).
6 See 21 U.S.C. § 360j(o)(1) (excluding five categories of software from the definition of a medical device).
7 Id. at § 360j(o)(1)(A).
8 Id. at § 360j(o)(1)(B).
9 Id. at § 360j(o)(1)(C)(iii).
10 Id. at § 360j(o)(1)(D).
11 Id. at § 360j(o)(1)(E).
12 See Jeffrey K. Shapiro & Jennifer D. Newberger, Highlights of Medical Device Related Provisions in the 21st Century Cures Act, FDA L. Blog (Dec. 8, 2016), http://www.fdalawblog.net/2016/12/highlights-of-medical-device-related-provision-in-the-21st-century-cures-act/ [https://perma.cc/XMN3-VPX6] (noting that administrative support software arguably never was a medical device in the first place).
13 See R. Polk Wagner, The Medium is the Mistake: The Law of Software for the First Amendment, 51 Stan. L. Rev. 387 (1999).
14 See Shapiro & Newberger, supra note 12 (noting that the Cures Act's exclusion for software for maintaining and encouraging a healthy lifestyle is “non-controversial and is consistent with FDA's General Wellness and Mobile Medical Applications guidance documents” and noting that the exclusion for electronic health record systems “codifies policy already implemented by FDA as a matter of enforcement discretion” and that its exclusion for software for transferring, storing, formatting, or displaying data merely codifies a prior FDA exemption for medical device data systems).
15 21 U.S.C. § 360j(o)(1)(E).
16 Clinical and Patient Decision Support Software: Draft Guidance for Industry and Food and Drug Administration Staff, Food & Drug Admin. (Dec. 8, 2017), https://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM587819.pdf [hereinafter Draft Guidance]; see also Clinical Decision Support (CDS), HealthIT.gov (last updated Jan. 15, 2013), https://www.healthit.gov/policy-researchers-implementers/clinical-decision-support-cds [https://perma.cc/JWV8-YUGQ] (noting that CDS “provides clinicians, staff, patients or other individuals with knowledge and person-specific information, intelligently filtered or presented at appropriate times, to enhance health and health care. CDS encompasses a variety of tools to enhance decision-making in the clinical workflow. These tools include computerized alerts and reminders to care providers and patients; clinical guidelines; condition-specific order sets; focused patient data reports and summaries; documentation templates; diagnostic support, and contextually relevant reference information, among other tools.”).
17 See 21 U.S.C. § 360j(o)(1)(E).
18 Id.
19 Bradley Merrill Thompson, Learning from Experience: FDA's Treatment of Machine Learning, MobiHealthNews (Aug. 23, 2017), http://www.mobihealthnews.com/content/learning-experience-fda%E2%80%99s-treatment-machine-learning [https://perma.cc/Q95C-9R22].
20 Draft Guidance, supra note 16, at 8.
21 Id. at 7.
22 See 21 U.S.C. § 360j(o)(1)(E)(i) (describing software that uses patient specific information “or” general medical information). But see Draft Guidance, supra note 16, at 7 (sensibly reading this “or” as meaning “and/or”).
23 Id. at § 360j(o)(1)(E)(ii).
24 Id. at § 360j(o)(1)(E)(iii).
25 Epistemology is “the study of knowledge and justified belief. As the study of knowledge, epistemology is concerned with the following questions: What are the necessary and sufficient conditions of knowledge? What are its sources? What is its structure, and what are its limits?” See Stanford Encyclopedia of Philosophy (Dec. 14, 2005), https://plato.stanford.edu/entries/epistemology/ [https://perma.cc/M2KE-22NJ].
26 See, e.g., Walter Smalley et al., Contraindicated Use of Cisapride: Impact of Food and Drug Administration Regulatory Action, 284 JAMA 3036, 3038 (2000) (finding that labeling revisions and efforts to communicate contraindications of the drug cisapride (Propulsid) had little impact on prescribing behavior); Raymond L. Woosley & Glenn Rice, A New System for Moving Drugs to the Market, 21 Issues Sci. & Tech. Online 63, 64 (2005) (listing six drugs that had to be withdrawn from the market because of physician non-compliance with warnings and contraindications in labeling).
27 Michelle M. Mello, Of Swords and Shields: The Role of Clinical Practice Guidelines in Medical Malpractice Litigation, 149 U. Pa. L. Rev. 645, 681 (2001).
28 See What is Machine Learning? A definition, Expert Sys., http://www.expertsystem.com/machine-learning-definition/ [https://perma.cc/S88G-2REA] (“Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it [to] learn for themselves. The process of learning begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in data and make better decisions in the future based on the examples that we provide. The primary aim is to allow the computers [to] learn automatically without human intervention or assistance and adjust actions accordingly.”). See also Maxwell W. Libbrecht & William Stafford Noble, Machine Learning in Genetics and Genomics, 16 Nature Rev. Genetics 321, 321 (2015) (describing machine learning algorithms as those that “improve with experience” at the task of recognizing patterns in data, to “assist humans in making sense of large, complex data sets”).
29 Tom M. Mitchell, Machine Learning 2 (1997) (noting that “a computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E”).
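Mitchell's criterion quoted above (a program learns if its performance P at tasks T, measured by P, improves with experience E) can be made concrete with a minimal sketch. The task, data, and nearest-centroid model below are hypothetical illustrations, not drawn from the cited work: accuracy on a fixed test set (P) rises as the number of training examples (E) grows.

```python
import random

random.seed(0)  # deterministic toy data

def make_example():
    """Task T (hypothetical): label a point 1 if x + y > 1, else 0."""
    x, y = random.random(), random.random()
    return (x, y), int(x + y > 1)

def train_centroids(examples):
    """Experience E: average the training points of each class into a centroid."""
    sums = {0: [0.0, 0.0, 0], 1: [0.0, 0.0, 0]}
    for (x, y), label in examples:
        sums[label][0] += x
        sums[label][1] += y
        sums[label][2] += 1
    return {c: (sx / n, sy / n) for c, (sx, sy, n) in sums.items() if n}

def accuracy(centroids, test_set):
    """Performance measure P: fraction of test points classified correctly."""
    def predict(p):
        return min(centroids, key=lambda c: (p[0] - centroids[c][0]) ** 2
                                            + (p[1] - centroids[c][1]) ** 2)
    return sum(predict(p) == label for p, label in test_set) / len(test_set)

test = [make_example() for _ in range(500)]
for n in (5, 50, 500):
    model = train_centroids([make_example() for _ in range(n)])
    print(n, round(accuracy(model, test), 2))  # accuracy tends to rise with n
```

The same structure (fixed task, fixed metric, growing experience) underlies the adaptive CDS systems discussed in the article: the regulatory difficulty is that the model after more experience is no longer the model that was originally reviewed.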
30 See I. Glenn Cohen et al., The Legal and Ethical Concerns That Arise from Using Complex Predictive Analytics in Health Care, 33 Health Aff. 1139, 1140 (2014).
31 See id. at 1139-47 (discussing examples of predictive analytics software used in health care settings).
32 See Inst. of Med., The Future of Drug Safety 39, 122 (Alina Baciu et al. eds., 2007) (noting the prevalence and lack of evidence for off-label uses and noting the limitations of clinical trial data to inform decisions about the real-world clinical impacts even for a drug's on-label, indicated uses).
33 Inst. of Med., Conflict of Interest in Medical Research, Education, and Practice 189-215 (Bernard Lo & Marilyn J. Field eds., 2009).
34 See C.A. Longhurst et al., A ‘Green Button’ for Using Aggregate Patient Data at the Point of Care, 33 Health Aff. 1229, 1229 (2014).
35 Cohen et al., supra note 30, at 1139.
36 See generally Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Wash. L. Rev. 1, 14-15 (2014).
37 Cohen et al., supra note 30, at 1146-47.
38 See Jason B. Liu et al., Defining the Intrinsic Cardiac Risks of Operations to Improve Preoperative Cardiac Risk Assessments, 128 Anesthesiology 283, 285 (2018) (finding that an established index for predicting the risk of cardiac arrest during surgery was less accurate than an algorithm that incorporates data from over 3 million actual operations).
39 See J.F. Peterson et al., Physician Response to Implementation of Genotype-Tailored Antiplatelet Therapy, 100 Clinical Pharmacology & Therapeutics 67, 72 (2016) (finding various reasons for instances of physician non-compliance with recommendations by a CDS system incorporating pharmacogenetic data).
40 21 U.S.C. § 360j(o)(1)(E)(iii).
41 See id.
42 21 C.F.R. § 201.128 (2018) (for drugs); 21 C.F.R. § 801.4 (2017) (for devices).
43 See Clarification of When Products Made or Derived from Tobacco Are Regulated as Drugs, Devices, or Combination Products, 83 Fed. Reg. 2092, 2093 (proposed Jan. 16, 2018) (to be codified at 21 C.F.R. pts. 201, 801, 1100).
44 21 C.F.R. § 801.4 (2018) (defining “intended use” as the “objective intent of the persons legally responsible for the labeling of devices”).
45 Id.
46 See, e.g., National Nutritional Foods Association v. Mathews, 557 F.2d 325 (2d Cir. 1977). But see United States v. Travia, 180 F. Supp. 2d 115 (D.D.C. 2001) (finding intent based on actual use in the absence of any claims indicating the manufacturer intended such use, but doing so in a factual setting so unusual—unlawful sale of balloons full of laughing gas at a rock concert—that the precedential value of this holding can be questioned).
47 See Regulation 2016/679, of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), (L 119/1).
48 Council Regulation 2016/679, art. 22, 2018 (L 119/1).
49 Id. at ¶ 2 (allowing decisions to be based solely on automated processing in narrow circumstances, e.g., if authorized by law or if the data subject consents).
50 Id. at ¶¶ 2(b) and (3).
51 Id. at ¶ 3.
52 Council Regulation 2016/679, art. 9 (defining the special categories of data).
53 Council Regulation 2016/679, art. 22, 2018 (L 119/1) ¶ 4.
54 Bryce Goodman & Seth Flaxman, European Union Regulations on Algorithmic Decision-Making and a “Right To Explanation”, arXiv (June 28, 2016), https://arxiv.org/abs/1606.08813 [https://perma.cc/UT2E-8D76].
55 See Andrew Burt, Is There a Right to Explanation for Machine Learning in the GDPR?, Int'l Ass'n of Privacy Prof'ls (June 1, 2017), https://iapp.org/news/a/is-there-a-right-to-explanation-for-machine-learning-in-the-gdpr/ [https://perma.cc/3879-JEJA] (noting that the existence of a right to explanation “has become a controversial subject. Some scholars, for example, have spoken out vehemently against the mere possibility that such a right exists. Others, such as the UK's own Information Commissioner's Office, seem to think the right is pretty clearly self-evident”). See also Sandra Wachter et al., Why a Right to Explanation of Automated Decision-Making Does Not Exist in The General Data Protection Regulation, Int'l Data Privacy L. (2017), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2903469 [https://perma.cc/VP4K-5R9S].
56 See Goodman & Flaxman, supra note 54, at 6-7 (stating, in connection with neural networks, “what hope is there of explaining the weights learned in a multilayer neural net with a complex architecture?”); Burt, supra note 55 (stating that “if you're a privacy professional, you're going to find it difficult to implement these requirements in practice”).
57 See Goodman & Flaxman, supra note 54, at 6 (noting that, “[s]tandard supervised machine learning algorithms for regression or classification are inherently based on discovering reliable associations / correlations to aid an accurate out-of-sample prediction, with no concern for causal reasoning or “explanation” beyond the statistical sense in which it is possible to measure the amount of variance explained by a predictor.”).
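Goodman and Flaxman's point about “explanation” in the merely statistical sense can be illustrated with a short sketch. The data below are hypothetical and the example uses only the standard library: an ordinary least-squares fit yields a coefficient of determination R², the share of variance in the outcome accounted for by the predictor, which says nothing about why the association holds.

```python
# Hypothetical data: y is roughly 2x plus noise.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least-squares slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# R^2 = 1 - (residual sum of squares / total sum of squares):
# the "variance explained" by the predictor, in the statistical sense only.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - mean_y) ** 2 for y in ys)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 3))
```

Here R² is close to 1, so x “explains” nearly all the variance in y, yet the fit supplies no causal account—the gap between statistical and substantive explanation that the right-to-explanation debate turns on.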
58 See, e.g., Rich Caruana et al., Intelligible Models for Healthcare: Predicting Pneumonia Risk and Hospital 30-day Readmission, Ass'n for Computing Machinery KDD (2015), http://dx.doi.org/10.1145/2783258.2788613 [https://perma.cc/9LT9-TXFB]; A. Fischer, How to Determine the Unique Contributions of Input-Variables to the Nonlinear Regression Function of a Multilayer Perceptron, 309-310 Ecological Modelling 60, 60 (2015).
59 Tim Albrecht et al., Deep Learning for Single-Molecule Science, 28 Nanotechnology 1, 9 (2017).
60 Cliff Kuang, Can AI be Taught to Explain Itself?, N.Y. Times (Nov. 21, 2017), https://www.nytimes.com/2017/11/21/magazine/can-ai-be-taught-to-explain-itself.html; see also, Tim Collins, Outrage over AI that “Identifies Gay Faces” as Google Experts Say the Machine Relies on Patterns in How People Take Selfies and NOT on Their Facial Features, Daily Mail (Jan. 15, 2018), http://www.dailymail.co.uk/sciencetech/article-5270365/Google-experts-debunk-sexuality-detecting-AI.html [https://perma.cc/UH28-KE6F].
61 Kuang, supra note 60.
62 Id.
63 Goodman & Flaxman, supra note 54 (quoting J. Burrell, How the Machine “Thinks”: Understanding Opacity in Machine-Learning Algorithms, 3 Big Data & Society (2016)).
64 Draft Guidance, supra note 16.
65 See, e.g., Evan Sweeney, After a 6-Year Wait, FDA's Clinical Decision Support Guidelines Get a Mixed Reaction, Fierce Healthcare (Dec. 7, 2017), https://www.fiercehealthcare.com/regulatory/fda-clinical-decision-support-bradley-merrill-thompson-bethany-hills-ai-21st-century [https://perma.cc/BT9TTK7J].
66 David Gunning, Explainable Artificial Intelligence (XAI), DARPA, https://www.darpa.mil/program/explainable-artificial-intelligence [https://perma.cc/97MG-CSH9].
67 Id. (noting that “the effectiveness of [AI] systems is limited by the machine's current inability to explain their decisions and actions to human users” and describing XAI as new or modified machine-learning techniques that “will produce more explainable models” with “the ability to explain their rationale, characterize their strengths and weaknesses, and convey an understanding of how they will behave in the future”).
68 Draft Guidance, supra note 16, at 8.
69 Id.
70 Id. at 8-9.
71 See generally Susan Nevelow Mart, The Right to Receive Information, 95 L. Libr. J. 175 (2003) (discussing the permissible scope of governmental regulation of libraries under the First Amendment).
72 See 21 U.S.C. § 396 (2012) (providing that FDA is not authorized to “limit or interfere with the authority of a health care practitioner to prescribe or administer any legally marketed device to a patient for any condition or disease within a legitimate health care practitioner-patient relationship”).
73 Draft Guidance, supra note 16, at 8.
74 See, e.g., Longhurst et al., supra note 34, at 1233.
75 See Barbara J. Evans & Gail P. Jarvik, Impact of HIPAA's Minimum Necessary Standard on Genomic Data Sharing, Genetics in Med. (2017) (discussing HIPAA's broad exceptions to individual authorization and to the minimum necessary standard for data provided to health care providers for use in clinical care).
76 Office of the Nat'l Coordinator of Health Info. Tech., EHR Contracts Untangled 12 (2016); Office of the Nat'l Coordinator of Health Info. Tech., Report to Congress on Health Information Blocking 31-32 (Apr. 2015); Darius Tahir, Doctors Barred from Discussing Safety Glitches in US-funded Software, Politico (Sept. 11, 2015), http://www.politico.com/story/2015/09/doctors-barred-from-discussing-safety-glitches-in-us-funded-software-213553 [https://perma.cc/F34A-Y7H6].
77 See Kuang, supra note 60.
78 Thompson, supra note 19.
79 See Ground Truth, Techopedia, https://www.techopedia.com/definition/32514/ground-truth [https://perma.cc/8VZR-VBU6] (explaining that the term is borrowed from meteorology where it refers to the process of verifying remote sensing data by conducting on-site field tests).
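The “ground truth” usage described above—verified reference data against which a model's outputs are scored—can be shown in a few lines. The labels below are hypothetical, echoing the meteorology origin of the term: predictions are compared against independently verified on-site observations.

```python
# Hypothetical example: verified on-site observations (ground truth)
# versus a model's remote predictions for the same five sites.
ground_truth = ["rain", "rain", "dry", "dry", "rain"]  # field-verified labels
predictions  = ["rain", "dry",  "dry", "dry", "rain"]  # model output

correct = sum(p == t for p, t in zip(predictions, ground_truth))
accuracy = correct / len(ground_truth)
print(f"accuracy vs. ground truth: {accuracy:.0%}")  # prints "80%"
```

As Thompson's piece notes, the regulatory question for machine-learning CDS is where such verified reference labels come from in medicine, where the “ground truth” diagnosis may itself be contested.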
80 Thompson, supra note 19.
81 Digital Health Innovation Action Plan, Food & Drug Admin. (2017), https://www.fda.gov/downloads/MedicalDevices/DigitalHealth/UCM568735.pdf; see Scott Gottlieb, FDA Announces New Steps to Empower Consumers and Advance Digital Healthcare, Food & Drug Admin. (July 27, 2017), https://blogs.fda.gov/fdavoice/index.php/2017/07/fda-announces-new-steps-to-empower-consumers-and-advance-digital-healthcare/ [https://perma.cc/H239-HTP6].
82 Digital Health Software Precertification (Pre-Cert) Program, Food & Drug Admin., https://www.fda.gov/MedicalDevices/DigitalHealth/UCM567265 (last updated Feb. 15, 2018) [https://perma.cc/8CSJ-65AM]; see also FDA Selects Participants for New Digital Health Software Precertification Pilot Program, Food & Drug Admin. (Sept. 26, 2017), https://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm577480.htm [https://perma.cc/H2U7-LEK2].
83 Digital Health Innovation Action Plan, supra note 81, at 2.
84 Id. at 5.
85 See generally id.
86 Id.
87 Id.
88 Id.