
From paper to screen: regulatory and operational considerations for modernizing the informed consent process

Published online by Cambridge University Press:  28 March 2022

Nichelle L. Cobb
Affiliation:
Association for the Accreditation of Human Research Protection Programs, Inc. (AAHRPP), Washington, DC, USA
Dorothy F. Edwards
Affiliation:
University of Wisconsin-Madison School of Medicine and Public Health, Wisconsin Alzheimer’s Disease Research Center, Madison, WI, USA
Erin M. Chin
Affiliation:
University of Wisconsin-Madison School of Medicine and Public Health, Wisconsin Alzheimer’s Disease Research Center, Madison, WI, USA
James J. Lah
Affiliation:
Emory University, Goizueta Alzheimer’s Disease Research Center, Atlanta, GA, USA
Felicia C. Goldstein
Affiliation:
Emory University, Goizueta Alzheimer’s Disease Research Center, Atlanta, GA, USA
Cecilia M. Manzanares
Affiliation:
Emory University, Goizueta Alzheimer’s Disease Research Center, Atlanta, GA, USA
Christine M. Suver*
Affiliation:
Sage Bionetworks, Seattle, WA, USA
*
Author for correspondence: C.M. Suver, PhD, Sage Bionetworks, 2901 Third Ave, Suite 330, Seattle, WA 98121, USA. Email: [email protected]

Abstract

Electronic platforms provide an opportunity to improve the informed consent (IC) process by permitting elements shown to increase research participant understanding and satisfaction, such as graphics, self-pacing, meaningful engagement, and access to additional information on demand. However, including these elements can pose operational and regulatory challenges for study teams and institutional review boards (IRBs) responsible for the ethical conduct and oversight of research. We examined the experience of two study teams at Alzheimer’s Disease Research Centers who chose to move from a paper-based IC process to an electronic informed consent (eIC) process to highlight some of these complexities and explore how IRBs and study teams can navigate them. Here, we identify the key regulations that should be considered when developing and using an eIC process as well as some of the operational considerations eIC presents related to IRB review and how they can be addressed.

Type
Special Communications
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

Informed consent (IC) is a foundational component of ethical research. As articulated within the Belmont Report [1] and key US regulations governing human subjects research (i.e., the Common Rule [45 CFR 46.116]; Food and Drug Administration (FDA) regulations [21 CFR 50.20]), individuals must be given the opportunity, to the degree they are capable, to choose whether to participate in research. Legally effective IC is the process by which individuals (or their legally authorized representatives [LARs]) receive information about a research study in a language understandable to them, have sufficient opportunity to consider and discuss participation, and can decide whether to participate free from undue influence or coercion [2,3].

The need to make the IC process more meaningful and effective is well documented [Reference Ryan, Prictor, McLaughlin and Hill4,Reference Flory and Emanuel5]. The process traditionally involves reading a lengthy and complex form that few people find helpful or comprehensible [Reference Falagas, Korbila, Giannopoulou, Kondilis and Peppas6,Reference Lorell, Mikita, Anderson, Hallinan and Forrest7]. Challenges with IC prompted the US Department of Health and Human Services (HHS) to highlight the need for IC improvements in an Advance Notice of Proposed Rulemaking, which noted a) the excessive length and technical language of IC documents and b) the perception that IC forms tend to be legal documents meant to protect institutions rather than decision-making resources for potential research participants [8]. This drive to improve IC resulted in new expectations within the revised 2018 Common Rule, which explicitly requires the IC process to be organized and presented in a way that facilitates participant comprehension.

Web-based technologies can enhance IC and provide an experience more similar to how most people access information in their everyday lives [Reference Grady, Cummings, Rowbotham, McConnell, Ashley and Kang9]. A growing number of electronic informed consent (eIC) platforms are available [10–12] that include some or all of the following:

  • animation, videos, other visual imagery;

  • avatars;

  • questions to provide feedback on participant comprehension;

  • popups and hyperlinks leading to supplemental information; and

  • customized experience through the use of branching logic (illustrated in the sketch after this list).
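To illustrate the last feature, the following is a minimal TypeScript sketch of how branching logic might show a module only to the readers it applies to (for example, content specific to a legally authorized representative). It is not drawn from any of the platforms cited above; the types, module names, and role labels are hypothetical.

```typescript
// Minimal illustration of branching logic in an eIC flow.
// All types and module names are hypothetical, not taken from any cited platform.

type ReaderRole = "participant" | "legally_authorized_representative";

interface ConsentModule {
  id: string;
  title: string;
  // Returns true if this module applies to the current reader.
  appliesTo: (role: ReaderRole) => boolean;
}

const modules: ConsentModule[] = [
  { id: "purpose", title: "Why we are doing this study", appliesTo: () => true },
  { id: "risks", title: "Risks and discomforts", appliesTo: () => true },
  {
    id: "lar-duties",
    title: "Your role as a representative",
    appliesTo: (role) => role === "legally_authorized_representative",
  },
];

// Branching: only the modules relevant to this reader are presented.
function modulesFor(role: ReaderRole): ConsentModule[] {
  return modules.filter((m) => m.appliesTo(role));
}

console.log(modulesFor("participant").map((m) => m.title));
```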

Incorporating features such as these in the eIC process provides attractive ways to address comprehension challenges, reach a wider range of participants, and engage those who might otherwise be excluded from research participation. For example, Sage Bionetworks pioneered a self-guided eIC tool that can be administered via a smartphone [Reference Doerr, Suver and Wilbanks13]. This is a convenient and effective way to engage more people and interest them in participating in research, because >85% of the US population owns a smartphone, with few differences across race, age, or socioeconomic status [14]. Further, the COVID-19 pandemic has underscored the need for research teams to conduct the IC process remotely to avoid in-person contact and potential exposure [Reference Rothwell, Brassil and Barton-Baxter15]. We expect that the trend toward eIC will continue post-pandemic.

This paper explores how IRBs and study teams can navigate applicable regulatory requirements when developing eIC processes and draws on the experience of Alzheimer’s Disease Research Center (ADRC) study teams who transitioned from a paper-based to an eIC approach that included a web-based tool. ADRC teams at Emory University and the University of Wisconsin–Madison collaborated with Sage Bionetworks (together “we” or the “team”) to reimagine the IC process for nontherapeutic studies that recruit a) individuals with potential memory disorders and decreased decision-making capacity and b) healthy controls. We produced a participant-centric, interactive eIC experience that incorporates IC best practices such as simplified language, graphics, comprehension questions with corrective feedback [Reference Taub, Kline and Baker16–Reference Festinger, Dugosh, Marlowe and Clements18], and tiered information (i.e., allowing participants to access supplemental information about specific concepts). This eIC approach addressed some of the barriers to an effective IC process that ADRC study coordinators had previously identified, including making it easier to customize the order in which information is presented to research participants [Reference Suver, Hamann and Chin19]. The eIC experience was piloted with study coordinators and participants, who provided input on which features of eIC were most relevant and meaningful to them (manuscript in preparation). Although our eIC development considered the needs of individuals with potential memory deficits and some cognitive impairment, we think the approaches we adopted are broadly applicable.

eIC Regulations Considerations

Presenting information to potential participants or their LARs (henceforth collectively “participants”) in an electronic format meets the definition of eIC in the 2016 Office for Human Research Protections (OHRP) and FDA joint guidance (henceforth “OHRP/FDA guidance”) [20], which outlines an expansive view of eIC, stating that it encompasses: “the use of electronic systems and processes that may employ multiple electronic media, including text, graphics, audio, video, podcasts, passive and interactive web sites, biological recognition devices, and card readers, to convey information related to the study and to obtain and document informed consent.”

This definition emphasizes that eIC is more than obtaining an electronic signature: the learning experience is an integral part of the IC process. As part of our development process, it was crucial to identify which regulations applied to the ADRC studies, both to ensure compliance and because the major US federal regulations that apply to human research differ in important ways. For example, the Common Rule and FDA regulations governing human subjects research differ in their requirements for electronic signatures. The Common Rule simply states that consent forms may be signed electronically (45 CFR 46.117). In comparison, FDA regulations (21 CFR 11, known as “Part 11”) include detailed requirements that must be met to ensure electronic signatures are “trustworthy, reliable, and generally equivalent to a handwritten signature executed on paper” [21]. Part 11 requires any system capturing an electronic signature to be secure with restricted access, include a method for validation, and produce an audit trail. The ADRC studies for which the eIC tool was developed were not subject to FDA regulations and thus were not required to comply with Part 11. Nonetheless, we highlight areas where FDA requirements differ from the Common Rule as well as the regulations we determined applied to the ADRC studies (Table 1).

Table 1. Key regulations that should be considered when developing an electronic informed consent (eIC) process

ADRC = Alzheimer’s Disease Research Center.

Study teams should also consider whether the HIPAA Privacy Rule applies and, if so, whether a separate authorization to use protected health information will be required. Additional examples of regulations that may apply to some research and affect the eIC process include the Children’s Online Privacy Protection Act (COPPA) [22] and the Family Educational Rights and Privacy Act (FERPA) [Reference Gilbert23], which may influence from whom permission must be obtained and documented.

eIC Tool Features

This project gave the team the opportunity not only to reformat the existing paper IC into an eIC but also to add key features that create a participant-centered IC experience by:

  • Simplifying the language and presenting information in modules.

  • Representing themes with iconography.

  • Allowing participants to choose the order in which they view the modules.

  • Providing optional (tiered) information about specific concepts.

  • Concluding each module with questions to reinforce learning and allow self-assessment of comprehension of the material.

The eIC we developed employed best practices for the presentation of information in an online format as implemented in Apple ResearchKit, the National Institutes of Health All of Us Research Program [Reference Doerr, Moore and Barone24], and the HHS and the US General Services Administration’s Research-Based Web Design & Usability Guidelines [25]. Figure 1 illustrates the navigation through our eIC.

Fig. 1. Electronic informed consent flow and features. IRB, institutional review board.

In accordance with the revised Common Rule, our eIC tool starts with a concise summary of the study, followed by study details grouped into a table of contents presented in a 3 × 3 grid. Each “tile” represents a module and includes a title and an icon that conveys the section’s content. The modules represent essential sections of an IC, such as risks and benefits, privacy, and costs and compensation. The reader must click on all modules, moving through the content and answering questions to assess understanding, before being allowed to access the signature section. In each module, optional information about key concepts is available via hyperlinks that open pop-up windows. After the reader has reviewed each module’s information and answered the comprehension questions, a summary of the main concepts, reinforced by the iconography used throughout the eIC, is shown so readers can verify their understanding and consolidate their decision about whether to join the study.
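To make the gating behavior concrete, the following is a minimal TypeScript sketch, not taken from our actual implementation, of how a tool might track module completion and unlock the signature step only after every module has been viewed and its questions answered; the module names and data structure are illustrative assumptions.

```typescript
// Sketch of the navigation gate described above: every module must be
// visited and its comprehension questions answered before the signature
// step unlocks. Structure and names are illustrative only.

interface ModuleProgress {
  viewed: boolean;
  questionsAnswered: boolean;
}

type ProgressMap = Record<string, ModuleProgress>;

const progress: ProgressMap = {
  "risks-benefits": { viewed: true, questionsAnswered: true },
  "privacy": { viewed: true, questionsAnswered: false },
  "costs-compensation": { viewed: false, questionsAnswered: false },
};

function signatureUnlocked(p: ProgressMap): boolean {
  return Object.values(p).every((m) => m.viewed && m.questionsAnswered);
}

console.log(signatureUnlocked(progress)); // false until every tile is completed
```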

Because ADRC researchers enroll individuals with memory deficits, an important feature of the eIC tool is the addition of multiple-choice questions after each module rather than at the end of the IC interaction; this deliberate design feature assesses comprehension of each module’s concepts rather than overall recall. This is a well-accepted approach in Alzheimer’s disease research, used to assess comprehension of individual elements in competency assessments [Reference Jefferson, Lambe and Moser26], and it can function as a “cognitive prosthetic” that helps potential research participants, especially those with mild cognitive impairments, stay focused during the IC session [Reference Rubright, Sankar and Casarett27]. When participants select a response, the tool provides feedback on whether the response was correct and, if not, what the correct response was and why. Questions were designed to be engaging and to reinforce learning. Asking questions about specific study concepts and providing corrective feedback is an effective way to determine whether participants comprehend the information [Reference Festinger, Dugosh, Croft, Arabia and Marlowe28]. Answers can help identify misunderstandings and lead to conversations that address knowledge gaps, which is especially important when working with individuals with potential memory and cognitive deficits but is also a best practice for IC generally.
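The sketch below illustrates, in TypeScript, how a per-module comprehension question with corrective feedback might be represented; the question text, options, and field names are invented examples rather than the wording used in our eIC.

```typescript
// Illustrative sketch of a per-module comprehension question with
// corrective feedback; the question text and options are invented examples.

interface ComprehensionQuestion {
  prompt: string;
  options: string[];
  correctIndex: number;
  explanation: string; // shown when the selected answer is wrong
}

const question: ComprehensionQuestion = {
  prompt: "How often will we ask you to return for follow-up visits?",
  options: ["Every month", "Every one or two years", "Never"],
  correctIndex: 1,
  explanation: "Follow-up visits happen every one or two years, for as long as you can take part.",
};

function giveFeedback(q: ComprehensionQuestion, selected: number): string {
  return selected === q.correctIndex
    ? "That's right."
    : `Not quite. The correct answer is "${q.options[q.correctIndex]}". ${q.explanation}`;
}

console.log(giveFeedback(question, 0));
```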

eIC Regulatory Oversight and Logistical Challenges

Federal regulations do not directly describe the specific information an IRB must obtain to review an eIC process. One reason for this lack of specificity is that IRBs must make their determinations independent of format when they review the IC process and any information provided to participants. A key IRB responsibility is to ensure participants receive any information, including the basic elements of consent and any relevant additional details, in “understandable language.” The 2018 Common Rule obligates IRBs to ensure the IC process facilitates comprehension and includes information that a reasonable person would want to have to make an informed decision about whether to participate. Additionally, federal regulations require IRBs to maintain documentation of their reviews, including any study changes or IC revisions.

During the development of the eIC tool, ADRC researchers, in conjunction with their local IRBs, worked through regulatory and operational issues that the review of an eIC process presented. Not all IRBs have the bandwidth for such close collaboration. Consequently, Table 2 provides practical recommendations for IRBs and research teams considering eIC with interactive features, focusing on four areas:

  1. Presenting the eIC experience to the IRB and tracking changes.

  2. Retaining records.

  3. Documenting the IC.

  4. Providing a copy of the IC.

Table 2. Key considerations and recommended practices for electronic informed consent (eIC) approaches

IRBs and study teams should work together to develop a plan that addresses each of these areas. For example, research teams should discuss with the IRB what constitutes the “copy” of the IC when it includes interactive elements and how the copy should be provided (e.g., on an electronic storage device, via email, or another method). We recommend that research teams provide participants with a summary that highlights key information about the research study and that participants can either print or save electronically for easy reference. This approach mirrors the concise and focused summary requirement in the Common Rule (45 CFR 46.116(a)(5)) and is similar to the short form approach often used to document IC when the contents of the IC are presented orally to a participant. When using the short form, the study team must create and provide a written summary of what is to be said to the prospective participant and include an acknowledgment that all applicable elements of IC were presented. For an interactive eIC experience, this “written summary” of key information, along with a link to the eIC process, could be given to participants in paper format for the participant record.

One consideration not driven by federal regulations is the common practice of IC form stamping. IRBs often add approval stamps to IC forms to help study teams track approved document versions. Stamps frequently identify the IRB, the IC version, the approval date, and the study expiration date. Some platforms (e.g., REDCap) can generate a static capture of IC document screens and convert them to PDF files, and many IRB systems can stamp PDFs. IRBs and study teams need to be flexible and develop procedures, such as this static-capture approach, for identifying IRB-approved documents. The interactive eIC our team developed, for example, cannot be stamped in the same way as traditional IC forms, but text within our online forms could display the version and IRB approval date, essentially recreating the information in an IRB stamp. Additionally, study teams can describe in standard operating procedures their process for a) tracking changes and versions of interactive eIC forms and b) presenting the correct versions to participants. If IRBs do not require written representations of IC information to be stamped, this challenge is avoided entirely.
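For teams that embed approval details directly in the eIC content, a minimal sketch of what that might look like appears below. It is illustrative only: the field names, example values, and the idea of rendering the details as a screen footer are our assumptions rather than a prescribed method.

```typescript
// Sketch of embedding the information an IRB stamp would normally carry
// directly in the eIC content; field names and values are hypothetical.

interface ApprovalMetadata {
  irbName: string;
  documentVersion: string;
  approvalDate: string;    // ISO date of IRB approval
  expirationDate?: string; // omit if the IRB does not set one
}

const approval: ApprovalMetadata = {
  irbName: "Example University IRB",
  documentVersion: "v3.2",
  approvalDate: "2021-06-15",
};

// Rendered in the footer of every eIC screen, mirroring a paper stamp.
function approvalFooter(a: ApprovalMetadata): string {
  const expiry = a.expirationDate ? ` | Expires ${a.expirationDate}` : "";
  return `${a.irbName} | Version ${a.documentVersion} | Approved ${a.approvalDate}${expiry}`;
}

console.log(approvalFooter(approval));
```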

Core Versus Optional Information

The revised Common Rule placed IC front and center, reemphasizing the need for study teams and IRBs to carefully consider IC content and presentation. An interactive eIC tool can better meet some of the Common Rule’s expectations for IC than a paper-based form or a static eIC because the interactive features give participants more agency in determining the level of detail and the order in which to consume the information. An interactive eIC is responsive to two key requirements in the Common Rule:

  1. Prospective participants must be given information that a reasonable person would want to evaluate to make an informed decision about participating and an opportunity to discuss that information.

  2. The information given should be in a language understandable to participants.

Although being able to tailor the level of detail and complexity of information is not unique to an eIC approach, eIC facilitates it through hyperlinks and pop-ups.

The Common Rule uses the reasonable person standard to guide IC content but does not define “reasonable person.” Operationalizing this standard is challenging because individuals differ in what information they want, and consider important, about research [Reference Odwazny and Berkman29]. For example, one study found that most potential participants (91%) wanted information about the return of research results [Reference Kirkby, Calvert, Draper, Keeley and Wilson30], yet federal regulations do not prioritize disclosure of clinically relevant results either as a concept that should be included in the concise and focused summary or even as a basic element of IC. Additionally, the Common Rule requires that, for a reasonable person to make a decision, an IC must be presented in language understandable to them, but ensuring that all IC language is understandable to every possible participant is a difficult standard to meet [Reference Ethicist31]. As the FDA notes, “understandable” means the information is presented to potential participants “in a language and at a level the participants can comprehend (including an explanation of scientific and medical terms)” [32].

The ADRC eIC experience responded to the reasonable person standard and to the requirement to provide information in understandable language in two key ways. Specifically, the ADRC eIC incorporated:

  1. A tiered-information approach that ensured individuals received core information about the research and allowed them to access supplemental information as desired.

  2. A modular approach, iconography, simplified language (middle-school reading level), definitions, and a glossary of terms so that individuals of varied educational levels could learn about the study.

Further, grouping information into logical modules helped meet the Common Rule requirement that IC be organized and presented in a way that facilitates comprehension.

In collaboration with the local IRBs, our team determined which information could be tiered: core information presented to everyone, and supplementary information accessible on demand via a pop-up window when the reader clicked a hyperlink (i.e., optional information). Core information included the required basic and applicable additional elements of IC per the Common Rule. A simple example of the tiered approach appears in the IC section that describes study visits: “We will ask you to return for follow-up visits every one or two years for as long as you can.” This statement meets the basic element of IC to inform individuals of the expected duration of their study participation. Clicking on the phrase “as long as you can” opened a pop-up window with details about long-term participation (i.e., the possibility that if participants can no longer attend visits in person, they may be able to participate via telephone, and their information would be collected from a surrogate if they became too ill to continue via telephone). The IRB required the estimate of study duration (“as long as you can”) to appear within the body of the IC materials as a required basic element of IC, while the detail about how participation might shift from in-person to remote visits was viewed as of interest to some individuals but not essential to decision-making. For the tiered approach, each module starts with core information and includes links to pop-up screens for optional information (Figure 2); this allows individuals to tailor the IC process to the level of information they need and how they prefer to access it.

Fig. 2. Electronic informed consent tiered information feature
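A minimal sketch of how this tiered structure might be represented is shown below; the field names are illustrative assumptions, with the study-visit wording taken from the example above.

```typescript
// Sketch of the tiered (core vs. optional) information structure: core text
// is always shown, and optional detail opens in a pop-up when its linked
// phrase is selected. Field names are illustrative.

interface OptionalDetail {
  linkedPhrase: string; // phrase in the core text that opens the pop-up
  popupText: string;
}

interface ConsentSection {
  coreText: string; // required element of IC, shown to everyone
  optional: OptionalDetail[];
}

const studyVisits: ConsentSection = {
  coreText:
    "We will ask you to return for follow-up visits every one or two years for as long as you can.",
  optional: [
    {
      linkedPhrase: "as long as you can",
      popupText:
        "If you can no longer attend in person, you may be able to take part by telephone, " +
        "and a surrogate may provide information if you become too ill to continue by telephone.",
    },
  ],
};

// A reader who never clicks a link still receives every required element.
console.log(studyVisits.coreText);
```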

Other Considerations

Several additional factors can affect the development of eIC, such as:

  • State laws: Some states restrict which platforms may be used to capture electronic signatures, which can limit the platforms or approaches that can be considered for eIC, or dictate specific requirements for IC language and font size (e.g., California’s Protection of Human Subjects in Medical Experimentation Act requires that the “Experimental Participants Bill of Rights” be provided to all participants in medical research and be in a specific format).

  • Single IRB: As of January 20, 2020, most federally supported, multi-site research must be overseen by a single IRB (45 CFR 46.114(b)). If a single IRB will be used for the study, any local context requirements related to eIC will need to be communicated to the reviewing IRB [Reference Cobb, Witte and Cervone33]. The process for including site-specific language within an eIC should be considered and discussed with the single IRB.

  • Posting Clinical Trial Consent Forms: The revised Common Rule includes a new provision requiring that a copy of the final version of the consent form be posted on a publicly available federal website for each clinical trial conducted or supported by a Common Rule department or agency (45 CFR 46.116(h)).

  • Non-English Speakers: The Common Rule requirement that research teams provide information to participants in language understandable to them speaks not only to readability but also to the language(s) a participant comprehends. Translation and interpretation needs should be factored into the development of any IC process, whether paper based or electronic.

  • Assent: For studies involving children that use an eIC process, the research team will need to consider developing age-appropriate materials if an assent process is required. IC materials aimed at parents or guardians may be suitable only for older children.

Research teams developing eIC processes should consult with their IRB early on to ensure the team addresses applicable regulations, understands IRB review requirements, and factors in tracking, versioning, and maintaining materials involved in an eIC process. To facilitate collaboration, we developed questions to help research teams and IRBs anticipate and address regulatory and other issues that innovative eIC processes can present (Table 3).

Table 3. Key operational and regulatory questions to ask when developing an electronic informed consent (eIC) approach

Conclusion

Interviews with study coordinators from the two ADRC teams about their paper-based IC process identified key challenges: difficulties maintaining participant attention, explaining complex procedures, and modifying the IC process to support the needs of individual participants [Reference Suver, Hamann and Chin19]. Often, such challenges are not specifically considered as part of IRB review. IRBs traditionally have focused more on forms than on processes, but the process can be as important as, or more important than, the form in promoting robust IC [Reference Klitzman34]. Moreover, IRBs must ensure that eIC approaches do not simply reproduce the content of an IC form in an electronic format but instead improve the IC process [Reference Grady, Cummings, Rowbotham, McConnell, Ashley and Kang9]. An interactive eIC experience inherently shifts the focus from reading a form to the learning process and should prompt research teams to describe in more detail, and IRBs to consider in more depth, how participants engage in the IC process and access information, how information is organized and presented, and how understanding is gauged. Thinking more deeply about the IC process could lead to wider adoption of interactive eIC experiences that enhance participant engagement and understanding and reduce potential selection bias [Reference Grady, Cummings, Rowbotham, McConnell, Ashley and Kang9].

Use of eIC, including platforms with interactive features, was already increasing and accelerated during the COVID-19 pandemic [Reference Lawrence, Dunkel and McEver11,Reference Rothwell, Brassil and Barton-Baxter15], highlighting how well IRBs and study teams can work together to facilitate the transition from a paper-based, in-person IC to a remote eIC process. We expect eIC to become a mainstay of research, providing an opportunity to better meet the expectations for IC outlined in the revised Common Rule and substantively improving the participant experience by leveraging interactive features that allow the IC process to be tailored to individual participants’ needs.

Acknowledgments

The authors thank Ellen Kuwana, MS, and Tori Allen, MSc, for expert editing assistance.

This work was supported by the National Institutes of Health (NIH) (P50AG033514, P50AG025688, P30AG062715, P30AG066511) and the Wisconsin Alumni Research Foundation (WARF). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH or WARF. The funders had no role in the decision to publish or the preparation of the manuscript.

Disclosures

None of the authors has conflicts of interest related to this work.

References

The Belmont Report. Ethical principles and guidelines for the protection of human subjects of research. Office for Human Research Protections. https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/index.html
Ryan, RE, Prictor, MJ, McLaughlin, KJ, Hill, SJ. Audio-visual presentation of information for informed consent for participation in clinical trials. The Cochrane Database of Systematic Reviews 2008; 1: CD003717.
Flory, J, Emanuel, E. Interventions to improve research participants’ understanding in informed consent for research: a systematic review. JAMA 2004; 292(13): 1593–1601.
Falagas, ME, Korbila, IP, Giannopoulou, KP, Kondilis, BK, Peppas, G. Informed consent: how much and what do patients understand? American Journal of Surgery 2009; 198(3): 420–435.
Lorell, BH, Mikita, JS, Anderson, A, Hallinan, ZP, Forrest, A. Informed consent in clinical research: consensus recommendations for reform identified by an expert interview panel. Clinical Trials 2015; 12(6): 692–695.
Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators. https://www.federalregister.gov/documents/2011/07/26/2011-18792/human-subjects-research-protections-enhancing-protections-for-research-subjects-and-reducing-burden
Grady, C, Cummings, SR, Rowbotham, MC, McConnell, MV, Ashley, EA, Kang, G. Informed consent. New England Journal of Medicine 2017; 376(9): 856–867.
Mytrus rolls out informed consent iPad app. CenterWatch. https://www.centerwatch.com/articles/21384
Lawrence, CE, Dunkel, L, McEver, M, et al. A REDCap-based model for electronic consent (eConsent): moving toward a more personalized consent. Journal of Clinical and Translational Science 2020; 4(4): 345–353.
Mobile App Quick Overview for Research. U.S. Food and Drug Administration. https://www.fda.gov/media/138258/download
Doerr, M, Suver, C, Wilbanks, J. Developing a transparent, participant-navigated electronic informed consent for mobile-mediated research. SSRN Electronic Journal 2016.
Mobile Fact Sheet. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/fact-sheet/mobile/
Rothwell, E, Brassil, D, Barton-Baxter, M, et al. Informed consent: old and new challenges in the context of the COVID-19 pandemic. Journal of Clinical and Translational Science 2021; 5(1): e105.
Taub, HA, Kline, GE, Baker, MT. The elderly and informed consent: effects of vocabulary level and corrected feedback. Experimental Aging Research 1981; 7(2): 137–146.
Taub, HA, Baker, MT. The effect of repeated testing upon comprehension of informed consent materials by elderly volunteers. Experimental Aging Research 1983; 9(3): 135–138.
Festinger, DS, Dugosh, KL, Marlowe, DB, Clements, N. Achieving new levels of recall in consent to research by combining remedial and motivational techniques. Journal of Medical Ethics 2014; 40(4): 264–268.
Suver, CM, Hamann, JK, Chin, EM, et al. Informed consent in two Alzheimer’s disease research centers: insights from research coordinators. AJOB Empirical Bioethics 2020; 11(2): 114–124.
Use of Electronic Informed Consent in Clinical Investigations – Questions and Answers. U.S. Food and Drug Administration. https://www.fda.gov/media/116850/download
21 CFR 11 – Code of Federal Regulations Title 21. U.S. Food and Drug Administration. https://www.ecfr.gov/current/title-21/chapter-I/subchapter-A/part-11?toc=1
Gilbert, T. Family Educational Rights and Privacy Act (FERPA). Journal of Empirical Research on Human Research Ethics 2007; 2: 101.
Doerr, M, Moore, S, Barone, V, et al. Assessment of the All of Us Research Program’s informed consent process. AJOB Empirical Bioethics 2021; 12(2): 72–83.
Research-based web design & usability guidelines. U.S. Department of Health and Human Services; U.S. General Services Administration. https://www.usability.gov/sites/default/files/documents/guidelines_book.pdf
Jefferson, AL, Lambe, S, Moser, DJ, et al. Decisional capacity for research participation in individuals with mild cognitive impairment. Journal of the American Geriatrics Society 2008; 56: 1236–1243.
Rubright, J, Sankar, P, Casarett, DJ, et al. A memory and organizational aid improves Alzheimer disease research consent capacity: results of a randomized, controlled trial. American Journal of Geriatric Psychiatry 2010; 18: 1124–1132.
Festinger, DS, Dugosh, KL, Croft, JR, Arabia, PL, Marlowe, DB. Corrected feedback: a procedure to enhance recall of informed consent to research among substance abusing offenders. Ethics & Behavior 2010; 20(5): 387–399.
Odwazny, LM, Berkman, BE. The “reasonable person” standard for research informed consent. The American Journal of Bioethics 2017; 17(7): 49–51.
Kirkby, HM, Calvert, M, Draper, H, Keeley, T, Wilson, S. What potential research participants want to know about research: a systematic review. BMJ Open 2012; 2(3): e000509.
Ethicist, P. Practical ethicist: what is “understandable” language? Journal of Empirical Research on Human Research Ethics 2018; 13(4): 368–370.
Office of the Commissioner. Guide to informed consent. U.S. Food and Drug Administration. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/informed-consent
Cobb, N, Witte, E, Cervone, M, et al. The SMART IRB platform: a national resource for IRB review for multisite studies. Journal of Clinical and Translational Science 2019; 3(4): 129–139.
Klitzman, RL. How IRBs view and make decisions about social risks. Journal of Empirical Research on Human Research Ethics 2013; 8(3): 58–65.