Introduction
Centers of Biomedical Research Excellence (COBRE) were established in 2000 as one mechanism for building research capacity in states with historically low levels of National Institutes of Health (NIH) research funding. COBREs support basic, clinical, and translational research as well as faculty development. They are “thematic, multidisciplinary centers that augment and strengthen institutional biomedical research capacity” [1] and progress through three sequential five-year phases. Each Center seeks to expand the cadre of biomedical research scientists by funding research projects conducted by junior-level investigators under the guidance of more seasoned research mentors. Because a primary focus of these Centers is developing junior-level investigators into competent and successful research scientists, evaluating the investigators’ skills, their mentoring experiences, and the usefulness of COBRE services is paramount to the Center’s transition to a self-sustaining, collaborative, multidisciplinary research environment.
Unfortunately, very little formal evaluation of COBREs appears to have been published, and the evaluations that do exist are dated. For example, a report of the process evaluation conducted with the initial 18 COBREs awarded in 2000 was published in 2008 [2]. This report highlighted the successes of the COBRE program overall and provided suggestions for improvement and for entities seeking to establish a Center [2]. Since this report, no other formal evaluation of the full COBRE program has been published by the National Institute of General Medical Sciences (NIGMS). When evaluation of COBREs does occur, it is typically conducted on an individual-COBRE basis and tied to metrics such as the gender and race of junior investigators and the number of junior investigator publications and grant awards [3–6].
Although these metrics are important in measuring whether a COBRE succeeds in expanding the pool of biomedical researchers and contributing to their professional development, they do little to inform COBRE leadership about the training, mentoring activities, and services specific to that COBRE. Recognizing this shortcoming, the principal investigators of the CardioPulmonary Vascular Biology (CPVB) COBRE of the Ocean State Research Institute at the Providence VA Medical Center, Providence, Rhode Island, sought the services of an external evaluator to develop an evaluation plan for their COBRE. A two-phased evaluation plan was developed. The formative portion focused on the processes of the COBRE, such as the quality of the overall program; satisfaction with training, mentoring, and services offered; the mechanisms for communication; and the effectiveness of the collaboration between junior investigators and their mentors. The summative portion centered on the effectiveness of the COBRE in attaining its specified goals. Herein, we present the formative portion of the overall evaluation plan, focusing on the mentoring and training experience.
Methods
Participants
Junior investigators
All eight of the current junior investigators completed the questionnaire. Six were PhD-trained assistant professors of research, five of whom had postdoctoral experience. Of the two remaining junior investigators, one was an MD/PhD and the other an MD; both were assistant professors in clinical departments who had research experience during their clinical training. The eight junior investigators were between 7 and 16 years post-terminal degree; five were women; four were white, non-Hispanic, one was Latinx, and two were Asian; six had been investigators with the COBRE since 2018 and two since 2019.
Mentors
Six of a possible thirteen mentors (46.2%) completed the online evaluation between November 19, 2019, and December 3, 2019; these mentors were primarily women (67%; n = 4). Each responding mentor represented a different mentor–junior investigator dyad; four dyads were same-gender (two of men and two of women) and two were mixed-gender. Mentors of the CPVB COBRE are recognized as established scholars within their fields of study, have productive and funded laboratories, and have extensive mentoring experience within their universities and the societies with which they are affiliated. Half of the mentors had completed the National Research Mentoring Network training sponsored by a Clinical and Translational Research Award from the NIGMS, and they had mentored an average of almost three junior faculty over the past five years. The Executive Committee of the CPVB COBRE was responsible for pairing junior investigators with their mentors. These pairings were based primarily on shared areas of research but also took into account the mentor’s mentoring style and ability to provide advice and support to the junior investigator.
Instruments
An integral part of this portion of the evaluation plan was the creation of a questionnaire for junior investigators that addressed four domains: 1) the relationship with their mentor, 2) research self-efficacy, 3) the value of the administrative and specialty cores, and 4) satisfaction with the events and operations of the COBRE. The two co-principal investigators, the program manager, and the evaluator met to develop the 34 items comprising the junior investigator instrument. Most items asked participants to respond using five-point, Likert-type scales, with higher scores reflecting more agreement, better quality, more frequency, or greater satisfaction. Sample items included My mentor provided useful critiques (strengths and weaknesses) of my project; I am able to build scientific collaborations; I am able to articulate practical applications for my research (e.g., “elevator speech” to potential collaborators, donors, or venture capitalists); The Admin Core responded to my requests in a timely manner; and The system for requesting services from the CORE LAB is efficient.
A 20-item questionnaire was developed for the mentors in the same manner, using the same five-point, Likert-type response scales. This tool inquired about the mentors’ relationship with their mentee, the quality of their mentoring, their mentee’s progress, and satisfaction with the COBRE. Sample items included I provided my mentee with scholarly opportunities (e.g., publications, grant writing, grant or manuscript reviewing); How effective is your mentee as a research mentor for others?; and Resources for the CPVB COBRE (e.g., funding, staffing) are adequate.
Results
Junior investigator results will be presented first. In instances where mentors completed similar items, these results will be presented in parallel.
Participants were asked to complete an item about how often and in what way they met with their mentors each month. Most junior investigators indicated they had two mentors, with one reporting three and one reporting a single mentor. Most often, these participants met with their primary mentor in person, for a little less than five hours per month (range: 1–10 hours; $\overline x$ = 4.75), via email for about three hours per month (range: 0.5–10 hours; $\overline x$ = 2.93), and on the phone for about an hour each month (range: 0.05–2 hours; $\overline x$ = 1.01). Contact with secondary mentors occurred most often via email, for an average of 1.5 hours per month. Interestingly, mentors reported meeting with their mentee (junior investigator) in person for an average of 3 hours per month (range: 0.5–8 hours; $\overline x$ = 3.08), via email for a little over one hour per month (range: 0–4 hours; $\overline x$ = 1.28), and over the phone for less than a half hour per month (range: 0–1 hour; $\overline x$ = 0.40).
Next, participants were asked to respond to three items concerning the extent to which their primary mentor provided certain activities. Mentors were asked the same items, but about the extent to which they themselves conducted the activity of interest. For example, junior investigators responded to the stem My mentor provided useful critiques (strengths and weaknesses) of my project, while mentors self-reported on the same item, so their statement read I provided my mentee with useful strengths and weaknesses of his/her project. Each item and its average response are presented in Table 1, with mentor items italicized.
One additional item answered by the mentors was My mentee is able to conduct research independently. Mentors were quite positive about their mentees’ abilities, with an average response of 4.33 (SD = 0.75).
Two items asked the junior investigators to rate the quality of the mentoring they had received and the mentors to self-evaluate their mentoring. On average, the junior investigators felt that their mentoring was Good ($\overline x$ = 4.29), with a few reporting it as Very Good and one reporting it as Poor. Mentors rated their mentoring quite similarly, with an overall average rating between Good and Very Good ($\overline x$ = 4.33); none of the mentors rated the quality of their mentoring as less than Good. Another item asked participants to indicate the extent to which they felt their primary mentor/their mentee was meeting their expectations. Junior investigators felt their expectations were Usually or Completely being met over 96% of the time ($\overline x$ = 4.00), and mentors felt their mentees were meeting their expectations Usually or Completely all of the time ($\overline x$ = 4.33).
The next set of items junior investigators were asked to complete addressed aspects of their research training that were a focus of topics delivered by the CPVB COBRE. Here, participants were asked to indicate how prepared they felt to conduct each activity independently. Participants were also provided the options Unable to Evaluate, if they had not participated in the training for that item, and Like to Learn More, if it was a topic in which they wanted more training. Results for these items are presented in Table 2.
Notably, none of these participants indicated that any of the items were ones in which they wanted to learn more. In four instances, a participant indicated that he/she was unable to evaluate the item; however, it was not the same person for each item. The items participants felt unable to evaluate were I am able to expand my research repertoire to areas outside my primary area of interest; I am making adequate progress as an academician (e.g., leadership with professional organizations and/or within my department); I am able to articulate practical applications for my research (e.g., “elevator speech” to potential collaborators, donors, or venture capitalists); and I am able to develop collaborations with scholars and professionals from other disciplines.
Mentors were asked to rate their mentee’s progress on various tasks associated with conducting research. The items and descriptive statistics are presented in Table 3.
Mentors were also asked to indicate how effective they thought their mentee was as a research mentor for others; on average, mentors indicated their mentees were Effective (range: 3.00–5.00; $\overline x$ = 4.00, SD = 0.58).
Finally, participants were asked about their experiences with the Administrative Core of the COBRE. The Administrative Core manages the fiscal and scientific operations of the Center, including its scientific direction and career development opportunities. With these interactions in mind, junior investigators were asked how valuable they found some of the services and training opportunities offered by the Administrative Core. All of the junior investigators who utilized the Mock Study Sections found them extremely valuable (n = 5). Similarly, of the seven who participated, four (57.1%) indicated they found the visiting professors, pilot project study sections, Administrative Core services, and Cell Isolation and Organ Function (Lab) Core services to be extremely valuable. None of the participants indicated any of the services or training opportunities were Not at all Valuable.
Junior investigators were then asked to indicate how strongly they agreed or disagreed with a series of statements about the Administrative Core and Lab Core. For the Lab Core, participants were also given the option of indicating that they had not used the Lab Core for that service; however, all of the junior investigators had used the services of the Lab Core and were able to respond to these items using the scale provided. Table 4 presents the average responses for each of these items.
Mentors were asked several items about the CPVB COBRE specifically. These items and responses are presented in Table 5.
One additional item asked the mentors to rate the overall quality of the CPVB COBRE using a seven-point Likert-type scale with responses ranging from Very Poor (1) to Exceptional (7). On average, these mentors felt the CPVB COBRE was well above average (range: 5.00–7.00; $\overline x$ = 6.17, SD = 0.69).
Conclusions
As noted previously, very little formal evaluation of the processes and impacts of the COBREs has been shared with the research community. This project begins to address this shortcoming by providing the results of a formative evaluation of the CPVB COBRE.
These junior investigators were extremely positive about the activities provided by their mentors. The scientific, academic, and professional development opportunities afforded to these investigators by their mentors are exemplary, and their collaborations are viewed as integral to the junior investigators’ success. Mentors were similarly positive about their mentees; overwhelmingly, they felt the junior investigators were making better than adequate progress toward becoming independent and successful researchers. Of note, however, is the discrepancy in perspective regarding the extent to which mentors provided junior investigators with scholarly opportunities, such as publications and grant writing: junior investigators felt mentors could do more, while mentors felt they were providing those opportunities well. Because of the importance attached to this metric, not only for the success of the COBRE but for the scholars themselves, this may be an avenue the CPVB COBRE needs to explore in conversations with individual mentors. For example, mentors can recommend mentees as reviewers for journal articles, for speaking engagements, or for grant reviewing assignments.
Of great interest to the leadership of this COBRE was the training preparedness of the junior investigators. Some opportunities exist for the COBRE to meet the training needs of these junior investigators by providing training specifically on managing a laboratory or mentoring others. Interestingly, the item with the lowest average junior investigator response concerned whether he/she was making adequate progress as an academician, whereas mentors felt their mentees could most improve by generating more publications of their research. This may be a circumstance in which the COBRE can offer guidance to the mentees as to what adequate progress as an academician entails. Again, the use of standard benchmarks for the given institution, the personal experiences of mentors, and input from those outside the CPVB may be avenues of exploration for the COBRE.
When junior investigators were asked an open-ended item about improving the Administrative Core, the only responses centered on ways to advance their careers as academics. For example, COBRE leadership (including members of its External Advisory Committee) could take an active role in linking the investigators with opportunities to speak and to serve on committees within an appropriate professional society. Similarly, one participant stated, “the Administrative Core may recognize additional publications or a higher impact journal publication is required, but the Core is less active at helping to create the reviewer and editorial feedback/contact that may lead to the editorial support at higher impact journals.” To this end, the CPVB COBRE may be able to offer publication review as a service in addition to its mock study sections and other activities.
Finally, it is worth noting that these junior investigators felt their abilities to collaborate with scholars and professionals from other disciplines were exceptionally strong. This bodes well for the sustainability of the CPVB COBRE as a collaborative, multidisciplinary research environment.
The results of this evaluation are quite limited in that they represent the opinions of only eight junior investigators and six mentors, but they do serve as an impetus for discussion, as the leaders of this COBRE and others may use them to inform future service delivery and to identify areas in need of improvement. Notably, as Bonilha and colleagues reported [7], strong mentoring programs significantly improve faculty satisfaction and the career development metrics that are directly associated with the functions and goals of COBREs. The formalization of evaluation presented here can also serve as a model for other Centers of Biomedical Research Excellence, thereby beginning to bridge gaps in program evaluation and improvement.
Acknowledgments
This research was supported by the National Institute of General Medical Sciences of the National Institutes of Health under award number P20GM103652 (Rounds, Harrington; Multi-PIs). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Disclosures
The authors have no conflicts of interest to declare.