
Graduate medical education scholarly activities initiatives: a systematic review and meta-analysis

Abstract

Background

According to the Accreditation Council for Graduate Medical Education, residents “should participate in scholarly activity.” The development of a sustainable, successful resident scholarship program is a difficult task faced by graduate medical education leadership.

Methods

A medical librarian conducted a systematic literature search for English language articles published on scholarly activities initiatives in Graduate Medical Education (GME) between January 1, 2003 and March 31, 2017. Inclusion criteria were implementation of a GME research curriculum or initiative designed to enhance intern, resident, or fellow scholarly activities, with results presented using a control or comparison group. We defined major outcomes as increases in publications or presentations. Random effects meta-analysis was used to compare publication rates before and after implementation of a curriculum or initiative.

Results

We identified 32 relevant articles. Twenty-nine (91%) reported on resident publications, with 35% (10/29) reporting statistically significant increases. Fifteen articles (47%) reported on regional, national, or international presentations, with only 13% (2/15) reporting a statistically significant increase in productivity. Nineteen studies were eligible for inclusion in the meta-analysis; for these studies, the post-initiative publication rate was estimated to be 2.6 times the pre-intervention rate (95% CI: 1.6 to 4.3; p < 0.001).

Conclusions

Our systematic review identified 32 articles describing curricula and initiatives used by GME programs to increase scholarly activity. The three most frequently reported initiatives were mentors (88%), curriculum (59%), and protected time (59%). Although no specific strategy was identified as paramount to improved productivity, meta-analysis revealed that the publication rate was significantly higher following the implementation of an initiative. Thus, we conclude that a culture emphasizing resident scholarship is the most important step. We call for well-designed research studies with control or comparison groups and a power analysis, focused on identifying best practices for future scholarly activities curricula and initiatives.


Background

The Accreditation Council for Graduate Medical Education (ACGME) mandates that residents “should participate in scholarly activity” and that “[t]he sponsoring institution and program should allocate adequate educational resources to facilitate resident involvement in scholarly activities” [1]. Such broadly defined requirements leave individual residency programs to interpret and execute scholarly activities in varying ways, which can lead to wide diversity in curricula, programs, outcomes, and experiences across residency programs. As such, it is important to identify the high-yield practices of successful programs so that they may be tailored to other residency programs.

It has been shown that increased research exposure and experience leads to increased fellowship acceptance and opportunities [2, 3]. Providing residents with the tools to succeed in their scholarly activities promotes the long-term benefit of producing well-rounded clinicians. Even residents who choose not to pursue academic careers will benefit from an improved ability to critically assess medical literature [4].

Given the stated importance of resident scholarly activities, medical educators are faced with the difficult task of implementing curricula and initiatives that support residents through their scholarly experience. The availability and promotion of scholarship differ vastly between residency programs, even within the same subspecialty [5,6,7]. Differences in allotted time, faculty involvement, and relevant curricula can make for highly varied experiences and outcomes for residents. Therefore, it is important to describe the efforts of successful programs.

Our initial review of the literature identified one systematic review, published in 2003, that focused only on resident research curricula [8]. Hebert and colleagues identified 41 articles and summarized instructional methods, goals, and objectives, as well as obstacles encountered in implementing resident research curricula [8]. They concluded that the lack of detailed developmental information and meaningful evaluations hinders educators interested in adopting a new research curriculum [8]. We set out to conduct a systematic review of the literature to extend these results beyond 2003 and to include all initiatives designed to increase intern, resident, or fellow scholarly activity.

Methods

Literature search

A medical librarian (ILV), who has participated in multiple systematic reviews, conducted a comprehensive literature search for English language articles published on research curricula and scholarly activities initiatives in Graduate Medical Education (GME) between January 1, 2003 and March 31, 2017 in PubMed (National Library of Medicine), EMBASE (Elsevier), and Scopus (Elsevier) databases. We chose relevant controlled vocabulary and keywords to locate GME articles focused on scholarly activities and research curricula (Additional file 1).

From the searches, 2980 unique articles were obtained. Two anesthesiology-trained authors (WW and PK) independently reviewed all titles and abstracts, when available. The percent agreement on initial independent selection of articles for further review was 98.9%. Inter-rater reliability, assessed with Cohen’s kappa, was κ = 0.897 (p < 0.001).
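
For readers unfamiliar with these statistics, the following is a minimal sketch in R of how percent agreement and Cohen’s kappa can be computed from a 2 × 2 table of the two reviewers’ include/exclude decisions. The counts below are illustrative only (chosen to roughly reproduce the reported values) and are not the study’s actual screening data.

```r
# Hypothetical 2 x 2 agreement table for two reviewers screening 2980 records
# (rows = reviewer 1, columns = reviewer 2); counts are illustrative only.
tab <- matrix(c(150,   15,
                 18, 2797),
              nrow = 2, byrow = TRUE,
              dimnames = list(rev1 = c("include", "exclude"),
                              rev2 = c("include", "exclude")))

n  <- sum(tab)
po <- sum(diag(tab)) / n                      # observed (percent) agreement
pe <- sum(rowSums(tab) * colSums(tab)) / n^2  # agreement expected by chance
kappa <- (po - pe) / (1 - pe)                 # Cohen's kappa

cat(sprintf("Percent agreement: %.1f%%, kappa: %.3f\n", 100 * po, kappa))
```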

When either reviewer selected an article, the full text was ordered for further review by trained research assistants (JM and AM). The research assistants also checked the reference sections of all included articles to identify other relevant studies. Using this strategy, 197 articles were obtained, each of which was independently reviewed by the research assistants (JM and AM) to determine eligibility for inclusion. In rare cases of disagreement, the full text was reviewed by a third team member (LAR), followed by a team discussion of the article, during which a final inclusion decision was made. After this review process, 32 articles were identified that met our protocol criteria (Fig. 1).

Fig. 1 Systematic Review of the Literature on Scholarship Initiatives in Graduate Medical Education (January 2003–March 2017): Study Selection Process

Inclusion and exclusion criteria

We developed a comprehensive systematic review protocol containing operational definitions and inclusion/exclusion criteria. The operational definitions were as follows:

1) Scholarly activities curriculum: instruction, teaching, didactics, seminars, or workshops developed and implemented with the goal of increasing scholarly or research outcomes/productivity.

2) Initiative: any activity, tactic, or intervention (e.g., role models, mentors, protected time, journal club, or project funding) implemented to improve or increase scholarly or research outcomes/productivity.

3) Major scholarly activities outcomes/productivity: regional, national, or international presentation(s) and/or publication(s).

4) Participants: interns, residents, or fellows in graduate medical education programs.

5) Strategies: procedures or processes that the authors described as being imperative, key, or a major contributor to their study’s success.

6) Barriers: obstacles or problems that the authors described as being a hindrance or impediment to their study’s success.

Articles meeting the following criteria were eligible for review: published between January 1, 2003 and March 31, 2017; English language; studied interns, residents, or fellows of any graduate medical education discipline; implemented a curriculum or activity designed to increase scholarly activities outcomes/productivity; and presented results using a control or comparison group. Exclusion criteria were: letters to the editor, commentaries, editorials, or newsletter articles; articles that did not include a description of implementation with outcomes data; and articles that did not measure presentations and/or publications.

Abstraction process

Two trained reviewers (JM and AM) individually evaluated all selected articles to ascertain each study’s purpose, quality, and results. Information pertinent to the systematic review was independently abstracted, organized, and added to a spreadsheet for further assessment. Monthly meetings were held with a separate author (LAR) to review, revise, and validate the extracted data. Over the course of these meetings, a finalized abstraction document was created. All abstraction disagreements were minor and were resolved through discussion between the reviewers.

Quality assessment

We used the Medical Education Research Study Quality Instrument (MERSQI), developed by Reed et al. [9], to assess article quality. It is an 18-point, 6-domain instrument designed specifically for medical education research. The 6 domains are study design, sampling, type of data, validity of assessment instruments’ scores, data analysis, and outcomes evaluated. Since its introduction in 2007, multiple studies have shown evidence of its validity and reliability [9,10,11]. As described in its original use [9], the total MERSQI score was calculated as the percentage of total achievable points and then rescaled to a standard denominator of 18 to allow comparison of MERSQI scores across studies. One item on the MERSQI rates “type of data”; the scoring choices are “subjective, assessment by study participant” = 1 and “objective measurement” = 3. If a study measured both subjective and objective data, it was given 3 points for objective data.
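
As an illustration of the rescaling described above, the short sketch below converts a raw MERSQI score to the standard 18-point denominator when some items are not applicable. The numbers are hypothetical and are not taken from the included studies.

```r
# Rescale a raw MERSQI score to the standard 18-point denominator.
# 'achieved' is the sum of item scores; 'achievable' is the maximum possible
# given the items applicable to that study. Values below are hypothetical.
adjust_mersqi <- function(achieved, achievable) {
  achieved / achievable * 18
}

adjust_mersqi(achieved = 10.5, achievable = 15)  # 10.5 / 15 * 18 = 12.6
```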

Types of data reported

The authors abstracted the following data from the selected articles: first author’s last name, year published, and study location; study design, sample, and participants; program/interventions; curricula (yes/no); protected time for scholarship (yes/no); mentors (yes/no); support (yes/no); funding (yes/no); journal club (yes/no); grant writing assistance/guidance (yes/no); mandatory requirement, such as participation, attendance, completion of a project, or involvement in activities (yes/no); major outcomes; and price/cost. The authors defined major outcomes as publications authored or co-authored by participants or presentations made by participants at regional, national, or international conferences. In addition, the authors compiled a list of strategies and barriers mentioned in the abstract or discussion sections of the included studies.

Qualitative analysis of barriers and strategies

Barriers and strategies mentioned in either the abstract or discussion sections of the included articles were abstracted and listed in phrase format. Reviewers (JM and AM) independently created their respective lists. Reviewers then met to discuss and come to consensus on final, comprehensive lists: one for barriers and a second for strategies.

Reviewers (JM and AM) then used an immersive iterative process of content analysis [12] to identify themes and create relevant category labels. Another author (LAR) and JM then used a second iterative process to finalize category and subcategory labels.

Statistical analysis

Descriptive statistics were used to report counts and percentages of initiatives. For each program, a publication rate, defined as the number of publications per participant per year, was estimated for the pre- and post-initiative periods. A publication rate ratio (PRR) was then calculated by dividing the post-initiative publication rate by the pre-initiative rate. A PRR greater than 1 indicates that the publication rate increased in the post-initiative period. Random effects meta-analysis [13] was then used to obtain a pooled estimate of the PRR. Meta-regression was also used to assess whether any specific initiatives were significantly associated with the publication rate.

Possible publication bias was assessed using the Egger test and funnel plots [14]. A two-tailed p-value < 0.05 was considered statistically significant. All statistical analyses were performed using R statistical software, version 3.0 [15, 16].
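
The analysis described above can be reproduced in outline with the metafor package cited here [16], which implements both the DerSimonian-Laird random effects model and Egger’s regression test. The sketch below uses hypothetical publication counts and participant-years, not the extracted study data.

```r
library(metafor)

# Hypothetical per-program data: publication counts and participant-years
# in the post- and pre-initiative periods (not the extracted study data).
dat <- data.frame(
  study     = paste("Program", 1:4),
  pubs_post = c(40, 12, 25,  9),
  py_post   = c(60, 30, 45, 50),
  pubs_pre  = c(15, 10,  8, 12),
  py_pre    = c(55, 28, 40, 48)
)

# Log publication rate ratio (post vs. pre) and its sampling variance;
# measure = "IRR" treats publications per participant-year as an incidence rate.
dat <- escalc(measure = "IRR",
              x1i = pubs_post, t1i = py_post,
              x2i = pubs_pre,  t2i = py_pre,
              data = dat)

# DerSimonian-Laird random effects pooling of the log rate ratios
res <- rma(yi, vi, data = dat, method = "DL")
predict(res, transf = exp)   # pooled PRR with 95% CI on the ratio scale

# Publication bias diagnostics: Egger's regression test and a funnel plot
regtest(res)
funnel(res)
```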

Results

Our systematic review of the literature on GME scholarly activities initiatives identified 32 articles published between 2006 and 2017 (Table 1) [3, 7, 17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46]. Subjects for these articles were collected from 1989 to 2017, with 23/32 (72%) including subjects from 2000 onward [17, 18, 21,22,23,24,25,26, 28,29,30, 32, 34,35,36,37, 39,40,41,42,43, 45, 46]. Of the 32 articles in this review, 28 (88%) were United States-based [3, 7, 17,18,19,20,21, 23,24,25, 27,28,29,30,31, 33,34,35,36,37,38,39, 41,42,43,44,45,46] and 2 (6%) were Canada-based [22, 40]. The remaining locations, Germany [32] and India [26], had one article (3%) each. The disciplines studied were internal medicine (7, 22%) [18, 25, 27, 28, 32, 41, 42], orthopedic surgery (6, 19%) [29, 30, 33, 34, 38, 46], general surgery (5, 16%) [21,22,23, 31, 36], pediatrics (3, 9%) [3, 7, 40], family medicine (3, 9%) [17, 24, 45], neurology (2, 6%) [26, 39], and obstetrics and gynecology (2, 6%) [19, 37], with otolaryngology [20], gastroenterology [35], anesthesiology [43], and pulmonary critical care [44] each represented in one article. Nineteen (19/32, 59%) studies reported sample sizes (number of participants) [3, 7, 17, 21, 22, 25, 27, 28, 30, 32,33,34,35,36, 38,39,40, 43, 46] ranging from 25 [17] to 527 [7], with 10/19 (53%) having sample sizes less than 100 [17, 21, 25, 27, 32, 34, 35, 38, 40, 46]. Article quality scores ranged from 9.6 to 13.2, with a mean of 11.

Table 1 Brief summary of articles included in a systematic review of scholarship initiatives in graduate medical education, 2003-March 2017

Initiatives

Nineteen (19/32, 59%) articles included a curriculum focused on research topics [3, 17, 18, 22, 23, 25, 27,28,29, 31, 32, 34,35,36, 38, 40, 42,43,44]. Eight articles provided 4–28 lectures (60 to 120 min each) offered over 4 weeks to 2 years [17, 22, 25, 27, 31, 35, 42, 44]. The entire didactic experience could encompass a 2-week block [18] or continue throughout residency [3].

Sixteen (84%) of the studies with a curriculum provided details about the didactic topics covered [17, 18, 22, 23, 25, 27, 29, 31, 32, 34, 35, 38, 40, 42,43,44]. Of these, statistics (11/16, 69%) [18, 22, 23, 25, 29, 31, 32, 34, 35, 38, 42] and research design (12/16, 75%) [17, 18, 23, 25, 27, 29, 31, 32, 34, 38, 40, 42] were the most frequently reported, followed by critical appraisal of the literature (6/16, 38%) [25, 29, 31, 32, 38, 42], Institutional Review Board (IRB) and ethics (7/16, 44%) [17, 18, 23, 27, 34, 38, 42], epidemiology (3/16, 19%) [18, 22, 27], searching the literature (3/16, 19%) [25, 35, 42], and formal writing (3/16, 19%) [32, 35, 42]. Less frequently reported topics included research pearls [42], outcomes research [36], critical thinking [31], funding options [43], research career advancement [42], how to get your article published [35], tips for completing research projects [42], and overcoming procrastination [42].

Under half of the studies with a curriculum (7/19, 37%) reported using a needs assessment prior to developing their research curriculum. Two conducted surveys [25], and one program held a faculty retreat [35] to construct their needs assessment. Another used both interviews and a committee [40]. The remaining 4 studies (21%) used a committee or team that discussed and developed the curriculum [17, 34, 38, 42]. Ten (10/19, 53%) studies evaluated the curriculum they implemented via surveys [17, 18, 22, 25, 28, 32, 33, 35, 44], quizzes/exams [22, 32], and/or interviews [17, 35, 38]. All but one of these articles reported results of the evaluation [38]. All surveys and interviews reporting participant satisfaction and/or confidence demonstrated support for the research curriculum [17, 22, 25, 28, 32, 35, 44], except one, which had mixed results [18]. The two studies testing knowledge [22, 32] compared the results of a control group with those of an intervention group that received the curriculum, finding statistically significant results in favor of the curriculum. In addition, one of these studies used a pre-post design as well as a comparison to a control group [32].

Most included articles (31/32, 97%) used multiple interventions with the goal of increasing scholarly productivity [3, 17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46]. The one article that did not use multiple interventions used an annual research day to stimulate an increase in research productivity [7]. The number of interventions ranged from 1 [7] to 8 [40, 42] (mean 4.0 ± 1.7; median 4).

The majority of studies provided residents with mentors (28/32, 88%) [3, 17,18,19, 22,23,24,25,26,27,28,29, 31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46]. Over half incorporated protected time (19/32, 59%) [18,19,20,21,22, 24,25,26,27,28,29,30, 33, 35, 38, 40,41,42,43]. However, protected time differed between residencies, ranging from a 2-week rotation [25] to a 1-year research elective [21, 33]. Fifty-six percent included a mandatory initiative (18/32), such as required attendance [18, 22], participation [3, 21, 23, 24, 26, 30, 32], or completion of a project [17, 19, 25, 27, 36, 40, 42, 44, 45]. Journal clubs were described in 41% (13/32) of studies [3, 18, 24, 25, 27, 31, 38,39,40,41,42,43,44], and 31% (10/32) provided assistance or guidance on grant writing and/or applications [18, 19, 29, 31, 34, 37, 38, 40, 42, 43]. Funding was available for participants in only 25% (8/32) of studies [20, 29, 31, 33, 38, 40, 42, 46].

While almost half of the studies (15/32, 47%) provided some information relevant to the cost of the program (Table 1) [7, 19, 20, 23, 29, 31, 32, 34,35,36,37,38, 40,41,42], these statements tended to be vague, failing to address the critical factors of feasibility and sustainability. The most detailed description came from Robbins et al. [38], who approximated their per-year costs to be $19,000 for an academic research coordinator, $16,000 for resident travel to professional meetings, reimbursement for 213 faculty hours, and funding for resident salaries while on the research rotation. Unfortunately, even these costs are outdated, as they came from expenses incurred during academic years 2007 to 2010.

Major outcomes

Our primary outcomes were publications and presentations. However, only 25% (8/32) of articles explicitly required participants to achieve a specific outcome such as submission of a scholarly manuscript [29, 34, 38, 39, 46] or a regional, national, or international presentation [31, 35, 41]. Despite this, the majority of articles (29/32, 91%) reported on resident publications, [3, 7, 19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36, 38,39,40,41,42,43,44,45,46] with 28/29 (97%) reporting on peer-reviewed publications [3, 7, 19,20,21,22, 24,25,26,27,28,29,30,31,32,33,34,35,36, 38,39,40,41,42,43,44,45,46]. Of those, 10 (36%) reported a statistically significant increase in their publication rate after implementation or changes made to a scholarship initiative [3, 7, 20, 22, 32, 33, 36, 41, 43, 46]. More than half (16/28; 57%) of the publications were reported as original research, [21, 22, 25,26,27,28, 30, 32,33,34,35, 39, 41, 43, 45, 46] 6/28 (21%) as case reports, [26, 31, 34, 39, 43, 45] and only 5/28 (18%) as book chapters [27, 32, 39, 43, 45]. Fifteen (15/32, 47%) articles reported on regional, national, or international presentations [17, 18, 22,23,24, 29, 31, 32, 34, 36, 37, 39, 40, 42, 45], with only 2/15 (13%) reporting a statistically significant increase in presentation rates [22, 36]. One article combined publication and presentation rates to obtain statistical significance [39]. Overall, 30 articles reported a positive increase in resident publications and/or presentations after implementation of a scholarly activity initiative [3, 7, 17,18,19,20, 22,23,24,25,26,27,28,29, 31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46]. However, 21/32 (66%) of the included articles did not report a statistically significant increase in either presentations or publications [17,18,19, 21, 23,24,25,26,27,28,29,30,31, 34, 35, 37, 38, 40, 42, 44, 45].

Meta-analysis

Nineteen of the 32 articles (59%) provided enough detail to calculate a publication rate ratio [3, 7, 21, 23,24,25, 28,29,30, 32,33,34, 36, 38, 40,41,42,43, 46]. Two studies reported the percentage of participants who published, but not the overall number of publications; for these studies, we assumed one publication per participant [7, 33]. The PRR for these studies ranged from 0.6 to 25, with eight studies having a PRR significantly greater than one, indicating that for these programs the post-intervention publication rates were significantly higher than in the pre-intervention period [21, 23, 24, 28, 40,41,42,43]. Overall, the publication rate was significantly higher following implementation of initiatives (p < 0.001; Fig. 2); we estimate that the post-initiative publication rate was 2.6 times (95% CI: 1.6 to 4.3 times) the pre-initiative rate, or a 160% increase.

Fig. 2 Forest plot for publication rate ratio in a systematic review of the literature on scholarship initiatives in graduate medical education (January 2003–March 2017)

Of the eight initiatives (mentors, curriculum, protected time, a mandatory component, journal club, grant writing guidance/assistance, funding, and support staff) identified in the included studies, mentoring, curriculum, and protected time were offered most frequently. For this reason, these three initiatives were selected for inclusion in the meta-regression to compare the PRR for programs providing those initiatives to those that did not. Sixteen of the 19 (84%) included programs provided mentors [3, 23,24,25, 28, 29, 32,33,34, 36, 38, 40,41,42,43, 46]. For programs that provided mentors, the post-initiative publication rate was estimated to be 3.2 times (95% CI: 1.92 to 5.23 times; p < 0.001) the pre-initiative publication rate, while the pre- and post-initiative publication rates for programs not providing mentors did not significantly differ (p > 0.20) (Fig. 3). However, the difference in the publication rate ratios for these two groups did not reach statistical significance (p = 0.10). Programs that provided curriculum (12/19 or 63% of included studies) or protected time (63% of included studies) also did not have significantly higher PRR than programs that did not use these initiatives (p > 0.20, for both).
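
One way such a moderator analysis can be set up is with a meta-regression in the metafor package, including an indicator for whether a program provided the initiative of interest. The sketch below uses hypothetical log publication rate ratios and variances, not the extracted study data, and the variable names are illustrative.

```r
library(metafor)

# Hypothetical log publication rate ratios (yi), sampling variances (vi),
# and a mentor indicator for each program; not the extracted study data.
dat <- data.frame(
  yi     = c(1.10, 0.35, 0.80, -0.10, 1.60, 0.20),
  vi     = c(0.09, 0.12, 0.10,  0.15, 0.20, 0.18),
  mentor = c(1, 1, 1, 0, 1, 0)
)

# Random effects meta-regression: does providing mentors moderate the PRR?
res_mod <- rma(yi, vi, mods = ~ mentor, data = dat, method = "DL")
summary(res_mod)   # the 'mentor' coefficient tests the difference in log PRR

# Subgroup estimates on the ratio scale (with vs. without mentors)
predict(res_mod, newmods = 1, transf = exp)  # programs providing mentors
predict(res_mod, newmods = 0, transf = exp)  # programs without mentors
```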

Fig. 3 Forest plot comparing publication rate ratio for programs that provided mentors to programs that did not provide mentors in a systematic review of the literature on scholarship initiatives in graduate medical education (January 2003–March 2017)

Publication bias

The Egger test did not detect any evidence of publication bias (p > 0.20). However, these results should be interpreted with caution, as residency programs that did not see improvements in productivity, or saw a decline in productivity, might be less inclined to publish their data.

Barriers

We identified 43 barriers that could be organized into 6 major categories (Table 2). The most frequently reported barriers were lack of time (17/43, 40%), lack of mentoring/oversight (10/43, 23%), and lack of support (6/43, 14%).

Table 2 Systematic review of scholarship initiatives in graduate medical education (2003-March 2017): barriers and strategies found in included article abstracts and/or discussions

Strategies

We identified 117 strategies that could be organized into 9 major categories (Table 2). Of the strategy categories, providing curriculum was the most frequently reported (22/117, 19%), followed by mentorship (19/117, 16%), and infrastructure and departmental support (16/117, 14%).

Discussion

This systematic review of GME scholarly activities initiatives identified 32 relevant articles published between 2006 and 2017. Nearly all included articles demonstrated improvements in resident productivity with regard to publications or presentations. Unfortunately, many of these articles (66%) failed to specify whether their improvements were statistically significant.

Most included articles (97%) used multiple interventions, with provision of mentors reported most often (88%). Research curricula and protected time were each provided in over half of the studies (59% each). However, there was wide variability in both the curricula and the protected time provided.

Programs with curricula included workshops, lecture series, or research seminars, with much variability in the length and content of sessions. The most frequently taught topics were research design (75%) and statistics (69%). The remaining topics were covered by less than half of the programs with research curricula. Programs providing protected time varied from a 2-week rotation to a year-long rotation. This wide range in the time dedicated to scholarly activities curricula and in the protected time provided to complete projects makes direct comparisons between programs impossible.

A systematic review of research curricula published over a decade ago concluded that “successful educational interventions should incorporate needs assessment, clearly defined learning objectives, and evaluation methods” [8] (p. 61). Despite this call, published in 2003, we found little progress in the inclusion of needs assessments, objectives, and curricular evaluation. Hebert and colleagues [8] found that only 27% of included articles had a needs assessment. In our review, we found that this percentage had increased to 37%, an improvement that still falls short of ideal. Likewise, we found a lack of curricular evaluation, with only about half (53%) of studies with curricula providing evaluation data. Similar to our finding of 11%, Hebert et al. [8] found that 12% of studies used an objective pre-post knowledge test.

Our primary outcomes were presentations and publications. Only 2 of the included studies reported statistically significant increases in presentations, while 10 (36%) reported statistically significant increases in publications. All studies with a research curriculum reported increases in presentations and/or publications. However, only 32% of studies with curricula reported statistically significant improvements. This is possibly due to small sample sizes and resulting lack of power to detect differences. In the future, more robust study designs with larger sample sizes are needed to definitively assess the importance of inclusion of a research curriculum.

Of the remaining interventions, funding was reported in a quarter of the studies, and the amount of funding varied widely. In fact, all interventions varied a great deal across studies, including the length of curricula and the amount of protected time and mentoring provided.

We identified nine categories of strategies commonly reported as important (Table 2). Structured research curricula, faculty mentorship, and departmental infrastructure and support were the most commonly cited strategies. The strategies we identified were similar to those identified by Hebert et al. [8] in their 2003 systematic review of residency research curricula, which described common curriculum components such as educational goals and objectives; lectures, seminars, or small groups; role models; and research mentors.

In addition, we identified barriers to research output noted in the reviewed articles. The most frequently cited obstacle when implementing changes to resident research was lack of time, due to clinical responsibilities or to educational curricula competing with research. Difficulties in providing clinical research mentors to residents, lack of resident interest in research, limited departmental funding for research, and challenges in providing adequate training and support were also identified in our review. Hebert et al. [8] categorized obstacles as learner-related (e.g., resident resistance, lack of motivation), faculty-related (e.g., resistance, time/intensity demands, lack of motivation), and institutional (e.g., lack of time, financial barriers, lack of critical support staff). Our review yielded similar institutional barriers, yet very few articles mentioned the learner or faculty barriers described by Hebert et al. This may reflect a change in academic department attitudes towards research.

Hebert and colleagues [8] noted that many articles failed to provide descriptions of feasibility, sustainability, or cost. We found that 15 (47%) articles mentioned the cost and/or feasibility of their research initiatives. However, many provided vague statements that offer little concrete assistance in determining actual cost or feasibility, and no article directly addressed sustainability.

It appears that after implementing research initiatives, a majority of residency programs saw an increase in resident publications or presentations. However, we were unable to identify a particular intervention associated with statistically significant improvements. Whether through faculty mentor participation, scheduled research instruction, or another initiative, it appears that any departmental dedication to resident research may increase scholarly productivity. It may be that as the overall culture within a residency moves toward supporting resident research and scholarly activity, resident publications and presentations will increase.

Limitations

As recently noted in an editorial, medical education reviews are difficult to conduct [47]. Many aspects of published medical education research vary, including study design, operational definitions, educational interventions, subjects, sample size, and outcome measures. All of these differences prevent easy aggregation of data [47]. Despite these limitations, we were able to identify 19 (59%) studies with enough detail to include in a meta-analysis of publication rate ratios.

As with any systematic review, the results are limited by the search strategy and methods used. We addressed these issues by developing a detailed protocol with operational definitions and by using multiple trained reviewers throughout the study process. Our search included 3 databases and was conducted by an experienced medical librarian. In addition, the reference sections of all included articles were reviewed for possible additional articles. Although all of these strategies improve the quality of our systematic review, we may have missed some relevant articles.

As with all systematic reviews of the literature, there exists the possibility of a publication bias against negative studies, resulting in few studies published that did not demonstrate improvements. In addition, we were only able to analyze interventions that were themselves published. There likely exist many residency programs throughout the country that have implemented or updated their resident research/scholarly activity initiatives while not explicitly publishing data on the changes and their results.

Although our meta-analysis concluded that the post-initiative publication rate was significantly higher than the pre-initiative publication rate, this result should be interpreted with caution. The designs of the studies included in the analysis varied, and it is possible that changes in the publication rate were due to factors outside the implementation of an initiative. Meta-analysis results can also be sensitive to publication bias, which is likely to be present for this study.

Recommendations

When implementing or updating a resident research curriculum, it is important to consider all aspects of curriculum development, including conducting a needs assessment, developing goals and objectives, and designing a robust mechanism for curriculum assessment. Further, education leadership should consider using freely available, peer-reviewed online resources, such as MedEdPORTAL. A brief search conducted by our team yielded two teaching resources devoted to research curricula [48, 49] and another two focused on scholarly activities and research mentor resources [50, 51]. There are likely many more such resources available.

It is vital to address barriers to outcomes early and often, to avoid stagnation or poor utilization of valuable resources. In addition, future studies should provide data on cost, feasibility, and sustainability of initiatives used to improve resident scholarly activities.

Conclusions

While specific interventions designed to improve resident scholarly activity cannot be individually tied to increases in resident productivity, it appears that a culture emphasizing research is likely the most important factor in improving resident research productivity. However, we call for prospective studies that include a power analysis; a control or comparison group; well-defined, quantifiable parameters; and a high-quality design to identify best practices for future scholarly activity initiatives. Without these studies, it remains difficult for residency education leadership to design cost-effective interventions proven to increase resident scholarly activities (e.g., local, regional, and national presentations and peer-reviewed publications).

Abbreviations

ACGME: Accreditation Council for Graduate Medical Education
CDN: Canadian dollar
CI: Confidence interval
FTE: Full-time equivalent
GME: Graduate medical education
IRB: Institutional Review Board
MERSQI: Medical Education Research Study Quality Instrument
N/A: Not applicable
NS: Non-significant
PGY: Postgraduate year
PRR: Publication rate ratio
USA: United States of America

References

1. Accreditation Council for Graduate Medical Education. Common Program Requirements, 2017. Available at: http://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRs_2017-07-01.pdf. Accessed 25 Feb 2018.
2. Byrnes AB, Mccormack FX, Diers T, Jazieh AR. The resident scholar program: a research training opportunity for internal medicine house staff. J Cancer Educ. 2007;22(1):47–9.
3. Kurahara DK, Kogachi K, Yamane M, Ly CL, Foster JH, Masaki-Tesoro T, Murai D, Rudoy R. A pediatric residency research requirement to improve collaborative resident and faculty publication productivity. Hawaii J Med Public Health. 2012;71(8):224–8.
4. Bishop JA. CORR Insights®: a dedicated research program increases the quantity and quality of orthopaedic resident publications. Clin Orthop Relat Res. 2015;473(4):1522–3.
5. Ahmad S, De Oliveira GS Jr, Mccarthy RJ. Status of anesthesiology resident research education in the United States: structured education programs increase resident research productivity. Anesth Analg. 2013;116(1):205–10. https://doi.org/10.1213/ANE.0b013e31826f087d.
6. Levi B, Longaker MT. Research training in plastic surgery. J Craniofac Surg. 2011;22(2):383–4. https://doi.org/10.1097/SCS.0b013e318208ba73.
7. Mills LS, Steiner AZ, Rodman AM, Donnell CL, Steiner MJ. Trainee participation in an annual research day is associated with future publications. Teach Learn Med. 2011;23(1):62–7. https://doi.org/10.1080/10401334.2011.536895.
8. Hebert RS, Levine RB, Smith CG, Wright SM. A systematic review of resident research curricula. Acad Med. 2003;78(1):61–8.
9. Reed DA, Beckman TJ, Wright SM. An assessment of the methodologic quality of medical education research studies published in the American Journal of Surgery. Am J Surg. 2009;198(3):442–4.
10. Yucha CB, Schneider BS, Smyer T, et al. Methodological quality and scientific impact of quantitative nursing education research over 18 months. Nurs Educ Perspect. 2011;32(6):362–8.
11. Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002–9.
12. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–88.
13. DerSimonian R, Laird N. Meta-analysis in clinical trials. Control Clin Trials. 1986;7:177–88.
14. Egger M, Davey Smith G, Schneider M, Minder C. Bias in meta-analysis detected by a simple, graphical test. BMJ. 1997;315:629–34.
15. Schwarzer G. Meta-analysis with R. R package version 2.3-0. 2013. https://CRAN.R-project.org/package=meta.
16. Viechtbauer W. Conducting meta-analyses in R with the metafor package. J Stat Softw. 2010;36(3):1–48.
17. Anandarajah G, Gupta P, Jain N, El Rayess F, Goldman R. Scholarly development for primary care residents. Clin Teach. 2016;13(6):415–21.
18. Basu Ray I, Henry TL, Davis W, Alam J, Amedee RG, Pinsky WW. Consolidated academic and research exposition: a pilot study of an innovative education method to increase residents' research involvement. Ochsner J. 2012;12(4):367–72.
19. Brackmann M, Reynolds RK, Uppal S, Mclean K. Association of a biweekly research workgroup with enhanced resident research productivity. Obstet Gynecol. 2016;128(3):617–20.
20. Chang CW, Mills JC. Effects of a reward system on resident research productivity. JAMA Otolaryngol Head Neck Surg. 2013;139(12):1285–90. https://doi.org/10.1001/jamaoto.2013.5303.
21. Elliott ST, Lee ES. Surgical resident research productivity over 16 years. J Surg Res. 2009;153:148–51.
22. Farrokhyar F, Amin N, Dath D, et al. Impact of the surgical research methodology program on surgical residents' research profiles. J Surg Educ. 2014;71(4):513–20. https://doi.org/10.1016/j.jsurg.2014.01.012.
23. Fisher C, Baker MK. Improving participation and quality of clinical research in a university-based general surgery residency program. Am Surg. 2010;76(7):741–2.
24. Hoedebecke K, Rerucha C, Runser L. Increase in residency scholarly activity as a result of resident-led initiative. Fam Med. 2014;46(4):288–90.
25. Kanna B, Deng C, Erickson SN, Valerio JA, Dimitrov V, Soni A. The research rotation: competency-based structured and novel approach to research training of internal medicine residents. BMC Med Educ. 2006;6:52.
26. Khurana D, Vishnu VY, Vinny PW, Rishi V. Influence of a publication rotation in a neurology residency program in a developing country. Neurology. 2015;84(2):197–9. https://doi.org/10.1212/WNL.0000000000001119.
27. Kohlwes RJ, Shunk RL, Avins A, Garber J, Bent S, Shlipak MG. The PRIME curriculum. Clinical research training during residency. J Gen Intern Med. 2006;21(5):506–9.
28. Kohlwes J, O’Brien B, Stanley M, et al. Does research training during residency promote scholarship and influence career choice? A cross-sectional analysis of a 10-year cohort of the UCSF-PRIME internal medicine residency program. Teach Learn Med. 2016;28(3):314–9.
29. Konstantakos EK, Laughlin RT, Markert RJ, Crosby LA. Assuring the research competence of orthopedic graduates. J Surg Educ. 2010;67(3):129–34. https://doi.org/10.1016/j.jsurg.2010.04.002.
30. Krueger CA, Hoffman JD, Balazs GC, Johnson AE, Potter BK, Belmont PJ. Protected resident research time does not increase the quantity or quality of residency program research publications: a comparison of 2 orthopedic residencies. J Surg Educ. 2017;74(2):264–70.
31. Lohr J, Smith JM, Welling R, Engel A, Hasselfeld K, Rusche J. Stimulating resident research in a general surgery residency community program. Curr Surg. 2006;63(6):426–34.
32. Löwe B, Hartmann M, Wild B, Nikendei C, Kroenke K, Niehoff D, Henningsen P, Zipfel S, Herzog W. Effectiveness of a 1-year resident training program in clinical research: a controlled before-and-after study. J Gen Intern Med. 2008;23(2):122–8.
33. Macknin JB, Brown A, Marcus RE. Does research participation make a difference in residency training? Clin Orthop Relat Res. 2014;472(1):370–6.
34. Manring MM, Panzo JA, Mayerson JL. A framework for improving resident research participation and scholarly output. J Surg Educ. 2014;71(1):8–13. https://doi.org/10.1016/j.jsurg.2013.07.011.
35. Mayo MJ, Rockey DC. Development of a successful scholarly activity and research program for subspecialty trainees. Am J Med Sci. 2015;350(3):222–7. https://doi.org/10.1097/MAJ.0000000000000489.
36. Papasavas P, Filippa D, Reilly P, Chandawarkar R, Kirton O. Effect of a mandatory research requirement on categorical resident academic productivity in a university-based general surgery residency. J Surg Educ. 2013;70(6):715–9. https://doi.org/10.1016/j.jsurg.2013.09.003.
37. Penrose LL, Yeomans ER, Praderio C, Prien SD. An incremental approach to improving scholarly activity. J Grad Med Educ. 2012;4(4):496–9. https://doi.org/10.4300/JGME-D-11-00185.1.
38. Robbins L, Bostrom M, Marx R, Roberts T, Sculco TP. Restructuring the orthopedic resident research curriculum to increase scholarly activity. J Grad Med Educ. 2013;5(4):646–51. https://doi.org/10.4300/JGME-D-12-00303.1.
39. Robbins MS, Haut SR, Lipton RB, et al. A dedicated scholarly research program in an adult and pediatric neurology residency program. Neurology. 2017;88(14):1366–70.
40. Roth DE, Chan MK, Vohra S. Initial successes and challenges in the development of a pediatric resident research curriculum. J Pediatr. 2006;149(2):149–50.
41. Rothberg MB, Kleppel R, Friderici JL, Hinchey K. Implementing a resident research program to overcome barriers to resident research. Acad Med. 2014;89(8):1133–9. https://doi.org/10.1097/ACM.0000000000000281.
42. Ruiz J, Wallace EL, Miller DP, Loeser RF, Miles M, Lichstein PR. A comprehensive 3-year internal medicine residency research curriculum. Am J Med. 2011;124(5):469–73. https://doi.org/10.1016/j.amjmed.2011.01.006.
43. Sakai T, Emerick TD, Metro DG, Patel RM, Hirsch SC, Xu Y. Facilitation of resident scholarly activity: strategy and outcome analyses using historical resident cohorts and a rank-to-match population. Anesthesiology. 2014;120(1):111–9. https://doi.org/10.1097/ALN.0000000000000066.
44. Schnapp LM, Vaught M, Park DR, Rubenfeld G, Goodman RB, Hudson LD. Implementation and impact of a translational research training program in pulmonary and critical care medicine. Chest. 2009;135(3):688–94. https://doi.org/10.1378/chest.08-1449.
45. Seehusen DA, Asplund CA, Friedman M. A point system for resident scholarly activity. Fam Med. 2009;41(7):467–9.
46. Torres D, Gugala Z, Lindsey RW. A dedicated research program increases the quantity and quality of orthopaedic resident publications. Clin Orthop Relat Res. 2015;473(4):1515–21. https://doi.org/10.1007/s11999-014-4080-1.
47. Sullivan GM. Why are medical education literature reviews so hard to do? J Grad Med Educ. 2018;10(5):481–5.
48. Onishi K, Karimi DP, Hata J, et al. The badges program: a self-directed learning guide for residents for conducting research and a successful peer-reviewed publication. MedEdPORTAL. 2016;12:10443. https://doi.org/10.15766/mep_2374-8265.10443.
49. Abramson E, Bostwick S, Green C, DiPace J. A longitudinal residency research curriculum. MedEdPORTAL. 2013;9:9496. https://doi.org/10.15766/mep_2374-8265.9496.
50. Fenton K, Kim J, Abramson E, et al. Mentoring resident scholarly activity: a toolkit and guide for program directors, research directors and faculty mentors. MedEdPORTAL. 2015;11:10103. https://doi.org/10.15766/mep_2374-8265.10103.
51. Green C, Li S, Jirasevijinda T, et al. Research mentorship for medical trainees across the continuum: a faculty toolkit. MedEdPORTAL. 2015;11:10132. https://doi.org/10.15766/mep_2374-8265.10132.


Acknowledgments

The authors would like to acknowledge the work of Amos J. Wright, a medical librarian in the Department of Anesthesiology and Perioperative Medicine, University of Alabama at Birmingham, Birmingham, AL, and Emma O'Hagen, an assistant professor of clinical education and research librarian in the same department.

Some preliminary data were presented at the Annual Research and Innovations Medical Education Conference at the University of Alabama at Birmingham, Birmingham, Alabama, on September 21, 2016.

Funding

Partial funding for research assistants was provided by the Department of Anesthesiology and Perioperative Medicine, University of Alabama at Birmingham, Birmingham, AL. The Department of Anesthesiology and Perioperative Medicine at the University of Alabama at Birmingham played no role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Availability of data and materials

The full search strategy is available as an online supplement (Additional file 1). The manuscript contains all other data.

Author information

Contributions

WW, PK, ILV, and LAR made substantial contributions to conception and design. WW, JM, PK, CJM, AHZM, and LAR made substantial contributions to analysis and interpretation of data. All authors were involved in drafting the manuscript and revising it critically for important intellectual content, gave final approval of the version to be published, and agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. All authors have read and approved the manuscript.

Corresponding author

Correspondence to Lee Ann Riesenberg.

Ethics declarations

Ethics approval and consent to participate

Not applicable as this is a systematic review of the published literature.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

GME Scholarship Initiatives Search Strategy. Full electronic search strategy for three databases: PubMed, EMBASE, and Scopus. (DOCX 17 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Wood, W., McCollum, J., Kukreja, P. et al. Graduate medical education scholarly activities initiatives: a systematic review and meta-analysis. BMC Med Educ 18, 318 (2018). https://doi.org/10.1186/s12909-018-1407-8


  • DOI: https://doi.org/10.1186/s12909-018-1407-8

Keywords