Journal of Surgical Education

Volume 71, Issue 6, November–December 2014, Pages e41-e46

2014 APDS SPRING MEETING
Assessment of Resident Operative Performance Using a Real-Time Mobile Web System: Preparing for the Milestone Age

https://doi.org/10.1016/j.jsurg.2014.06.008

Objective

To satisfy trainees’ operative competency requirements while improving feedback validity and timeliness using a mobile Web-based platform.

Design

The Southern Illinois University Operative Performance Rating Scale (OPRS) was embedded into a website formatted for mobile devices. From March 2013 to February 2014, faculty members were instructed to complete the OPRS form while providing verbal feedback to the operating resident at the conclusion of each procedure. Submitted data were compiled automatically within a secure Web-based spreadsheet. Conventional end-of-rotation performance (CERP) evaluations filed from 2006 to 2013 and OPRS performance scores were compared by year of training using serial and independent-samples t tests. Mean CERP scores and OPRS overall resident operative performance scores were directly compared using a linear regression model. OPRS mobile site analytics were reviewed using a Web-based reporting program.
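The comparisons described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the score arrays are invented for demonstration and are not the study's data, and the actual analysis may have used different software and model specifications.

```python
# Hypothetical sketch of the analyses described above: a linear trend of
# mean scores across postgraduate years (PGY 1-5) and an independent-samples
# t test between PGY-1 and PGY-5. All values are invented for illustration.
import numpy as np
from scipy import stats

pgy = np.array([1, 2, 3, 4, 5], dtype=float)
oprs_means = np.array([2.8, 3.1, 3.3, 3.6, 3.8])  # hypothetical mean OPRS score per PGY

# Linear trend of score versus year of training
slope, intercept, r, p, se = stats.linregress(pgy, oprs_means)
print(f"OPRS trend: slope = {slope:.2f} per PGY, p = {p:.3g}")

# Independent-samples t test comparing PGY-1 and PGY-5 raw ratings
pgy1_scores = np.array([2.5, 2.9, 3.0, 2.8])  # hypothetical raw ratings
pgy5_scores = np.array([3.6, 3.9, 3.8, 4.0])
t, p_t = stats.ttest_ind(pgy1_scores, pgy5_scores)
print(f"PGY-1 vs PGY-5: t = {t:.2f}, p = {p_t:.3g}")
```

A positive slope with a small p value would correspond to the reported rise in performance scores with year of training.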

Setting

Large university-based general surgery residency program.

Participants

General Surgery faculty used the mobile Web OPRS system to rate resident performance. Residents and the program director reviewed evaluations semiannually.

Results

Over the study period, 18 faculty members and 37 residents logged 176 operations using the mobile OPRS system. There were 334 total OPRS website visits. Median time to complete an evaluation was 45 minutes from the end of the operation, and faculty spent an average of 134 seconds on the site to enter 1 assessment. In the 38,506 CERP evaluations reviewed, mean performance scores showed a positive linear trend of 2% change per year of training (p = 0.001). OPRS overall resident operative performance scores showed a significant linear (p = 0.001), quadratic (p = 0.001), and cubic (p = 0.003) trend of change per year of clinical training, reflecting the resident operative experience in our training program. Differences between postgraduate year-1 and postgraduate year-5 overall performance scores were greater with the OPRS (mean = 0.96, CI: 0.55-1.38) than with CERP measures (mean = 0.37, CI: 0.34-0.41). Additionally, there were consistent increases in each of the OPRS subcategories.
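The linear, quadratic, and cubic trend tests reported above can be illustrated with orthogonal polynomial contrasts across the 5 training years. This is a sketch under stated assumptions: the score values are invented for demonstration, and the study's own trend analysis may have been computed differently.

```python
# Hypothetical sketch of linear/quadratic/cubic trend contrasts across
# PGY 1-5. The scores are invented for illustration, NOT the study's data.
import numpy as np

scores = np.array([2.7, 3.2, 3.4, 3.5, 3.9])  # hypothetical mean OPRS score per PGY

# Standard orthogonal polynomial contrast coefficients for 5 groups
contrasts = {
    "linear":    np.array([-2.0, -1.0, 0.0, 1.0, 2.0]),
    "quadratic": np.array([2.0, -1.0, -2.0, -1.0, 2.0]),
    "cubic":     np.array([-1.0, 2.0, 0.0, -2.0, 1.0]),
}

# A positive linear estimate indicates scores rise with year of training;
# nonzero quadratic/cubic estimates indicate curvature in that rise.
estimates = {name: float(c @ scores) for name, c in contrasts.items()}
for name, est in estimates.items():
    print(f"{name} contrast estimate: {est:+.2f}")
```

In a full analysis, each contrast estimate would be divided by its standard error to obtain the p values of the kind reported in the Results.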

Conclusions

In contrast to CERPs, the OPRS fully satisfies the Accreditation Council for Graduate Medical Education and American Board of Surgery operative assessment requirements. The mobile Web platform provides a convenient interface, broad accessibility, automatic data compilation, and compatibility with common database and statistical software. Our mobile OPRS system encourages candid feedback dialog and generates a comprehensive review of individual and group-wide operative proficiency in real time.

Introduction

Surgical residents must demonstrate operative proficiency over the course of residency training to be considered competent to operate independently. Conventional end-of-rotation performance (CERP) assessments, in which faculty members recall general impressions of resident operative performance, do not account for the different types of operations residents perform, and they lack objective technical standards. Governing bodies in surgical education have identified a need for more thorough performance evaluations during residency training.

The Next Accreditation System (NAS) from the Accreditation Council for Graduate Medical Education features the Milestones Project, an evaluation template that systematically assesses the 6 core competencies, including technical and nontechnical operative performance.1 The American Board of Surgery (ABS) mandates 6 operative skills evaluations per resident in the 2015 to 2016 academic year for certification eligibility. The ABS suggests the Operative Performance Rating Scale (OPRS) developed by Southern Illinois University as a model evaluation system.2 The OPRS assesses multiple parameters of resident operative performance and has undergone thorough analyses of reliability and validity.3, 4, 5, 6, 7, 8

As residents and surgical educators routinely use mobile technology and cloud-based computing, we transformed the OPRS into a mobile, Web-based survey system. We hypothesized that this system would improve quality and timeliness of resident operative performance evaluation, facilitate feedback communication, and ensure compliance with the ABS and NAS Milestones requirements.

Section snippets

Study Design

This study was performed in an urban academic general surgery residency training program at the David Geffen School of Medicine of the University of California, Los Angeles between March 2013 and February 2014. Faculty participants included 43 surgeons, representing 5 clinical divisions and 4 affiliated institutions. Faculty whose practice did not include direct observation of resident operative performance were excluded from this study. Resident subjects included 66 categorical, designated …

Results

Over the 11-month study period, 18 of the 43 eligible faculty members, representing 4 of the 5 clinical divisions and all 4 affiliated institutions, completed 176 OPRS forms evaluating 37 of the 66 eligible residents (56%). The study period included portions of 2 academic years. Faculty participants included 9 junior (assistant professors) and 9 senior (professors and associate professors) members. The numbers of junior faculty submissions (mean = 5.2) were greater than those of seniors (mean = …

Discussion

Our study demonstrates that mobile OPRS evaluations offer timely performance feedback to surgical residents. These mobile OPRS reports fulfill the NAS and ABS minimum requirements of individual operative performance measures. The frequency and quality of mobile evaluations result in improved feedback dialog among the educators and trainees. Using this innovative approach, 56% of our trainees received procedure-specific feedback at an average of 169 minutes after procedure completion.

Compared …

Conclusion

Mobile technology represents a new state-of-the-art tool with considerable potential in surgical education. Our mobile OPRS interface satisfies the ABS and NAS General Surgery Milestones Project requirements, stimulates real-time feedback dialog, and documents resident skills acquisition with high quality and efficiency.
