Journal of Surgical Education

Volume 69, Issue 4, July–August 2012, Pages 511-520

Original report
Reliable and Valid Tools for Measuring Surgeons' Teaching Performance: Residents' vs. Self Evaluation

https://doi.org/10.1016/j.jsurg.2012.04.003

Background

In surgical education, there is a need for educational performance evaluation tools that yield reliable and valid data. This paper describes the development and validation of robust evaluation tools that provide surgeons with insight into their clinical teaching performance. We investigated (1) the reliability and validity of 2 tools for evaluating the teaching performance of attending surgeons in residency training programs, and (2) whether surgeons' self evaluation correlated with the residents' evaluation of those surgeons.

Materials and Methods

We surveyed 343 surgeons and 320 residents as part of a multicenter prospective cohort study of faculty teaching performance in residency training programs. The reliability and validity of the SETQ (System for Evaluation of Teaching Qualities) tools were studied using standard psychometric techniques. We then estimated the correlations between residents' and surgeons' evaluations.

Results

The response rate was 87% among surgeons and 84% among residents, yielding 2625 residents' evaluations and 302 self evaluations. The SETQ tools yielded reliable and valid data on 5 domains of surgical teaching performance, namely, learning climate, professional attitude towards residents, communication of goals, evaluation of residents, and feedback. The correlations between surgeons' self and residents' evaluations were low, with coefficients ranging from 0.03 for evaluation of residents to 0.18 for communication of goals.

Conclusions

The SETQ tools for the evaluation of surgeons' teaching performance appear to yield reliable and valid data. The lack of strong correlations between surgeons' self and residents' evaluations suggests the need for using external feedback sources in informed self evaluation of surgeons.

Introduction

To keep up with advances in technology and the ever-growing body of knowledge, the dynamic field of surgery must adapt its education to maintain professional performance. While surgical research has a robust history, educational research in surgery is relatively new.1 Improving education in surgery requires high-quality educational research.1, 2 In this paper, we describe the development and validation of robust tools that provide surgeons with insight into their teaching performance.

To maintain or enhance professional performance, surgeons need accurate performance feedback.3, 4 However, a review study showed that accurate performance feedback is often lacking and, consequently, physicians' ability to self-evaluate and improve their clinical performance is generally poor.5 In addition, it has been argued that humans are by nature poor at accurate self evaluation and that personal, unguided reflection on practice simply does not provide sufficient information to guide performance improvement adequately.6 It has therefore been suggested that external feedback data sources could inform and subsequently improve surgeons' self evaluation.3, 7, 8 Tools that channel such feedback could help surgeons evaluate their teaching performance more accurately.3, 7, 9 Several tools exist for evaluating surgeons' teaching performance but, overall, they lack validity and were not specifically developed and validated for use in surgical specialties.10 Developing and validating tools specifically for surgical education is essential to generate credible and applicable results.

For that purpose, we aim to test the reliability and validity of the System for Evaluation of Teaching Qualities (SETQ) constructed for surgical specialties (see Fig. 1). Besides evaluating surgeons' general teaching performance, this system aims to evaluate surgery-specific teaching performance, such as teaching technical skills and teaching in the operating theater. Similar systems have been developed and validated for several nonsurgical specialties.11, 12, 13 This study aims to evaluate (1) whether the SETQ tools for surgical specialties are reliable and valid, and (2) whether surgeons' and residents' evaluations of surgeons' teaching performance match.

Section snippets

Development of the SETQ for Surgical Specialties

To meet the demand from several residency programs, the SETQ was developed as a dynamic system for the continuous evaluation and development of surgeons involved in teaching residents. After being piloted in 1 department,12 it was successfully implemented institution-wide and later nationwide. Currently, SETQ is used in over 160 residency training programs (of various specialties) in 34 teaching hospitals, involving around 2300 clinical teachers and 2200 residents in The Netherlands. For a

Results

In this study, 302 surgeons (87% response rate) and 269 residents (84% response rate) participated; surgeons' characteristics are described in Table 1. Responding residents were distributed across the residency training years (first year: 17.8%, second: 15.3%, third: 15.3%, fourth: 16.5%, fifth: 21.1%, higher than fifth: 14%). About 44% of the residents were female.

For both SETQ tools, the principal components analysis revealed a 5-scale structure of teaching performance. In line
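The kind of psychometric check described above, a principal components analysis of the item responses followed by an internal-consistency (Cronbach's alpha) estimate per scale, can be sketched as follows. This is a minimal illustration on synthetic data only: the respondent counts, item counts, and two-scale structure are assumptions for the example, not the study's actual data or scale structure.

```python
import numpy as np

# Synthetic item-response matrix: 200 hypothetical respondents answering
# 10 Likert-style items that, by construction, form two underlying scales.
rng = np.random.default_rng(1)
factors = rng.normal(size=(200, 2))
loadings = np.zeros((2, 10))
loadings[0, :5] = 1.0          # items 1-5 belong to scale 1
loadings[1, 5:] = 1.0          # items 6-10 belong to scale 2
X = factors @ loadings + 0.5 * rng.normal(size=(200, 10))

# Principal components analysis on the item correlation matrix; the Kaiser
# criterion (eigenvalue > 1) suggests how many scales the items form.
corr = np.corrcoef(X, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_scales = int((eigenvalues > 1.0).sum())

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: internal-consistency reliability of one scale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

alpha_scale1 = cronbach_alpha(X[:, :5])
print(n_scales, round(alpha_scale1, 2))
```

On this two-factor synthetic data the Kaiser criterion recovers 2 scales; in practice the retained components are also judged against scree plots and interpretability, as is standard in instrument validation.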

Main Findings

This study set out to examine the reliability and validity of the SETQ tools for surgical specialties and to compare surgeons' and residents' evaluations of surgeons' teaching performance. The results show that the tools underlying the SETQ for surgical specialties provide reliable and valid results and meet current standards.10, 20 Furthermore, the correlations between surgeons' self evaluations and residents' evaluations are low.
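The self-vs-residents comparison comes down to a Pearson correlation computed per domain across surgeons. A minimal sketch on synthetic data follows; the surgeon count, rating scale, and effect size are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_surgeons = 300  # hypothetical cohort size

# For each surgeon: the mean of residents' ratings and the surgeon's own
# self-rating on one domain (1-5 scale), generated so the two are only
# weakly related, mirroring the low coefficients reported above.
residents_mean = rng.normal(4.0, 0.4, size=n_surgeons)
self_rating = (3.8
               + 0.1 * (residents_mean - 4.0)
               + rng.normal(0.0, 0.4, size=n_surgeons))

# Pearson correlation coefficient between the two sets of scores.
r = np.corrcoef(self_rating, residents_mean)[0, 1]
print(round(r, 2))
```

A coefficient this small means residents' ratings explain almost none of the variance in self-ratings (r squared well under 5%), which is the quantitative sense in which self evaluation alone is uninformative.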

Explanation of Results

First, the extremely high response rate of both

Conclusions

The SETQ tools for surgical specialties appear to provide reliable and valid evaluation data for use in surgical departments of Dutch training hospitals. We recommend treating development and validation as a continuous exercise across different settings. The lack of strong correlations between surgeons' self evaluations and residents' evaluations suggests the need for using external feedback sources in informed self evaluation of surgeons.

Acknowledgments

This study is part of the research project Quality of Clinical Teachers and Residency Training Programs, which is co-financed by the Dutch Ministry of Health, the Academic Medical Center, Amsterdam, and the Faculty of Health and Life Sciences of the University of Maastricht. BCMB, MJMHL, and ORCB are employed by the Academic Medical Center, Amsterdam. OAA is a recipient of a Veni grant (916.96.059) from The Netherlands Organization for Scientific Research (NWO). Funders had no role in study

References (26)

  • K. Ahmed et al. Development of a surgical educational research program: fundamental principles and challenges. J Surg Res (2011)
  • M.W. von Websky et al. Trainee satisfaction in surgery residency programs: modern management tools ensure trainee motivation and success. Surgery (2011)
  • J. Sargeant et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med (2010)
  • E.A. Locke et al. Building a practically useful theory of goal setting and task motivation: a 35-year odyssey. Am Psychol (2002)
  • D.A. Davis et al. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA (2006)
  • K.W. Eva et al. "I'll never play professional football" and other fallacies of self-assessment. J Contin Educ Health Prof (2008)
  • J. Lockyer et al. Feedback data sources that inform physician self-assessment. Med Teach (2011)
  • J. Sargeant et al. Features of assessment learners use to make informed self-assessments of clinical performance. Med Educ (2011)
  • J. Sargeant et al. "Directed" self-assessment: practice and feedback within a social context. J Contin Educ Health Prof (2008)
  • T.J. Beckman et al. How reliable are assessments of clinical teaching? A review of the published instruments. J Gen Intern Med (2004)
  • R. van der Leeuw et al. Systematic evaluation of the teaching qualities of obstetrics and gynecology faculty: reliability and validity of the SETQ tools. PLoS ONE (2011)
  • K.M. Lombarts et al. Development of a system for the evaluation of the teaching qualities of anesthesiology faculty. Anesthesiology (2009)
  • O.A. Arah et al. New tools for systematic evaluation of teaching qualities of medical faculty: results of an ongoing multi-center survey. PLoS ONE (2011)