Academic Pediatrics
Volume 11, Issue 5, September–October 2011, Pages 394-402

Education
Observation of Resident Clinical Skills: Outcomes of a Program of Direct Observation in the Continuity Clinic Setting

https://doi.org/10.1016/j.acap.2011.02.008

Abstract

Objective

To assess the feasibility of a new multi-institutional program of direct observation and report what faculty observed and the feedback they provided.

Methods

A program of direct observation of real patient encounters was implemented in 3 pediatric residency programs using a structured clinical observation (SCO) form to document what was observed and the feedback given. Outcome variables included the number of observations made, the nature of the feedback provided, resident attitudes about direct observation before and after implementation, and the response of the faculty.

Results

Seventy-nine preceptors and 145 residents participated; 320 SCO forms were completed. Faculty provided feedback in 4 areas: content, process of the encounter, patient-centered attitudes and behaviors, and interpersonal skills. Feedback was 85% specific and 41% corrective. Corrective feedback was most frequent for physical examination skills. After program implementation, residents reported an increase in feedback and a decrease in discomfort with direct observation; in addition, they agreed that direct observation was a valuable component of their education. Participation rates among faculty were high.

Conclusions

Direct observation using SCOs results in timely and specific feedback to residents about behaviors rarely observed in traditional precepting models. Resident competency in these clinical skill domains is critical for assessing, diagnosing, and managing patients. The SCO methodology is a feasible way to provide formative feedback to residents about their clinical skills.

Section snippets

Setting

We implemented the program in a pediatric residency continuity clinic setting, where residents work with 1 or 2 faculty members in a longitudinal clinical experience and would be expected to have developed a trusting relationship in which to be observed and receive feedback. Implementation took place at 3 institutions, the Jefferson-duPont, Children’s National Medical Center (CNMC), and National Capital Consortium Military (NCC) Pediatrics Residency Programs, during 2005 and 2006. CNMC has 3 …

Feasibility

During the implementation year, 67% of faculty at the 3 residency programs used the SCO method to observe residents. Among hospital-based preceptors, 80% of faculty used the SCO form at least once, with 100% of Jefferson-duPont faculty, 86% of NCC faculty, and 63% of CNMC faculty using the form. The CNMC program had a lower rate of participation because a hospital-based continuity clinic located in another facility was added shortly before the direct observation program was begun on …

Discussion

In this study, we analyzed the feasibility of implementing a program of direct observation of residents interacting with their patients in continuity clinic, the feedback given to residents after direct observation using the SCO tool, and resident attitudes about being observed. Implementation of this method is feasible in training programs of different sizes and teaching settings. The feedback provided covered a broad range of skills and was largely specific. Although many tools for …

Conclusions

Use of SCOs results in timely and specific feedback to residents about domains rarely observed in traditional precepting models that do not include direct observation. Resident competency in these domains is critical for the assessment, diagnosis, and management of patients. The SCO methodology is an efficient way to provide formative feedback to residents about their clinical skills, and implementation of a program of direct observation is feasible at both hospital-based and community-based …

Acknowledgment

This work was supported by grants from the Region IV Academic Pediatrics Association, the Children’s National Medical Center Research Advisory Committee, and the Association of Pediatric Program Directors Special Project Award. We are grateful for the support of the continuity clinic preceptors and residents at the National Capital Consortium Military Pediatrics Residency Program (NCC), the Jefferson Medical College/duPont Hospital for Children Pediatric Residency Program, and the Children’s National Medical Center (CNMC) Pediatric Residency Program.


    The authors have no conflicts of interest to disclose.
