Commentary by Sebastian Uijtdehaage: https://doi.org/10.1007/s40037-018-0485-y
Competency-based education (CBE) is now pervasive in health professions education. A foundational principle of CBE is to assess and identify the progression of competency development in students over time. It has been argued that a programmatic approach to assessment in CBE maximizes student learning. The aim of this study is to investigate whether programmatic assessment, i.e., a system of assessment, can be used within a CBE framework to track the progression of student learning within and across competencies over time.
Three workplace-based assessment methods were used to measure the same seven competency domains. We performed a retrospective quantitative analysis of 327,974 assessment data points from 16,575 completed assessment forms from 962 students over 124 weeks using both descriptive (visualization) and modelling (inferential) analyses. This included multilevel random coefficient modelling and generalizability theory.
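The random-coefficient perspective described above can be sketched in miniature. The simulation below is illustrative only (it is not the study's code, and all variance parameters are assumptions): each student gets a latent level (a random intercept), repeated observations add occasion-level noise, and a one-way ANOVA-style decomposition recovers the between-student share of score variance, analogous to the student variance component the study estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed latent standard deviations (hypothetical, for illustration).
n_students, n_obs = 200, 20
sd_student, sd_resid = 1.0, 1.5

# Random intercept per student, plus occasion-level residual noise.
student_level = rng.normal(0, sd_student, n_students)
scores = student_level[:, None] + rng.normal(0, sd_resid, (n_students, n_obs))

# Method-of-moments (one-way ANOVA) variance decomposition.
grand = scores.mean()
ms_between = n_obs * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_students - 1)
ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (
    n_students * (n_obs - 1)
)
var_student_hat = (ms_between - ms_within) / n_obs
share = var_student_hat / (var_student_hat + ms_within)
print(f"between-student share of total variance ~ {share:.2f}")
```

With these assumed parameters the true between-student share is 1.0 / (1.0 + 2.25), about 0.31; a full multilevel model would additionally separate method, competency, and time components.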
Random coefficient modelling indicated that the variance attributable to differences in performance between students was the largest component (40%). The reliability coefficients of scores from the assessment methods ranged from 0.86 to 0.90. Method and competency variance components were in the small-to-moderate range.
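The reliability figures above come from generalizability theory, where a G-coefficient is formed from variance components. The sketch below shows the standard one-facet formula; the component values and the number of observations are hypothetical assumptions chosen for illustration (the study reports only the ~40% student share and the 0.86–0.90 reliability range).

```python
def g_coefficient(var_student, var_error, n_observations):
    """Relative G-coefficient for a one-facet design:
    E(rho^2) = sigma^2_p / (sigma^2_p + sigma^2_error / n),
    where sigma^2_p is the between-student (universe score) variance."""
    return var_student / (var_student + var_error / n_observations)

# Illustrative components: 40% student variance, 60% residual,
# averaged over an assumed 10 observations per student.
g = g_coefficient(0.40, 0.60, 10)
print(round(g, 3))  # 0.87
```

Note how reliability grows with the number of observations averaged: the same components with a single observation give only 0.40, which is one reason programmatic assessment aggregates many low-stakes data points before a high-stakes decision.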
The current validation evidence provides cause for optimism regarding the explicit development and implementation of a program of assessment within CBE. The majority of the variance in scores appears to be student-related and reliable, supporting sound psychometric properties and both formative and summative uses of the scores.
Validity evidence for programmatic assessment in competency-based education
Harold G. J. Bok, Lubberta H. de Jong, Kent G. Hecker
Publisher: Bohn Stafleu van Loghum