TY - JOUR
T1 - Key considerations in planning and designing programmatic assessment in competency-based medical education
AU - on behalf of the ICBME Collaborators
AU - Ross, Shelley
AU - Hauer, Karen E.
AU - Wycliffe-Jones, Keith
AU - Hall, Andrew K.
AU - Molgaard, Laura
AU - Richardson, Denyse
AU - Oswald, Anna
AU - Bhanji, Farhan
N1 - Publisher Copyright:
© 2021 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2021
Y1 - 2021
N2 - Programmatic assessment as a concept is still novel for many in clinical education, and there may be a disconnect between the academics who publish about programmatic assessment and the front-line clinical educators who must put theory into practice. In this paper, we clearly define programmatic assessment and present high-level guidelines about its implementation in competency-based medical education (CBME) programs. The guidelines are informed by literature and by lessons learned from established programmatic assessment approaches. We articulate five steps to consider when implementing programmatic assessment in CBME contexts: articulate the purpose of the program of assessment, determine what must be assessed, choose tools fit for purpose, consider the stakes of assessments, and define processes for interpreting assessment data. In the process, we seek to offer a helpful guide or template for front-line clinical educators. We dispel some myths about programmatic assessment to help training programs as they look to design–or redesign–programs of assessment. In particular, we highlight the notion that programmatic assessment is not ‘one size fits all’; rather, it is a system of assessment that results when shared common principles are considered and applied by individual programs as they plan and design their own bespoke model of programmatic assessment for CBME in their unique context.
KW - Assessment (general)
KW - assessment (clinical)
KW - phase of education (general)
KW - profession (general)
KW - profession (medicine)
UR - http://www.scopus.com/inward/record.url?scp=85107440613&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107440613&partnerID=8YFLogxK
U2 - 10.1080/0142159X.2021.1925099
DO - 10.1080/0142159X.2021.1925099
M3 - Article
C2 - 34061700
AN - SCOPUS:85107440613
SN - 0142-159X
VL - 43
SP - 758
EP - 764
JO - Medical Teacher
JF - Medical Teacher
IS - 7
ER -