Academic Journal

An Analysis of Workplace-Based Assessments for Core Entrustable Professional Activities for Entering Residency: Does Type of Clinical Assessor Influence Level of Supervision Ratings?

Bibliographic Details
Title: An Analysis of Workplace-Based Assessments for Core Entrustable Professional Activities for Entering Residency: Does Type of Clinical Assessor Influence Level of Supervision Ratings?
Authors: Shuford A, Carney PA, Ketterer B, Jones RL, Phillipi CA, Kraakevik J, Hasan R, Moulton B, Smeraglio A
Source: Academic medicine : journal of the Association of American Medical Colleges [Acad Med] 2024 Aug 01; Vol. 99 (8), pp. 904-911. Date of Electronic Publication: 2024 Mar 18.
Publication Type: Journal Article
Language: English
Journal Info: Publisher: Published for the Association of American Medical Colleges by Lippincott Williams & Wilkins; Country of Publication: United States; NLM ID: 8904605; Publication Model: Print-Electronic; Cited Medium: Internet; ISSN: 1938-808X (Electronic); Linking ISSN: 1040-2446; NLM ISO Abbreviation: Acad Med; Subsets: MEDLINE
Imprint Name(s): Publication: Philadelphia, PA : Published for the Association of American Medical Colleges by Lippincott Williams & Wilkins
Original Publication: [Philadelphia, Pa. : Hanley & Belfus, c1989-
MeSH Terms: Internship and Residency*; Clinical Competence*/statistics & numerical data; Clinical Competence*/standards; Workplace*; Educational Measurement*/methods; Educational Measurement*/statistics & numerical data; Humans; Students, Medical/statistics & numerical data; Competency-Based Education/methods; Oregon
Abstract: Purpose: The authors describe use of the workplace-based assessment (WBA) coactivity scale according to entrustable professional activities (EPAs) and assessor type to examine how diverse assessors rate medical students using WBAs.
Method: A WBA data collection system was launched at Oregon Health & Science University to visualize learner competency in various clinical settings to foster EPA assessment. WBA data from January 14 to June 18, 2021, for medical students (all years) were analyzed. The outcome variable was level of supervisor involvement in each EPA, and the independent variable was assessor type.
Results: A total of 7,809 WBAs were included. Most fourth-, third-, and second-year students were assessed by residents or fellows (755 [49.5%], 1,686 [48.5%], and 918 [49.9%], respectively) and first-year students by attending physicians (803 [83.0%]; P < .001). Attendings were least likely to use the highest rating of 4 ("I was available just in case"; 2,148 [56.7%] vs 2,368 [67.7%] for residents; P < .001). Learners more commonly sought WBAs from attendings for EPA 2 (prioritize differential diagnosis), EPA 5 (document clinical encounter), EPA 6 (provide oral presentation), EPA 7 (form clinical questions and retrieve evidence-based medicine), and EPA 12 (perform general procedures of a physician). Residents and fellows were more likely to assess students on EPA 3 (recommend and interpret diagnostic and screening tests), EPA 4 (enter and discuss orders and prescriptions), EPA 8 (give and receive patient handover for transitions in care), EPA 9 (collaborate as member of interprofessional team), EPA 10 (recognize and manage patient in need of urgent care), and EPA 11 (obtain informed consent).
Conclusions: Learners preferentially sought resident versus attending supervisors for different EPA assessments. Future research should investigate why learners seek different assessors more frequently for various EPAs and if assessor type variability in WBA levels holds true across institutions.
(Copyright © 2024 Written work prepared by employees of the Federal Government as part of their official duties is, under the U.S. Copyright Act, a “work of the United States Government” for which copyright protection under Title 17 of the United States Code is not available. As such, copyright does not extend to the contributions of employees of the Federal Government.)
Entry Date(s): Date Created: 20240318; Date Completed: 20240802; Latest Revision: 20240802
Update Code: 20240802
DOI: 10.1097/ACM.0000000000005691
PMID: 38498305
Database: MEDLINE