Testing Quick Response (QR) Codes as an Innovation to Improve Feedback Among Geographically-Separated Clerkship Sites
ORIGINAL ARTICLES

Matthew J. Snyder, DO; Dana R. Nguyen, MD; Jasmyne J. Womack, MPH; Christopher W. Bunt, MD; Katie L. Westerfield, DO; Adriane E. Bell, MD; Christy J.W. Ledford, PhD

BACKGROUND AND OBJECTIVES: Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback.

METHODS: In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director.

RESULTS: The feedback method was significantly associated with usability, with QR code scoring the highest and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters.

CONCLUSIONS: The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a high-usability platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norm for providing feedback in this manner need to be considered.

(Fam Med. 2018;50(3):188-94.)
doi: 10.22454/FamMed.2018.885946

From the Nellis Family Medicine Residency, Nellis Air Force Base, NV (Dr Snyder); the Uniformed Services University of the Health Sciences (Drs Nguyen, Womack, and Ledford); Medical University of South Carolina, Charleston (Dr Bunt); Martin Army Community Hospital Family Medicine Residency, Fort Benning, GA (Dr Westerfield); and Tripler Army Medical Center Family Medicine Residency, Honolulu, HI (Dr Bell).

An effective clinical learning environment relies heavily on timely, objective feedback.1,2 Yet, collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings.2,3 Residents and faculty often cite other clinical priorities as barriers to completing timely medical student feedback, despite the understanding that increased quantity, quality, and timeliness of feedback leads to improved accuracy and validity in end-of-rotation assessments, learner self-reflection, and future professional growth. The current literature indicates a common and continued quest for educators to determine an ideal feedback collection method to optimize the learner experience and to best describe overall performance.4

Medical student feedback collection methods vary widely across facilities and programs. Traditionally, medical students used clinical encounter cards to record and communicate feedback regarding performance on clinical rotations. Written forms have been shown to improve the quantity and timeliness of feedback, but may have other limitations.5-9 For example, written form accountability poses a challenge in some environments, such as when daily feedback forms do not reach the educational supervisor, resulting in limited actionable data at the time of evaluation.

Technological advancements have prompted medical education to transition from paper forms to other novel technologies.10 Recent studies regarding medical student feedback found that the use of electronic surveys improved the quality and quantity of feedback to the students, while also improving the perceived quality of the clerkship.11-13 Among these, the formative value of electronic feedback methods varied or, in some instances, was not measured. One study using quick response (QR) technology in a surgical residency program demonstrated increased longitudinal postprocedure feedback resulting in high feedback quality satisfaction rates.14
Educational resources to enhance learning curricula commonly utilize QR codes, but we are not aware of studies evaluating the utilization of QR codes to improve medical student feedback mechanisms. We created a novel method of collecting daily medical student feedback using QR code technology across multiple clerkship sites. The purpose of this study was first to assess the usability of the innovation, and second, to determine whether the use of a QR code-linked online feedback form improves the frequency and efficiency of preceptor feedback to medical students in the family medicine clerkship.

Methods

Pilot Study
The first author developed a QR code intervention to elicit feedback from preceptors about clerkship students within a single-site teaching program. QR codes are first created using a QR code generator website. These QR code generator websites convert uniform resource locators (URLs), or web addresses, text, and phone numbers into a black and white pixelated barcode in the shape of a square. A QR code was generated using the internet address of a program-specific online feedback form. The program provided the clerkship student a laminated card that displayed the QR code, which was then scanned by the preceptor using a free QR reader phone and tablet application after each learning session. The preceptors were encouraged to complete the feedback form at the time verbal feedback was provided to the student. Submission of the completed form automatically populated an online database, which was then immediately available to the clerkship site coordinator to formulate summative feedback. The pilot intervention demonstrated a 98% increase in the number of feedback entries for rotating medical students in the first 4 months of implementation.
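The pilot relied on a QR code generator website rather than any code of its own, so the snippet below is only an illustrative sketch of the same step done programmatically. It assumes the open-source Python qrcode package (with Pillow for image output), and the feedback form URL shown is a hypothetical placeholder, not the study's actual form.

```python
# Illustrative sketch only: the study used a QR code generator website, not this
# script. Requires the open-source packages qrcode and Pillow (pip install "qrcode[pil]").
# The URL below is a hypothetical placeholder, not the study's actual feedback form.
import qrcode

FEEDBACK_FORM_URL = "https://www.surveymonkey.com/r/example-clerkship-feedback"

qr = qrcode.QRCode(
    error_correction=qrcode.constants.ERROR_CORRECT_M,  # tolerate minor wear on a laminated card
    box_size=10,  # pixels per QR module, large enough to scan from a printed card
    border=4,     # quiet zone (in modules) around the code
)
qr.add_data(FEEDBACK_FORM_URL)  # encode the form's web address
qr.make(fit=True)               # pick the smallest QR version that fits the URL

# Save a black-and-white image suitable for printing on a 3x3-inch card.
img = qr.make_image(fill_color="black", back_color="white")
img.save("clerkship_feedback_qr.png")
```

Scanning the printed code with any free QR reader application then opens the form directly, which is the workflow the pilot asked preceptors to follow.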
Multisite Evaluation
The pilot's success was presented at the Uniformed Services University of the Health Sciences (USU) Annual Clerkship Site Coordinator Workshop in September 2015 to gauge other site coordinators' interest in implementing the innovation. The Military Primary Care Research Network (MPCRN) subsequently invited its member family medicine teaching programs to participate in a multisite evaluation of the innovation. The protocol was reviewed by the USU Human Research Protections Program Office and determined to be nonhuman subjects research, conducted for quality improvement purposes.

The USU clerkship year begins annually in January. USU clerkship students complete a 6-week family medicine rotation at one of 15 military family medicine residency sites across the United States (Table 1). The QR code innovation was tested concurrently with four rotations of student clerkships at each site from January to July 2016. Following a prospective design, sites were initially randomized into control (paper-based forms), online form only, or QR code-linked online form, using an online randomizer (random.org). Four sites did not implement the intervention to which they were randomized. Reasons included site coordinator preference, technological barriers, and competing system expectations from other university clerkships.
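The paper reports only that sites were randomized with an online randomizer (random.org); purely as a rough sketch of an equivalent assignment using Python's standard library, the code below deals the 14 non-pilot sites from Table 1 into the three conditions. It is hypothetical and not the procedure the authors actually ran.

```python
# Hypothetical sketch of assigning clerkship sites to three feedback conditions.
# The study used random.org; this is not the authors' procedure. The Illinois
# pilot site is excluded because it was not randomized and kept the QR condition.
import random

SITES = [
    "California (Southern)", "Nebraska", "Virginia", "North Carolina (Coastal)",
    "Georgia (Western)", "Florida (Northwest)", "Florida (Northeast)", "Hawaii",
    "California (Northern)", "Washington", "Nevada", "Texas",
    "North Carolina (Eastern)", "Georgia (Eastern)",
]
CONDITIONS = ["paper form", "online form", "QR code-linked online form"]

def assign_conditions(sites, conditions, seed=None):
    """Shuffle the sites, then deal them round-robin into roughly equal arms."""
    rng = random.Random(seed)
    shuffled = list(sites)
    rng.shuffle(shuffled)
    return {site: conditions[i % len(conditions)] for i, site in enumerate(shuffled)}

if __name__ == "__main__":
    for site, condition in sorted(assign_conditions(SITES, CONDITIONS, seed=2015).items()):
        print(f"{site}: {condition}")
```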
In the paper group, preceptors reported feedback on clerkship students in the form of a physical paper sheet collected by the on-site clerkship coordinator. In the online group, preceptors were given a link to a Survey Monkey feedback form, which included the RIME (Reporter, Interpreter, Manager, Educator) model, a framework for assessing learners in clinical settings (Figure 1).15 In the QR group, a unique QR code was designed for each site, which auto-linked the user to the online Survey Monkey feedback form that could be monitored and collated by the research coordinator. We directed sites to create several laminated 3x3-inch cards depicting the QR code, display them in the clinic and preceptor room, and distribute them to the rotating clerkship students. Faculty and resident preceptors were asked to download a free QR code reader on their mobile device and scan the QR code after their teaching encounter to complete the feedback form.

Evaluation of the innovation included two levels of measurement, first at the individual level and second at the site level. At the individual level, the main outcome measure was the respondents' perception of the usability of the feedback system. Usability is the ease of use and learnability of a system, regardless of whether it is paper or computer-based. For this evaluation, the validated System Usability Scale (SUS)16 was administered within the pre- and postsurvey. On this scale, scores range from 0 to 100. Previous research established that an SUS score above 68 would be considered above average, and therefore more usable, and anything less than 68 below average. We additionally assessed efficiency of preceptor feedback with a single ordinal-level question that asked the respondent to indicate how much time it took to complete the feedback (Figure 2).
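The SUS itself is only cited here, not reproduced; as a reminder of where the 0-100 scores and the cutoff of 68 come from, the sketch below assumes the standard 10-item, 5-point version of the scale and its conventional scoring rule. It is provided for orientation only and is not code from the study.

```python
# Conventional System Usability Scale (SUS) scoring, shown for orientation only.
# Assumes the standard 10-item instrument answered on a 1-5 Likert scale.
def sus_score(responses):
    """Return the 0-100 SUS score for ten 1-5 responses (item 1 first)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded (contribute r - 1);
        # even-numbered items are negatively worded (contribute 5 - r).
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5  # rescale the 0-40 raw sum to the familiar 0-100 range

# Example: a fairly favorable set of answers lands above the 68 cutoff the
# authors use to separate above-average from below-average usability.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```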
At the site level of measurement, we assessed the frequency and efficiency of preceptor feedback to medical students on the family medicine clerkship. Frequency measures included the number of feedback form submissions per student and the number of unique raters providing feedback data. Efficiency was also assessed by recording the timeliness of feedback submitted to the clerkship director.

Individual-level measures were collected with two administrations of an online survey. A baseline survey was disseminated to the 15 sites' coordinators and program directors in December 2015, prior to any feedback system changes. In this dissemination, we asked the coordinators to send the survey to their sites' preceptors (faculty and residents). Participants created a unique code to link pre- and posttest data. The survey was open for 2 weeks. The research team sent a reminder email 1 week before the survey closed. After the fourth clerkship rotation, we collected site-level measures from the clerkship site coordinators and clerkship director.

Table 1: Clerkship Site Descriptors*

Location                     No. of Family Medicine Residents   No. of Family Medicine Residency Faculty   Condition for Random Assignment   Condition Observed for Analysis
Community Hospitals
  California (Southern)      39                                 13                                         Online                            Online
  Nebraska                   24                                 9                                          Online                            Online
  Illinois+                  42                                 15                                         --                                QR
  Virginia                   45                                 15                                         QR                                Online±
  North Carolina (Coastal)   30                                 12                                         Paper                             Paper
  Georgia (Western)          21                                 7                                          QR                                QR
  Florida (Northwest)        30                                 10                                         Online                            Paper±
  Florida (Northeast)        36                                 10                                         Paper                             Paper
Academic Medical Centers
  Hawaii                     18                                 6                                          Online                            Online
  California (Northern)      39                                 13                                         Online                            QR±
  Washington                 18                                 6                                          QR                                QR
  Nevada                     36                                 12                                         Paper                             Paper
  Texas                      18                                 8                                          QR                                QR
  North Carolina (Eastern)   24                                 8                                          Paper                             QR±
  Georgia (Eastern)          18                                 15                                         QR                                QR

* Each site hosts 12 USU clerkship students per year.
+ The Illinois site was the pilot site for this intervention, so it was not randomized but continued with the QR innovation.
± Four sites did not implement the intervention to which they were randomized; reasons included site coordinator preference, technological barriers, and competing system expectations from other university clerkships.

Results

Process Outcomes
The clerkship site coordinator was charged with clerkship site implementation of the study protocol and acted as a gatekeeper for the diffusion of the innovation. Though all the sites were initially randomized into groups, some sites adopted the innovation more readily than others. Adoption was also challenged by limitations in the wireless network of clerkship sites. Since only USU medical students were expected to use the QR system, some preceptors used the assigned modality exclusively for USU students while using paper forms for all other students. This variety of feedback form and modality provided a barrier to adoption. Statistical comparisons were calculated based on the final observed conditions: four paper sites, four online sites, and seven QR code sites.

Individual-Level Outcomes
Of an estimated potential 597 preceptors, 92 completed the posttest survey, for a 15.41% response rate. Table 2 presents respondent characteristics. Across conditions, self-reported time to complete feedback forms was negatively correlated with usability (Spearman's rho(92) = -.259, P<.05). Completing feedback forms via QR code was associated with the shortest time to complete (χ2(8, n=91) = 19.40, P<.05; Figure 2). Among the 20 respondents with both pre- and posttest data, an analysis of covariance testing the observed feedback system on usability, using preintervention usability as a covariate, showed that the feedback system was significantly associated with usability (F(2, 16) = 12.68, P<.05), with QR code scoring highest, paper second, and online access lowest (mean=61.79, SD 7.73). Compared to the established SUS benchmark of 68, the online access condition in this subset fell below average (Table 3).
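The authors do not publish their analysis code, and only summary statistics are reported; purely as a hypothetical illustration of the two test types used for the individual-level outcomes (a Spearman rank correlation and a chi-square test of independence), the sketch below uses numpy and scipy with invented data. The row totals mirror the group sizes in Table 2, but every cell count is made up, so the printed statistics will not match the paper's.

```python
# Hypothetical illustration of the reported test types with invented data;
# requires numpy and scipy and does not reproduce the study's analysis.
import numpy as np
from scipy.stats import spearmanr, chi2_contingency

rng = np.random.default_rng(0)

# Spearman correlation between an ordinal time-to-complete rating (1-5, where
# higher means slower) and a continuous SUS score, analogous to the reported
# negative correlation between completion time and usability.
time_rating = rng.integers(1, 6, size=92)
sus_scores = 90 - 5 * time_rating + rng.normal(0, 8, size=92)
rho, p = spearmanr(time_rating, sus_scores)
print(f"Spearman rho = {rho:.3f}, P = {p:.3f}")

# Chi-square test of independence between feedback system (rows) and
# time-to-complete category (columns), analogous to the Figure 2 comparison.
# Row totals follow Table 2 (36 QR, 31 online, 24 paper); cell counts are invented.
counts = np.array([
    [20, 10, 4, 1, 1],  # QR code
    [12, 12, 5, 1, 1],  # online access
    [ 6, 10, 5, 2, 1],  # paper form
])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square({dof}) = {chi2:.2f}, P = {p:.3f}")
```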
Table 2: Respondent Characteristics for Individual-Level Measures

                         All Respondents Who Completed Posttest Surveys   Subset of Respondents Who Completed Both Pre- and Postsurveys
                         (n=92, representing 14 of 15 sites)              (n=20, representing 7 of 15 sites)
Evaluator role
  Resident               41 (44.6%)                                       8 (40%)
  Faculty                51 (55.4%)                                       12 (60%)
Feedback system used*
  Paper form             24 (26.1%)                                       7 (35%)
  Online access          31 (33.7%)                                       7 (35%)
  QR code access         36 (39.1%)                                       6 (30%)

* In the posttest survey, one respondent indicated a verbal response for feedback system.

Figure 2: Significant Difference* in Time to Complete Feedback
[Bar chart: number of respondents in each condition (QR code, online access, paper form) by self-reported time to complete feedback: less than 5 minutes, 5-10 minutes, 10-15 minutes, 15-20 minutes, and more than 20 minutes.]
* Completing feedback forms via QR code was associated with the shortest time to completion (χ2(8, n=91) = 19.40, P<.05).
Table 3: Outcome Measures by Feedback Access

                                                       QR Code   Online Access   Paper Form   Significance
Individual-Level Factor (Posttest Data Only)
  System Usability Scale                               77.67     75.80           76.03        n.s.*
Site-Level Factors Across Four Rotations
  Average forms completed per student                  11.66     10.85           12.63        n.s.
  Percentage of feedback submitted late                5.6%      4.9%            8.9%         n.s.
  Number of unique preceptors completing forms
  per rotation                                         13.86     19.81           18.06        n.s.

* Among the 20 cases of pre/post data, using preintervention usability as a covariate, the feedback system was significantly associated with usability (F(2, 16) = 12.68, P<.05).
ORIGINAL ARTICLES or as reflecting the views of the US Air Force, US Army, the Uniformed Services University 6. Kogan JR, Shea JA. Implementing feed- 14. Reynolds K, Barnhill D, Sias J, Young A, Polite of the Health Sciences, or the Department of back cards in core clerkships. Med Educ. FG. Use of the QR Reader to Provide Real- Defense at large. 2008;42(11):1071-1079. Time Evaluation of Residents’ Skills Follow- 7. Haghani F, Hatef Khorami M, Fakhari M. Ef- ing Surgical Procedures. J Grad Med Educ. CORRESPONDING AUTHOR: Address cor- fects of structured written feedback by cards 2014;6(4):738-741. respondence to Jasmyne J. Womack, De- on medical students’ performance at Mini 15. Sepdham D, Julka M, Hofmann L, Dobbie A. partment of Family Medicine, A-1038, Clinical Evaluation Exercise (Mini-CEX) in Using the RIME model for learner assessment 4301 Jones Bridge Road, Bethesda, MD an outpatient clinic. J Adv Med Educ Prof. and feedback. Fam Med. 2007;39(3):161-163. 20814. 301-295-9463. Fax: 301-295-3100. 2016;4(3):135-140. 16. Usability.gov. System Usability Scale. http:// Jasmyne.Womack.ctr@usuhs.edu. 8. Bandiera G, Lendrum D. Daily encounter cards www.usability.gov/how-to-and-tools/methods/ facilitate competency-based feedback while system-usability-scale.html. Accessed March References leniency bias persists. CJEM. 2008;10(1):44-50. 13, 2015. 1. Srinivasan M, Hauer KE, Der-Martirosian C, 9. Bennett AJ, Goldenhar LM, Stanford K. 17. Rogers EM. Diffusion of Innovations. 5th ed. Wilkes M, Gesundheit N. Does feedback mat- Utilization of a formative evaluation card New York: Free Press; 2003. ter? Practice-based learning for medical stu- in a psychiatry clerkship. Acad Psychiatry. 18. Phillips RL Jr, Bazemore AW, DeVoe JE, et al. dents after a multi-institutional clinical perfor- 2006;30(4):319-324. A family medicine health technology strategy mance examination. Med Educ. 2007;41(9):857- 10. Ferenchick GS, Solomon D, Foreback J, et al. for achieving the triple aim for US health care. 865. Mobile technology for the facilitation of direct Fam Med. 2015;47(8):628-635. 2. Ende J. Feedback in clinical medical education. observation and assessment of student perfor- JAMA. 1983;250(6):777-781. mance. Teach Learn Med. 2013;25(4):292-299. 3. Elnicki DM, Zalenski D. Integrating medical 11. Tews MC, Treat RW, Nanes M. Increasing students’ goals, self-assessment and preceptor completion rate of an M4 emergency medicine feedback in an ambulatory clerkship. Teach student end-of-shift evaluation using a mobile Learn Med. 2013;25(4):285-291. electronic platform and real-time completion. West J Emerg Med. 2016;17(4):478-483. 4. Greenberg LW. Medical students’ perceptions of feedback in a busy ambulatory setting: a 12. Mooney JS, Cappelli T, Byrne-Davis L, Lums- descriptive study using a clinical encounter den CJ. How we developed eForms: an elec- card. South Med J. 2004;97(12):1174-1178. tronic form and data capture tool to support assessment in mobile medical education. Med 5. Richards ML, Paukert JL, Downing SM, Bord- Teach. 2014;36(12):1032-1037. age G. Reliability and usefulness of clinical encounter cards for a third-year surgical clerk- 13. Stone A. Online assessment: what influences ship. J Surg Res. 2007;140(1):139-148. students to engage with feedback? Clin Teach. 2014;11(4):284-289. 194 MARCH 2018 • VOL. 50, NO. 3 FAMILY MEDICINE