EHR Usability Test Report of My Vision Express, 2018
EHR Usability Test Report of My Vision Express, 2018
Report based on NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing

My Vision Express, 2018
Date of Usability Test: 10/31/2017
Date of Report: 11/1/2017
Report Prepared By: Dr. Disha Biala & Dr. Jasmine Pattanayak
TABLE OF CONTENTS
1. Executive Summary
2. Introduction
3. Method
   3.1 Participants
   3.2 Study Design
   3.3 Tasks
   3.4 Procedure
   3.5 Test Location
   3.6 Test Environment
   3.7 Test Forms and Tools
   3.8 Participant Instructions
   3.9 Usability Metrics
   3.10 Data Scoring
4. Results
   4.1 Data Analysis
   4.2 Discussion of the Findings
   4.3 Major Findings
   4.4 Areas for Improvement
5. Appendices
   5.1 Appendix 1 - Participant Demographics
   5.2 Appendix 2 - Informed Consent Form
   5.3 Appendix 3 - Non-Disclosure Agreement
   5.4 Appendix 4 - Usability Instructions
   5.5 Appendix 5 - Safety Enhanced Design Test Scenarios
   5.6 Appendix 6 - Post-Test Questionnaire
EXECUTIVE SUMMARY

A usability test of My Vision Express, 2018, an ambulatory electronic health record software, was conducted from 10/25/2017 to 10/31/2017 from our office in India by My Vision Express Software employees. The purpose of this test was to test and validate the usability of the current user interface and to provide evidence of usability in the My Vision Express Software. During the usability test, 10 healthcare providers and/or other healthcare personnel matching the target demographic criteria and representing a cross section of our typical user base served as participants and used the EHR under test (EHRUT) in simulated but representative tasks.

This study collected performance data on a series of tasks related to safety-enhanced design, typically conducted on an EHR. The tasks correspond to the certification criteria in 45 CFR Part 170 Subpart C of the Health Information Technology: 2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record (EHR) Definition, and ONC Health IT Certification Program Modifications:
• §170.315(a)(1) Computerized Provider Order Entry - Medications
• §170.315(a)(2) Computerized Provider Order Entry - Laboratory
• §170.315(a)(3) Computerized Provider Order Entry - Diagnostic Imaging
• §170.315(a)(4) Drug-Drug, Drug-Allergy Interaction Checks
• §170.315(a)(5) Demographics
• §170.315(a)(6) Problem List
• §170.315(a)(7) Medication List
• §170.315(a)(8) Medication Allergy List
• §170.315(a)(9) Clinical Decision Support
• §170.315(a)(14) Implantable Device List
• §170.315(b)(2) Clinical Information Reconciliation and Incorporation

During the 60-minute, one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent form and a non-disclosure agreement, shown in Appendices 2 and 3 respectively. The participants were instructed that they could withdraw at any time.
All participants were current users of My Vision Express, so they had prior experience with some version of the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger, recorded user performance data on paper and electronically. The administrator did not assist participants in completing the tasks, except to clarify or paraphrase instructions when the directions seemed unclear. The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant's satisfaction rating of the system

All participant data was de-identified; no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of My Vision Express, 2018.

Following is a summary of the performance and rating data collected on My Vision Express, 2018. For every task, N = 10 participants, task success was 100% (standard deviation 0%), and ratings were given on a 5-point Likert scale (1 = Very Difficult, 5 = Very Easy). Task errors were 0% (SD 0%) except where noted. "Steps" gives the observed mean number of steps against the optimal number; "Time" gives the mean task time in seconds with its standard deviation, followed by the optimal time and the mean time deviation (observed minus optimal).

(a)(1).(1) Record a CPOE medication order: Tab Tylenol 325 mg, once daily for one month (RxNorm code 209387). Steps 12.1 / 12 optimal; Time 44 s (SD 7), optimal 36 s, deviation 8 s; Errors 0.8% (SD 26%); Rating 4.7 (SD 0.48).
(a)(1).(2) Change a CPOE medication order: Tab Hydrochlorothiazide 25 mg, once daily for one month (RxNorm 310798). Steps 7 / 7; Time 38 s (SD 3), optimal 31 s, deviation 7 s; Rating 4.5 (SD 0.71).
(a)(1).(3) Display the changed CPOE medication order: Tab Hydrochlorothiazide 25 mg, once daily for one month (RxNorm 310798). Steps 1 / 1; Time 3 s (SD 1), optimal 2 s, deviation 1 s; Rating 4.9 (SD 0.32).
(a)(2).(1) Record a CPOE lab order: Creatinine 24H renal clearance panel (LOINC code 34555-3). Steps 7 / 7; Time 27 s (SD 6), optimal 16 s, deviation 11 s; Rating 4.5 (SD 0.97).
(a)(2).(2) Change a CPOE lab order: Cell count panel (LOINC 34556-1). Steps 3 / 3; Time 9 s (SD 1), optimal 7 s, deviation 2 s; Rating 4.7 (SD 0.48).
(a)(2).(3) Display the changed CPOE lab order: Cell count panel (LOINC 34556-1). Steps 1 / 1; Time 3 s (SD 1), optimal 1 s, deviation 2 s; Rating 4.8 (SD 0.63).
(a)(3).(1) Record a CPOE imaging order: Fundus Photo (CPT 92250). Steps 7.1 / 7; Time 23 s (SD 3), optimal 17 s, deviation 6 s; Errors 1.4% (SD 44%); Rating 4.4 (SD 0.97).
(a)(3).(2) Change the CPOE imaging order: Corneal Topography (CPT 92025). Steps 3 / 3; Time 12 s (SD 2), optimal 8 s, deviation 4 s; Rating 4.7 (SD 0.48).
(a)(3).(3) Display the changed CPOE imaging order: Corneal Topography (CPT 92025). Steps 1 / 1; Time 2 s (SD 0), optimal 1 s, deviation 1 s; Rating 4.8 (SD 0.63).
(a)(4).(1) Using CPOE, trigger a drug-drug interaction by entering a new medication: Amoxicillin 500 mg (RxNorm code 262073), orally, twice a day for 30 days (2 capsules). Steps 5 / 5; Time 44 s (SD 9), optimal 41 s, deviation 3 s; Rating 4.5 (SD 0.53).
(a)(4).(2) Using CPOE, trigger a drug-allergy interaction by entering a new medication order: Glucosamine (Glucosamine Sulphate) 100 mg capsule, orally, once a day for 15 days. Steps 4 / 4; Time 30 s (SD 3), optimal 20 s, deviation 10 s; Rating 4.6 (SD 0.52).
(a)(4).(3) Adjust the severity level of a displayed drug-drug interaction. Steps 2 / 2; Time 15 s (SD 3), optimal 8 s, deviation 7 s; Rating 4.4 (SD 1.26).
(a)(5).(1) Record demographics: date of birth 06/02/1965; sex, Female; race, American Indian (OMB 1002-5); ethnicity, Hispanic Latino (OMB 2135-2); preferred language, English; sexual orientation, Bisexual; gender identity, Male-to-Female (MTF)/Transgender. Steps 16.2 / 16; Time 66 s (SD 3), optimal 62 s, deviation 4 s; Errors 1.3% (SD 41%); Rating 4 (SD 1.15).
(a)(5).(2) Change demographics: date of birth, 06/02/1987; sex, Male; race, White (OMB 2106-3); ethnicity, Declined to specify; preferred language, Spanish; sexual orientation, Straight/Heterosexual; gender identity, Identifies as Male. Steps 14.1 / 14; Time 49 s (SD 5), optimal 40 s, deviation 9 s; Errors 0.7% (SD 22%); Rating 4.4 (SD 0.97).
(a)(5).(3) Display the patient's changed preferred language, date of birth, birth sex, race, ethnicity, sexual orientation, and gender identity. Steps 1 / 1; Time 3 s (SD 0), optimal 2 s, deviation 1 s; Rating 4.9 (SD 0.32).
(a)(6).(1) Record a problem to the problem list: Non-proliferative diabetic retinopathy (disorder), ICD-10 E11.329, SNOMED 390834004. Steps 7 / 7; Time 19 s (SD 1), optimal 15 s, deviation 4 s; Rating 3.8 (SD 1.55).
(a)(6).(2) Change a problem on the problem list: resolve Internal hordeolum, ICD-10 H00.021, SNOMED 414521009. Steps 3 / 3; Time 4 s (SD 1), optimal 2 s, deviation 2 s; Rating 4.5 (SD 0.97).
(a)(6).(3) Display the active problem list: (Active) Non-proliferative diabetic retinopathy, ICD-10 E11.329, SNOMED 390834004. Steps 1 / 1; Time 3 s (SD 0), optimal 1 s, deviation 2 s; Rating 4.8 (SD 0.42).
(a)(6).(4) Display the historical problem list: (Active) Non-proliferative diabetic retinopathy, ICD-10 E11.329, SNOMED 390834004; (Resolved) Internal hordeolum, ICD-10 H00.029, SNOMED 414521009. Steps 1 / 1; Time 3 s (SD 1), optimal 1 s, deviation 2 s; Rating 4.5 (SD 1.27).
(a)(7).(1) Record a medication to the medication list: Keflex 750 mg, oral, twice daily for 15 days (RxNorm code 637175). Steps 11.1 / 11; Time 51 s (SD 5), optimal 45 s, deviation 6 s; Errors 0.9% (SD 28%); Rating 4.8 (SD 0.42).
(a)(7).(2) Change a medication on the medication list: discontinue Diamox Sequels 500 mg (RxNorm code 876439). Steps 6 / 6; Time 24 s (SD 3), optimal 20 s, deviation 4 s; Rating 4.6 (SD 0.70).
(a)(7).(3) Display the active medication list: (Active) Keflex 750 mg, oral, twice daily for 15 days (RxNorm code 637175). Steps 1 / 1; Time 3 s (SD 1), optimal 1 s, deviation 2 s; Rating 4.8 (SD 0.42).
(a)(7).(4) Display the historical medication list: (Active) Keflex 750 mg (RxNorm code 637175); (Discontinued) Diamox Sequels 500 mg (RxNorm code 876439). Steps 1 / 1; Time 3 s (SD 1), optimal 1 s, deviation 2 s; Rating 4.9 (SD 0.32).
(a)(8).(1) Record a medication allergy: Cytoxan (RxNorm code 202589), reaction diarrhea, severity severe. Steps 11 / 11; Time 30 s (SD 4), optimal 23 s, deviation 7 s; Rating 4.4 (SD 0.53).
(a)(8).(2) Change a medication allergy: inactivate Codeine (RxNorm code 2670), reaction diarrhea and abdominal pain, severity severe. Steps 4 / 4; Time 7 s (SD 2), optimal 4 s, deviation 3 s; Rating 4.6 (SD 0.52).
(a)(8).(3) Display the active medication allergy list: (Active) Cytoxan (RxNorm code 202589), reaction diarrhea, severity severe. Steps 1 / 1; Time 2 s (SD 1), optimal 1 s, deviation 1 s; Rating 4.9 (SD 0.32).
(a)(8).(4) Display the historical medication allergy list: (Active) Cytoxan (RxNorm code 202589); (Inactive) Codeine (RxNorm code 2670), reaction diarrhea and abdominal pain, severity severe. Steps 1 / 1; Time 2 s (SD 0), optimal 2 s, deviation 0 s; Rating 4.9 (SD 0.32).
(a)(9).(1) Add a CDS intervention and/or reference resource for each of the required elements: (a) problem list (Non-proliferative diabetic retinopathy, ICD-10 E11.329, SNOMED 390834004); (b) medication allergy list (Cytoxan, RxNorm code 202589); (c) medication list (Keflex 750 mg, RxNorm code 637175); (d) demographics (ethnicity: Hispanic or Latino); (e) combination (age > 50 and vitals: BMI > 25). Steps 65.2 / 65; Time 242 s (SD 23), optimal 199 s, deviation 43 s; Errors 0.3% (SD 9%); Rating 3.5 (SD 1.58).
(a)(9).(2) Trigger the CDS interventions/resources added using the applicable data elements from each of the required elements: problem list (Non-proliferative diabetic retinopathy, ICD-10 E11.329, SNOMED 390834004); medication allergy list (Cytoxan, RxNorm code 202589); medication list (Keflex 750 mg, RxNorm code 637175); demographics (ethnicity: Hispanic or Latino); combination (age > 50 and vitals: BMI > 25). Steps 37.1 / 37; Time 113 s (SD 7), optimal 100 s, deviation 13 s; Errors 0.3% (SD 9%); Rating 4.4 (SD 0.70).
(a)(9).(3) View the intervention/resource information using the Infobutton standard for data elements in the problem list, medication list, and demographics. Steps 3 / 3; Time 10 s (SD 1), optimal 8 s, deviation 2 s; Rating 4.9 (SD 0.32).
(a)(9).(4) Trigger the CDS interventions/resources based on data elements in the problem list, medication list, and medication allergy list by incorporating patient information from a transition of care/referral summary. Steps 10 / 10; Time 76 s (SD 5), optimal 70 s, deviation 6 s; Rating 4.9 (SD 0.32).
(a)(9).(5) Access the following attributes for the problem-list-triggered CDS interventions/resources: developer and funding source. Steps 1 / 1; Time 2 s (SD 0), optimal 2 s, deviation 0 s; Rating 5 (SD 0.00).
The implantable device list and clinical information reconciliation tasks follow, with the same conventions: N = 10, task success 100% (SD 0%), task errors 0% (SD 0%), ratings on a 5-point Likert scale.

(a)(14).(1) Record UDI: (01)10884521062856(11)141231(17)150707(10)A213B1(21)1234. Steps 11 / 11 optimal; Time 84 s (SD 12), optimal 54 s, deviation 30 s; Rating 3.5 (SD 1.27).
(a)(14).(2) Change the status of the Unique Device Identifier (UDI). Steps 5 / 5; Time 18 s (SD 5), optimal 9 s, deviation 9 s; Rating 4.6 (SD 0.97).
(a)(14).(3) Access UDI device description, identifiers, and attributes. Steps 1 / 1; Time 2 s (SD 0), optimal 1 s, deviation 1 s; Rating 5 (SD 0.00).
(b)(2).(1) Incorporate a CCDA and conduct reconciliation of the medications, allergies, and problems in the CCDA with the information currently in the patient's record. Steps 6 / 6; Time 82 s (SD 4), optimal 73 s, deviation 9 s; Rating 4.7 (SD 0.48).
(b)(2).(2) Generate a new CCDA with reconciled data. Steps 4 / 4; Time 8 s (SD 1), optimal 6 s, deviation 2 s; Rating 4.7 (SD 0.67).
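The derived statistics reported for each task (mean task time, its standard deviation, and the time deviation, i.e., observed mean minus optimal time) can be reproduced from raw per-participant timings. The sketch below is illustrative only; the timings and the helper name are ours, not taken from the study's data or tooling.

```python
# Minimal sketch of how the per-task time statistics in the summary
# tables are derived. The raw timings below are hypothetical.
import statistics

def task_stats(times_sec, optimal_sec):
    """Return (mean task time, standard deviation, mean time deviation),
    each rounded to the nearest second."""
    mean_time = statistics.mean(times_sec)
    sd_time = statistics.stdev(times_sec)   # sample standard deviation
    deviation = mean_time - optimal_sec     # observed minus optimal
    return round(mean_time), round(sd_time), round(deviation)

# Hypothetical timings for one task with an optimal time of 36 seconds
times = [44, 38, 51, 40, 47, 36, 49, 43, 45, 47]
mean_t, sd_t, dev_t = task_stats(times, optimal_sec=36)  # -> (44, 5, 8)
```

With these illustrative timings, the mean is 44 s and the deviation is 44 − 36 = 8 s, matching the way the deviation column relates to the mean and optimal time columns throughout the tables.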
Overall Ease of Use and Satisfaction of Tasks Performed During the Study

Ease of Use
• The overall subjective ease-of-use mean was 3.47 out of 5.
• The overall ease-of-use percentage was 69.4%.

Satisfaction
• The overall subjective satisfaction mean was 4.5 out of 5.
• The overall satisfaction percentage was 90%.

Major Findings

During the testing process, it was observed that the participants performed the majority of tasks within the expected number of steps. However, some of the participants exceeded the optimal range of steps and time while performing certain tasks under CPOE medication orders, imaging orders, CDS, the implantable device list, and demographics.

Participants found adding a new CDS intervention slightly difficult, struggling when saving the CDS interventions. Recording and changing demographic details were perceived as slightly troublesome by the participants: with no search option and a cumbersome selection process, participants struggled to select the correct ethnicity and race fields.

The implantable device list interface proved to be the highest-risk area, since the implantable device list is a newly added module to the EHR and the participants were not familiar with its functionality. The attributes pertaining to the UDI can be accessed only after making a precise entry, as the information depends on a third party. Manual entry of the UDI also makes it a tedious task for the participants.

Overall, the testing showed that participants sometimes exceeded the optimal times and exhibited path deviations. However, the participants were able to perform all the tasks without any error; this zero-error performance speaks to the usability of the software.
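The ease-of-use and satisfaction percentages above are consistent with simply rescaling each mean Likert rating to a percentage of the 5-point scale maximum. A minimal sketch of that conversion (the function name is ours, not the report's):

```python
# Illustrative sketch: convert a mean Likert rating (1-5 scale) to a
# percentage of the scale maximum, as the report's percentages suggest.
def likert_percentage(mean_rating, scale_max=5):
    """Mean rating as a percentage of the scale maximum, to one decimal."""
    return round(mean_rating / scale_max * 100, 1)

ease_pct = likert_percentage(3.47)          # -> 69.4, matching 69.4%
satisfaction_pct = likert_percentage(4.5)   # -> 90.0, matching 90%
```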
Areas for Improvement

The first area of improvement concerns CDS, since most of the participants faced difficulty while configuring the interventions. There was also some confusion as to where the alert would be shown while interacting with the product. This will therefore require a thorough review during end-user training to improve the user experience.

The second area of improvement identified was the implantable device list. This is a newly added module, so most users were unsure of how to add the UDI code. They spent extra time trying to enter the UDI into the text box on the main page, when the UDI needed to be entered after pressing the Add button. This will need to be addressed within end-user training.
INTRODUCTION

The EHR under test for this study was My Vision Express, 2018, which is tailored to present medical information to healthcare providers in the ophthalmology field. My Vision Express, 2018 provides care planning, patient education, prescription writing, and other core electronic health record (EHR) capabilities, along with practice management services. My Vision Express also provides important data security capabilities, including automatic backup and logoff, encrypted data transfer, recovery protection, secure remote access, password protection, and unique user IDs.

The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface and provide evidence of usability in My Vision Express, 2018. To this end, measures of effectiveness, efficiency, and user satisfaction, such as task time, deviations, and success rate, were captured during the usability testing.

METHOD

Participants

A total of 10 participants were tested on My Vision Express, 2018. Participants in the test were health care providers and other health care personnel. Participants were recruited by My Vision Express staff and were not compensated for their time. In addition, participants did not have a direct connection to the development of My Vision Express, 2018 or to the organization producing it, and they were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as actual end users would have received.

Recruited participants had a mix of backgrounds and demographic characteristics representing the cohort of intended users. The following table lists participants by characteristics, including demographics, professional experience, and computer experience. Participant names were replaced with participant IDs, so that individuals' data cannot be tied back to individual identities.
No assistive technologies were used for any of the participants.
Each row gives: Participant ID; Age; Sex; Education; Occupation/Role; Professional Experience; Product Experience; Computer Experience; Assistive Technology?

MVE01; 20-29; Female; Master's Degree; Business Analyst; 28; 3; 144; No
MVE02; 30-39; Male; Master's Degree; Test Lead; 120; 24; 120; No
MVE03; 20-29; Female; Master's Degree; Senior Business Analyst; 70; 6; 192; No
MVE04; 20-29; Female; Bachelor's Degree; QA Analyst; 60; 12; 120; No
MVE05; 30-39; Female; Master's Degree; Technician; 48; 8; 132; No
MVE06; 40-49; Male; Master's Degree; Manager; 180; 12; 180; No
MVE07; 30-39; Female; Bachelor's Degree; Optician; 96; 6; 180; No
MVE08; 30-39; Male; Associate Degree; Optician; 240; 8; 336; No
MVE09; 20-29; Female; Master's Degree; Customer Support Executive; 72; 9; 110; No
MVE10; 30-39; Male; Associate Degree; Technician; 144; 10; 192; No

Ten participants matching the description of intended users were recruited, and all ten participated in the usability test; no participants failed to show up for the study. Participants were scheduled for one 60-minute session, with time after the session for an overview by the administrator and data logger and to reset systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule and included each participant's demographic characteristics.

Study Design

Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of My Vision Express, 2018. In short, this testing serves both to record and benchmark current usability and to identify areas where improvements must be made.

During the usability test, participants interacted with My Vision Express, 2018. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Task deviations
• Participant's satisfaction ratings of the system
Additional information about the various measures can be found in the Usability Metrics section of this report.

Tasks

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with My Vision Express, including:
• Computerized Provider Order Entry (medication, laboratory, and diagnostic imaging)
• Drug-drug and drug-allergy interaction checks
• Patient demographic changes
• Problem list
• Medication list
• Medication allergy list
• Clinical Decision Support (CDS)
• Implantable device list
• Clinical Information Reconciliation and Incorporation

These tasks were selected based on the ONC 2015 Edition certification criteria listed in the Executive Summary, considering frequency of user interactions, potential risk of user error, and criticality of function. (See Appendix 5: Safety Enhanced Design Test Scenarios.)

Procedure

Upon arrival, participants were greeted; their identity was verified and matched to the names on the participant schedule. Participants were then assigned a participant ID. Each participant had already reviewed, signed, and returned the informed consent and non-disclosure agreement shown in Appendices 2 and 3 respectively. A representative from the test team witnessed each participant's signature.
To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff members conducting the test were experienced usability practitioners.

The administrator moderated the session, administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, provided clarification of directions as appropriate, and took notes on participant comments. A second person served as the data logger and took notes on task success, task deviations, and task failures.

Participants were instructed to perform the tasks:
• As quickly as possible, making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use, without reductions in ratings.
• Without using a think-aloud technique.

The administrator read the tasks aloud to the participants and then instructed them to begin performing the tasks. The participants had written copies of just the tasks. Exact details of the procedures used and tasks performed are shown in Appendix 5. The task time was stopped once the participant indicated they had successfully completed the task. At the end of each task, the participant rated the task on a scale of 1 to 5, with 5 being the easiest and 1 the most difficult. Time was recorded from the declaration to begin until the participant either completed the task successfully or failed the task.

Following the session, the administrator gave the participant the post-test questionnaire (e.g., the System Usability Scale; see Appendix 6) and thanked each individual for their participation. Participants' demographic information, task success rates, task performance times, task failures, task standard deviations, and post-test questionnaire responses were recorded into a spreadsheet.
Test Location

The test facility included a testing room with a table, a computer for the participant, and a recording computer for the administrator. The participant, the administrator, and the data logger viewed the screen simultaneously. To ensure that the environment was comfortable for users, noise levels were kept to a minimum and the ambient temperature was within a normal range. All of the safety instructions and evacuation procedures were valid, in place, and visible to the participants.

Test Environment

The My Vision Express software would typically be used in a healthcare office or facility. For testing, the computer used was a Lenovo running Windows 7 with a test database containing de-identified patient data. The participants logged in via GoToMeeting from their respective workstations. Participants used the mouse and keyboard when interacting with My Vision Express, 2018.

My Vision Express can function on a variety of screen sizes and resolutions. In order to display well on all monitors, a resolution of 1366 x 768 was used. My Vision Express, 2018 was set up by the in-house Information Technology staff members as it would be for a typical installation, and it was connected to the Internet via a wired connection.

Test Forms and Tools

During the usability test, various documents and instruments were used, including:
1. Informed Consent & Non-Disclosure Agreement
2. Usability Instructions
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices section. Prior to the commencement of the test, the participants were provided with the informed consent and non-disclosure agreement shown in Appendices 2 and 3
respectively. All participants signed and returned the forms. Also prior to the test, participants were given the rating metrics so they would have them available for reference during testing. The rating metrics sent were:
1 – Very Difficult
2 – Difficult
3 – Neither Difficult nor Easy
4 – Easy
5 – Very Easy

Immediately following the test, participants were asked the questions on the post-test questionnaire shown in Appendix 6.

Participant Instructions

At the beginning of each testing session, the administrator read the following instructions aloud to each participant:

Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time, you will use an instance of an electronic health record, My Vision Express. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. Please be honest with your opinions. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Should you feel it necessary, you may withdraw at any time during the testing.
Following the procedural instructions, participants were shown the EHR and given 30 minutes of instruction on using it. Once the training was completed, the administrator gave the following instructions:

For each task, I will read the description to you and say "Begin." At that point, please perform the task and say "Done" once you believe you have successfully completed the task. I would like to ask you not to talk aloud or verbalize while you are doing the tasks. I will ask for your impressions about the task once you are done.

Participants were then given a series of tasks to complete. Tasks are listed in Appendix 5.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:
• The effectiveness of My Vision Express, 2018, by measuring participant success rates and errors
• The efficiency of My Vision Express, 2018, by measuring the average task time and path deviations
• The satisfaction with My Vision Express, 2018, by measuring ease-of-use ratings

Data Scoring

The following table details how tasks were scored, how errors were evaluated, and how the time data was analyzed.

Measures - Rationale and Scoring
Effectiveness: Task success was determined by assigning numeric weights for various levels of task success, as follows: Task Success • Success (without assistance) = 1.0 • Partial success = 0.5 • Failure = 0.0 A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. A ‘Partial success’ if the participant was able to achieve the correct outcome with minimal assistance. The total number of successes were calculated for each task and then divided by the total number of times that task was attempted. The results were provided as a percentage. Task times were recorded for successes. The amount of time a task took minus the optimal time expected was recorded as the Task Time Deviation (in seconds) and was used to help determine partial success. Effectiveness: If the participant abandoned the task, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted Task Failures time before successful completion, the task was counted as a failure. No task times for failed tasks or tasks that exceeded the target task time were used in calculations. Participants were not asked to attempt the task more than once. The total number of errors was calculated by averaging the number of errors counted for each task. Not all deviations were counted as errors. Task failures were also expressed as the mean number of failed tasks per participant. A qualitative account of the observed errors and error types was collected. Efficiency: The participant’s navigation path (i.e., steps) through the application was recorded. Task Deviations occur if the participant, for example, went to a wrong Deviations screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The Optimal time was subtracted from the observed time to provide the deviation time. 
Path deviations are reported at a qualitative level for use in recommendations for improvement.

Efficiency: Task Time
Each task was timed from when the administrator said "Begin" until the participant said "Done." If the participant failed to say "Done," the timer was stopped when the participant ceased performing the task. Only task times for tasks that were successfully completed at or under the target time were included in the average task time analysis.
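The effectiveness and efficiency scoring rules above reduce to simple arithmetic. The following is a minimal sketch of that arithmetic for illustration only; all function and variable names are invented and are not taken from the study's actual analysis tooling.

```python
# Hypothetical sketch of the scoring rules in the Data Scoring table;
# names are invented for illustration.

# Numeric weights assigned to levels of task success.
SUCCESS, PARTIAL, FAILURE = 1.0, 0.5, 0.0

def success_rate(scores):
    """Sum of success weights divided by the number of attempts, as a percentage."""
    return 100.0 * sum(scores) / len(scores)

def deviation_time(observed_seconds, optimal_seconds):
    """Task Time Deviation: observed time minus the optimal (expert) time."""
    return observed_seconds - optimal_seconds

def mean_errors(errors_per_task):
    """Average number of errors counted across tasks."""
    return sum(errors_per_task) / len(errors_per_task)

# Example: 10 participants attempt a task; 9 succeed without assistance and
# 1 achieves the correct outcome with minimal assistance (partial success).
rate = success_rate([SUCCESS] * 9 + [PARTIAL])  # 95.0 (percent)
over = deviation_time(72, 60)                   # 12 seconds over the optimal time
```

Under this weighting, ten attempts with nine full successes and one partial success score 95%, which is how success rates such as those reported in the findings can arise.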
Average time per task and variance measures were calculated for each task for use in the result analysis.

Satisfaction: Task Rating
The participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These ratings were averaged across participants.

Results

Data Analysis and Reporting
The results of the usability test of the specific safety-enhanced-design tasks were calculated according to the methods specified in the Usability Metrics section above. The usability testing results for My Vision Express are detailed below. A numerical summary covering all of the tasks is presented in tabular form in the Executive Summary section of this report. The results should be viewed in light of the objectives and goals outlined in the Study Design section. The data yielded actionable findings that, once addressed, should have a material, positive impact on user performance.

Discussion of the Findings
The test findings are discussed below:

1. (a.1) CPOE- Medication Order
Effectiveness- The success score for changing a medication order and displaying the changed order was 100%. However, some participants had slight difficulty recording medication orders; the success rate for recording a medication order via CPOE was 95%.
Efficiency- Although the participants exceeded the optimal time while performing the tasks, all participants completed the tasks within the optimal number of steps, as suggested by the number of steps taken by expert users.
Satisfaction- The participants were familiar with these tasks and gave an average satisfaction rating of 4.7 out of 5 points on a Likert scale.

2. (a.2) CPOE Laboratory Order
Effectiveness- The success score for each of the tasks under CPOE Lab order was 100%.
Efficiency- Most participants exceeded the optimal time for each task; however, they successfully performed the tasks within the optimal number of steps, as suggested by the number of steps taken by expert users.
Satisfaction- The participants were familiar with these tasks and gave an average satisfaction rating of 4.6 out of 5 points on a Likert scale.

3. (a.3) CPOE Imaging Order
Effectiveness- The success score for changing a CPOE image order and displaying the changed order was 100%. However, some participants had slight difficulty recording image orders; the success rate for recording an image order via CPOE was 95%.
Efficiency- The participants completed the tasks successfully within the optimal number of steps; however, they exceeded the optimal time while performing the tasks, though without any errors.
Satisfaction- The participants had an average satisfaction rating of 4.6 out of 5 points on a Likert scale. Most of the participants were familiar with these tasks.

4. (a.4) Drug-Drug and Drug-Allergy Interaction Checks
Effectiveness- The participants were able to successfully trigger drug-drug and drug-allergy interactions and adjust the severity levels, yielding a task success score of 100%.
Efficiency- The participants completed the tasks successfully within the optimal time for each task and with either fewer steps than, or the same number of steps as, expert users.
Satisfaction- The participants were familiar with these tasks and gave an average satisfaction rating of 4.5 out of 5 points on a Likert scale.

5. (a.5) Demographics
Effectiveness- The success score for recording demographic details was 90%, and for changing demographic details it was 95%. All participants were able to display the changed demographic entries, for a success score of 100%.
Efficiency- All participants completed the tasks within the same number of steps as expert users. Some participants exceeded the optimal time while performing the tasks; however, they completed the tasks successfully without any errors.
Satisfaction- The participants gave an average satisfaction rating of 4.4 out of 5 points on a Likert scale.

6. (a.6) Problem List
Effectiveness- All participants successfully recorded a new problem, changed the status of the problem, and displayed the active and historical problem lists, with a success score of 100%.
Efficiency- All participants completed the task of displaying the historical problem list within the optimal time and the optimal number of steps. However, participants exceeded the optimal time while recording a problem, changing a problem on the problem list, and displaying the active problem list.
Satisfaction- Participants had an average satisfaction rating of 4.4 out of 5 points on a Likert scale.

7. (a.7) Medication List
Effectiveness- The success score for recording a medication to the medication list was 95%. The participants successfully completed the remaining tasks, changing a medication and displaying the active and historical medication lists, with a success score of 100%.
Efficiency- The participants completed the tasks of changing a medication on the medication list and displaying the active and historical medication lists within the optimal time and the optimal number of steps. However, participants exceeded the optimal time and number of steps while recording a medication to the medication list.
Satisfaction- The participants had an average satisfaction rating of 4.7 out of 5 points on a Likert scale. Most participants were familiar with these tasks and found them easy to complete.

8. (a.8) Medication Allergy List
Effectiveness- All participants were able to successfully record and change a medication allergy and display the active and historical medication allergy lists, with a success score of 100%.
Efficiency- All participants were able to record a medication allergy and display the active and historical medication allergy lists within the optimal time and the optimal number of steps. Participants exceeded the optimal time while changing the medication allergy; however, they completed the tasks without any errors.
Satisfaction- The participants had an average satisfaction rating of 4.7 out of 5 points on a Likert scale. Most participants were familiar with these tasks and found them easy to complete.

9. (a.9) Clinical Decision Support
Effectiveness- The participants had some trouble adding CDS interventions for the required elements and triggering them, as reflected in success scores of 90% and 95%, respectively.
The success score for viewing the resource information using the Infobutton standard, triggering the CDS interventions during reconciliation, and accessing the attributes of the triggered CDS intervention alert for the Problem List was 100%.
Efficiency- Participants exceeded the optimal time and number of steps while adding CDS interventions for the required elements. However, all participants were able to trigger the CDS interventions, view the intervention information, and access the attributes within the optimal number of steps and the optimal time, as suggested by the timings and steps of expert users.
Satisfaction- The participants gave an average satisfaction rating of 4.5 out of 5 points on a Likert scale.

10. (a.14) Implantable Device List
Effectiveness- The success score for each of the tasks was 100%. Participants were able to record the UDI, change the UDI status, and access the UDI, device description, identifiers, and attributes successfully.
Efficiency- All participants performed the tasks within the optimal number of steps. Participants exceeded the optimal time while recording the unique device identifier; however, they completed the task without any errors.
Satisfaction- Since this was a newly added EHR component, the participants were unfamiliar with the tasks. They had an average satisfaction rating of 4.3 out of 5 on a Likert scale; nevertheless, they performed the tasks easily.

11. (b.2) Clinical Information Reconciliation and Incorporation
Effectiveness- The success score for incorporating a CCDA and conducting reconciliation, as well as for generating a new CCDA with the reconciled data, was 100%.
Efficiency- All participants exceeded the optimal number of steps and the optimal time while incorporating a CCDA and conducting reconciliation, and while generating a new CCDA with the reconciled data. However, the participants completed the tasks successfully without any errors.
Satisfaction- The participants had an average satisfaction rating of 4.7 out of 5 points on a Likert scale. Most participants were familiar with these tasks and found them easy to complete.

Major Findings

During the testing process, participants performed the majority of tasks within the expected number of steps. However, some participants exceeded the optimal number of steps and the optimal time while performing certain tasks under the CPOE Medication order, Image order, CDS, Implantable device list, and Demographics criteria. Participants found adding a new CDS intervention slightly difficult, as they struggled while saving the CDS interventions. Recording and changing demographic details were also perceived as slightly troublesome: with no search option and a cumbersome selection process, participants struggled to select the correct ethnicity and race values. The implantable device list interface proved to be the highest-risk area, since it is a newly added module and the participants were not familiar with its functionality. The attributes pertaining to a UDI can be accessed only after a precise entry is made, as the information depends on a third party, and manual entry of the UDI makes the task tedious. Overall, the testing showed that participants sometimes exceeded the optimal time and that path deviations occurred. However, the participants were able to perform all the tasks without any errors, which supports the overall usability of the software.

Areas for Improvement

The first area for improvement concerns CDS, since most of the participants had difficulty configuring the interventions. There was also some confusion as to
where the alert would be shown while interacting with the product. This will therefore require a thorough review during end-user training to improve the user experience.

The second area for improvement is the Implantable device list. This is a newly added module, so most users were unsure of how to add the UDI code. They spent extra time trying to enter the UDI into the text box on the main page, when the UDI needed to be entered after pressing the Add button. This will also need to be addressed during end-user training.

APPENDICES
1. Participant Demographics
2. Informed Consent
3. Non-Disclosure Agreement
4. Usability Instructions
5. Safety Enhanced Design Test Scenarios
6. Post-Test Questionnaire

APPENDIX 1- PARTICIPANT DEMOGRAPHICS

Below is a summary of participant demographics for this study.

1. Participant Gender
Participant Gender    (N=10)    %
Male                       4    40
Female                     6    60

2. Participant Age
Participant Age    (N=10)    %
20-29                   4    40
30-39                   5    50
40-49                   1    10

3. Participant Education
Participant Education    (N=10)    %
Associate's Degree            2    20
Bachelor's Degree             2    20
Master's Degree               6    60

4. Participant Roles
Participant Role              (N=10)    %
Business Analyst                   1    10
Test Lead                          1    10
Senior Business Analyst            1    10
QA Analyst                         1    10
Technician                         2    20
Manager                            1    10
Optician                           2    20
Customer Support Executive         1    10

5. Participant Professional Experience
Participant Professional Experience (N=10)    Months    Years
Mean                                           105.8     8.81

6. Participant Computer Experience
Participant Computer Experience (N=10)    Months    Years
Mean                                       170.6    14.21

7. Participant Product Experience
Participant Product Experience (N=10)    Months    Years
Mean                                        9.8     0.81
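As a cross-check on the experience tables above, each Years value appears to be the corresponding Months value divided by 12 and truncated (not rounded) to two decimal places, e.g., 105.8 / 12 = 8.8167 → 8.81. This is an inference from the figures, not a method stated in the report; the following minimal sketch (with an invented helper name) reproduces the conversion under that assumption.

```python
def months_to_years(months):
    """Convert months to years, truncating to two decimal places.

    Assumption: this reproduces the Years column in the tables above,
    which matches the reported figures if truncation (not rounding)
    was used (e.g., 105.8 months -> 8.81 years).
    """
    return int(months / 12 * 100) / 100

months_to_years(105.8)  # professional experience
months_to_years(170.6)  # computer experience
months_to_years(9.8)    # product experience
```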
APPENDIX 2- INFORMED CONSENT FORM

My Vision Express would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last approximately 60 minutes.

Agreement
I understand and agree that, as a voluntary participant in the present study conducted by My Vision Express, I am free to withdraw consent or discontinue participation at any time. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of My Vision Express and with My Vision Express clients. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in the analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
[ ] YES, I have read the above statement and agree to be a participant.
[ ] NO, I choose not to participate in this study.

Participant's printed name- ………………………………………..……………
Signature- …………………………………………………………………...…..
Date- ………………..……………………………………………………………
Email Address- …………………………………………………………………..

APPENDIX 3- NON-DISCLOSURE AGREEMENT

This agreement is entered into as of _____________, 2017, between ___________________________ ("the Participant") and My Vision Express. The Participant acknowledges that his or her voluntary participation in today's usability study may bring the Participant into possession of Confidential Information. The term "Confidential Information" means all technical and commercial information of a proprietary or confidential nature which is disclosed by My Vision Express, or otherwise acquired by the Participant, in the course of today's study. By way of illustration, but not limitation, Confidential Information includes trade secrets, processes, formulae, data, know-how, products, designs, drawings, computer-aided design files and other computer files, computer software, ideas, improvements, inventions, training methods and materials, marketing techniques, plans, strategies, budgets, financial information, and forecasts.
Any information the Participant acquires relating to this product during this study is confidential and proprietary to My Vision Express and is being disclosed solely for the purposes of the Participant's participation in today's usability study. By signing this form, the Participant acknowledges that s/he will not receive any monetary compensation for feedback and will not disclose the confidential information obtained today to anyone else or to any other organization.

Participant's printed name- …………………………………………………………
Signature- ……………………………………………………………………………
Date- ………………………………………………………………………………….

APPENDIX 4- USABILITY INSTRUCTIONS

Usability Instructions (MM/DD/YYYY)
Administrator    Data Logger    Date    Time    Participant ID

Orientation
Thank you for participating in this study. You will be helping us evaluate the workflows pertaining to the ONC's EHR vendor certification requirements. Our session today will last approximately 60 minutes, during which you will be given an overview of My Vision Express and the modules involved in this testing.
The product you will be using today is My Vision Express, which you are already familiar with. It is, however, a non-production version, so all patients are fictitious. As you go through the workflows, please keep in mind that it is My Vision Express under review here, not you. During this study, you will be asked to complete a few tasks using My Vision Express and to answer some questions. We are interested in your feedback on the ease of use of this product, on what you find most useful in it, and on how it can be improved. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors and deviations. Please save your detailed comments until the end of a task, or the end of the session as a whole, when we can discuss them freely. Please feel free to be honest with your opinions. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Do you have any questions or concerns?

Preliminary Questions
Okay, now we need to ask you a few questions about yourself. What is your:
Education
Occupation/Role
Professional Experience
Computer Experience
Product Experience

On a scale of 1 to 5, with 5 being "most familiar," how would you rate your knowledge of the following:
Computerized Physician Order Entry
Drug-Drug and Drug-Allergy Interaction Checks
Patient Demographics
Problem List
Medication List
Medication Allergy List
Clinical Decision Support (CDS)
Implantable Device List
Clinical Information Reconciliation and Incorporation
Electronic Prescribing
APPENDIX 5- SAFETY ENHANCED DESIGN TEST SCENARIOS

The tasks are prioritized in accordance with the risks associated with user errors.

1. (a.9) Clinical Decision Support {User Interaction- Moderate, Risk- Moderate}

Tasks
1. Add a CDS intervention and/or reference resource for each of the required elements:
a. Problem List- Nonproliferative diabetic retinopathy (ICD-10- E11.329, SNOMED- 390834004)
b. Medication Allergy List (RxNorm Code- 202589, Cytoxan)
c. Medication List (RxNorm Code- 637175, Keflex 750 mg)
d. Demographics (Ethnicity- Hispanic or Latino)
e. Combination- Age > 50 and Vitals- BMI > 25

Expected Task Time         Actual Task Time
Optimal Path:
Correct Deviations    If any, no. of deviations observed ________
Observed errors:
Rating: On a scale of 1 to 5, with 5 being "very easy to use," how would you rate the ease of use for completing this task?    1  2  3  4  5

2. Trigger the CDS interventions/resources added, using the applicable data elements from each of the required elements:
a. Problem List- Nonproliferative diabetic retinopathy (ICD-10- E11.329, SNOMED- 390834004)
b. Medication Allergy List (RxNorm Code- 202589, Cytoxan)
c. Medication List (RxNorm Code- 637175, Keflex 750 mg)
d. Demographics (Ethnicity- Hispanic or Latino)
e. Combination- Age > 50 and Vitals- BMI > 25

Expected Task Time         Actual Task Time
Optimal Path:
Correct Deviations    If any, no. of deviations observed ________