© CGI Group Inc. CONFIDENTIAL
Low-Cost Rapid Usability Testing: Is it Worth the Effort?
Tristin Baylis, MSc
Senior Technical Consultant
Overview
•Background
•What is Usability Testing
•Cost-benefit Analysis
•Results
•Questions
Faculty/Presenter Disclosure
Presenter: Tristin Baylis
Relationships with commercial interests:
•Tristin Baylis is a Senior Technical Consultant with CGI
•This Study was conducted with the support of Andre Kushniruk – University of Victoria
•Permission was granted by the BC Ministry of Health
What is Usability Testing
$19 Billion
Usability testing is a branch of usability engineering that focuses on analyzing and improving user interactions with computer systems.
Usability Testing
Traditional Usability Test Labs
Low-cost Rapid Usability Testing
“Low-Cost Rapid Usability Testing” is a form of usability testing that removes the need for an expensive test lab.
A portable laboratory is created with as little equipment as a laptop, a video camera and screen recording software, and is taken into a real-world setting where testing is conducted.
Low-Cost Rapid Usability Testing also implements a simplified approach to analysis of video data obtained from testing sessions.
Low-cost Rapid Usability Testing
Low-cost rapid usability testing requires few physical materials. The basic testing setup requires a laptop to run the software and equipment to record the user's actions. The costs of all materials were tracked.
Cost-Benefit Analysis
1. Conducting Low Cost Rapid Usability Testing
2. Analysis of Costs
3. Cost-Benefit Analysis
BC Ministry of Health CDM Toolkit
Chronic Disease Management (CDM) Toolkit: a secure web application developed as an information management and decision support tool for chronic disease management.
Conducting Low Cost Rapid Usability Testing
Low-cost Rapid Usability Test Procedure:
1. Pre-Test Setup: Test equipment was set up, questionnaires were prepared, and testing data was set to a consistent start point.
2. Overview: Before testing began, subjects were given an overview of the testing process and asked to sign a consent form.
3. Perform Test Scenarios: 8 subjects were asked to complete 2 predefined test scenarios using the CDM Toolkit (computer screens, audio and external interactions were recorded).
4. Post-Task Interview: When the subjects were done testing, they were asked several open-ended questions while still being recorded.
5. Questionnaire: The subjects were then asked to complete a questionnaire about their overall impression of the system and the issues they encountered.
6. Analysis: The collected data (video recordings, screen captures and questionnaires) was analyzed to determine the application errors encountered and the severity of each error.
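The analysis step above can be sketched as a simple tally of coded observations by severity. A minimal illustration; the error descriptions, severity labels and data layout are hypothetical, not the study's actual coding scheme:

```python
from collections import Counter

# Hypothetical coded observations from the session recordings:
# each entry is (error_description, severity_label).
observations = [
    ("label unclear on search form", "low"),
    ("patient list fails to refresh", "high"),
    ("date picker rejects valid input", "medium"),
    ("patient list fails to refresh", "high"),
]

# Tally how often each severity level was observed across subjects.
severity_counts = Counter(severity for _, severity in observations)
print(dict(severity_counts))  # {'low': 1, 'high': 2, 'medium': 1}
```

Grouping the same tallies by error description rather than severity would show which individual errors recurred across subjects.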
Results: Low-cost Rapid Usability Testing
Cost-benefit Analysis
The cost-benefit analysis was completed by comparing the cost of performing Low-Cost Rapid Usability Testing against its benefits under 3 different scenarios:
1. Direct Measurable Savings
2. Cost of errors relative to when they are resolved in the SDLC
3. Cost of Medical Error
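In general terms, each scenario compares the cost of running the testing against the cost avoided by catching errors early. A minimal sketch of that comparison; the avoided-cost figures and the percent-savings formula are plausible assumptions for illustration, not values or formulas quoted from the study (only the testing cost of $8,362.91 comes from the slides):

```python
def net_benefit(testing_cost: float, avoided_cost: float) -> tuple[float, float]:
    """Return (savings, fraction_saved) for one cost-benefit scenario.

    savings        = avoided_cost - testing_cost
    fraction saved = savings / avoided_cost
    (An assumed reading of the comparison, not the study's stated method.)
    """
    savings = avoided_cost - testing_cost
    return savings, savings / avoided_cost

testing_cost = 8_362.91  # total cost of the usability testing (from the study)

# Hypothetical avoided costs for the three scenarios.
scenarios = {
    "direct measurable savings": 12_000.00,
    "errors resolved late in the SDLC": 25_000.00,
    "cost of a medical error": 40_000.00,
}
for name, avoided in scenarios.items():
    savings, frac = net_benefit(testing_cost, avoided)
    print(f"{name}: save ${savings:,.2f} ({frac:.1%})")
```

Under this reading, the percent saving grows with the cost of the avoided error, which is consistent with the range of savings the study reports across its three cases.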
Analysis of Costs
The first step of completing the cost-benefit analysis was to determine the total cost of the usability testing, which came to $8,362.91.
Cost-benefit analysis
As indicated in the table below, a cost savings was found in all 3 cost-benefit cases.
Discussion: Cost-Benefit Analysis
In all cases a positive cost-benefit was found, with the savings attributed to the implementation of low-cost rapid usability testing.
The average cost savings was $15,774.45, a total percent savings of 59.9% compared to the impact of errors going undetected and potentially causing a technology-induced error.
It should be noted that the most conservative case was used in all cost-benefit analyses. If the extreme case were considered (i.e. each medical error costing $600,000), a total cost savings of $1,791,637.09 could be achieved.
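The extreme-case figure can be checked arithmetically: it equals a multiple of the $600,000 per-error cost minus the $8,362.91 testing cost. The slide does not state how many errors are avoided; the count of three below is inferred here because it reproduces the quoted figure exactly:

```python
testing_cost = 8_362.91            # total cost of the usability testing (from the study)
cost_per_medical_error = 600_000.00  # extreme-case cost per medical error (from the slide)
errors_avoided = 3                 # inferred assumption, not stated on the slide

# Net savings = avoided medical-error costs minus the cost of the testing.
net_savings = errors_avoided * cost_per_medical_error - testing_cost
print(f"${net_savings:,.2f}")  # $1,791,637.09
```

That the arithmetic lands exactly on the slide's figure supports this reading, but the assumed error count should be confirmed against the underlying study.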
Conclusion
Early detection of errors (i.e. prior to production release) allows organizations to achieve a 36.5% to 78.5% cost saving compared to the impact of errors going undetected and causing a medical error.
Overall, Low-Cost Rapid Usability Testing was found to be a cost-effective technique that can be implemented alongside other testing techniques (e.g. unit testing, black box testing, white box testing, clinical simulations) to develop health information systems with a lower incidence of technology-induced errors.
Questions?
References
BC CDM Toolkit Documentation and Guides. Retrieved March 19, 2011 from, http://www.health.gov.bc.ca/access/guides/index.html
BC Medical Services Commission Payment Schedule. Retrieved August 25, 2010 from, http://www.health.gov.bc.ca/msp/infoprac/physbilling/payschedule/index.html
BC Ministry of Health Services Drug Data Files. Retrieved August 25, 2010 from, http://www.health.gov.bc.ca/pharmacare/outgoing/index.html
Jaspers MW. (2009). A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform. May; 78(5):340-53.
Jeffries, R., Miller, J. R., Wharton, C., Uyeda, K. M. (1991). User interface evaluation in the real world: A comparison of four techniques. Proc. ACM CHI'91 Conf. (New Orleans, LA, April 28-May 2), 119-124.
Karat, C., Campbell, R., Fiegel, T. (1992). Comparison of empirical testing and walkthrough methods in user interface evaluation. In Proceedings of CHI'92 (Monterey, California, May 3- 7, 1992), ACM, New York, 397-404.
Karat, C. (1993). Usability Engineering in Dollars and Cents. IEEE Software, May 1993, pp. 88-89.
Kushniruk, A. W., & Patel, V. L. (2004). Cognitive and usability engineering methods for the evaluation of clinical information systems. Journal of Biomedical Informatics, 37(1), 56-76.
Kushniruk, A. W., Borycki, E. M. (2006). Low-Cost Rapid Usability Engineering: Designing and Customizing Usable Healthcare Information Systems. Healthcare Quarterly, 9(4), 98-100, 102.
McConnell, S. (2004). Code Complete (2nd ed.). Microsoft Press. pp. 960.
Nielsen, J. (1993). Usability engineering. Boston: Academic Press, Inc.
Nielsen, J. (1994). Usability Inspection Methods. New York: Wiley.
Nielsen, J. (1994). Using discount usability engineering to penetrate the intimidation barrier. In R. G. Bias & D. J. Mayhew (eds.), Cost-Justifying Usability, Academic Press.
Appendix A: Subject Selection Criteria
Subject Criteria
Criteria Description
Subjects: All subjects were British Columbia Ministry of Health Services employees who currently use the CDM Toolkit. Subjects were to fall into one of the following categories of users: physicians, nurses, medical office assistants, administrators.
Selection Criteria
The following criteria were used to select subjects:
Familiarity with basic computer/internet use (i.e. searching for information on the web, navigational tools such as a mouse, web interfaces) was required.
Previous exposure to the application being studied was required.
Subjects had to be proficient in the English language.
Subjects could not be affiliated with the development and implementation of the application being studied.
Subjects could not be application stakeholders.
Appendix B: Basic Usability Savings
Appendix C: Basic Usability Savings
Appendix D: Measurable Cost of Medical Error
Appendix E: Complete Cost of Medical Error