Evaluating the Usability of Web-based Applications
A Case Study of a Field Study
Sam J. Racine, PhD
Unisys Corporation
What does “Usability” mean?
The measure of how a given user operates a given interface in a given context effectively, efficiently, and with satisfaction
How do we measure usability?
Goals determine techniques
Techniques determine test designs
Test designs determine results
Results must be interpreted
What is “Contextual Inquiry”?
Ethnographic observation of users in their environment
A flexible methodology that produces open-ended results
A technique that is excellent for beginning or revisiting a UI design process
What does contextual inquiry provide?
An understanding of what users are really doing day to day, in order to translate their tasks into an effective UI design
Contextual inquiry applied
Unisys LMS Enterprise Services (ES), a web-based cargo management application
• Basis for “sister” applications
• Basis for other web-based applications in similar environments and markets
• Desire to build on our successes (and failures)
Our goals
Determine the real-world value that users (not customers) assign to our application
Learn what users really do, not just what they say they do
Separate the “lore” from the “reality”
Our technique
A field study methodology to learn about
• users: their background, needs, and working environment
• interactions with the application: types of tasks, frequency, and completion time
• available support material and derived work-arounds
• ‘training’ and to whom users go for help
External factors affecting test design
Customer site
• Management
• Employees
• Contractors
• Schedules
Stakeholders
Logistics
Internal factors affecting test design
Expertise
Personnel availability
Budget
More stakeholders
More logistics
Our test design
Two weeks, two evaluators
First week: observation; second week: analysis
First week: ‘note taking;’ second week: discussion and responses to users’ questions
Open-ended details:
• daily revamp of the test design
• the intervening weekend finalized the second week’s details
“Rules” for data gathering
Note comments verbatim
For a survey, repeat questions exactly
For information, customize your approach
Note the response, the participant, and the context
Respect each evaluator’s approach
Note what users do as well as what they say
Be flexible
“Rules” for sorting data
Pay attention to what users do
• but don’t discount what they say
Allow categories to emerge from the data
Categorize after collecting data
• no preconceptions
Assign weight according to user needs and your own, separately
Don’t dismiss anomalies
• measure against the participant and context
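The sorting rules above — verbatim notes tagged with participant and context, categories that emerge only after collection, weight assigned per participant rather than per mention — can be sketched in code. This is a hypothetical illustration, not part of the original study: the note records, the keyword-based `categorize` function, and the weighting scheme are all invented for the example.

```python
from collections import defaultdict

# Hypothetical field notes: each record keeps the verbatim comment
# together with the participant and the context it was observed in.
notes = [
    {"participant": "P1", "context": "shift change", "comment": "The status screen is slow"},
    {"participant": "P2", "context": "shift change", "comment": "I never trust the status screen"},
    {"participant": "P1", "context": "data entry",   "comment": "I keep a paper cheat sheet"},
    {"participant": "P3", "context": "data entry",   "comment": "My cheat sheet lists the codes"},
]

def categorize(note):
    """Assign an emergent category to a note.

    The categories here ('status screen', 'work-around') are invented;
    in practice they emerge only after all the data are collected.
    """
    text = note["comment"].lower()
    if "status screen" in text:
        return "status screen"
    if "cheat sheet" in text:
        return "work-around"
    return "uncategorized"

# Cluster the notes by emergent category.
clusters = defaultdict(list)
for note in notes:
    clusters[categorize(note)].append(note)

# Weight each category by how many distinct participants raised it,
# so one vocal user does not dominate the findings.
weights = {cat: len({n["participant"] for n in recs}) for cat, recs in clusters.items()}
print(weights)  # {'status screen': 2, 'work-around': 2}
```

Keeping participant and context on every record makes it possible to measure an anomaly against who said it and where, rather than discarding it outright.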
“Rules” for dealing with users and stakeholders
Work according to their comfort level
• not yours
Treat each user as a CEO
Know which forms of communication your stakeholders value
Be prepared with an arsenal of communication methods
• nonverbal like diagramming or picture drawing
• bring a camera
Appreciate, don’t bribe
The test
Step 1: Preparation
Establish climate and expectations
• Announcement to participants
• Confirmation of arrival logistics
• Practice observation
Step 2: Observation only
Gather data
• “Take a note”
• No “answers” or “corrections”
• Participation survey and identification
• Investigation of concerns expressed by managers and users
• Note everything!
Step 3: Analysis
Sort observations according to organic categories
• Sites of difficulty
• Clustered impressions
• Patterns
• Preliminary findings
• Training opportunities
Step 4: Directive Activities
Test categories and verify findings
• Cognitive walkthroughs
• Directed activities
• Demonstrations
Findings: Data
The Internet is a new creature
• changed expectations
• changed operation
Training and orientation are two different things
Users value accuracy
Domain experts require a supportive UI
“Old” eyes are everywhere
Findings: Methodology
A ‘true’ source exists for most complaints
Multiple sources don’t guarantee validity
The order in which data are discovered affects interpretation
Significance is assigned by both users and evaluators
What did we do well?
Had patience, patience, patience
Staggered observations across times and shifts
Swallowed our pride and rejected seduction
Asked for help
Made the most of what we had
Timed our exit
Brought a camera
What could we have improved?
Extended to multiple sites
Sought a different quarter
Better pre-testing, identification, and administration of surveys
Brought a video camera
Recommendations
Contextual inquiry requires
• observation
• communication
• detection
• theoretical foundation
In other words, use a professional!
Questions?
Thank you!
Sam Racine, Unisys [email protected]