Evaluating a product's usability and the broader user experience it offers

For the Society for Technical Communication, Nov 21 2002

Suzanne Currie, Usability and UI Design, [email protected]


Agenda

• Tell you how I conduct evaluations
• Jump right in to usability evaluation
  – 4 types of evaluations
  – Case projects
• Bits and pieces related to conducting evaluations
• Experience evaluation, focus groups
• Discussion (throughout)


Similar disciplines

Instructional Systems Design (ISD)
• Based on learning and instructional theory
• Aims:
  – improve human performance
  – increase efficiency and effectiveness
  – ensure the quality of instruction
  – maximize the learning experience

Human Computer Interaction (HCI)
• Based on human information processing theory (perceptual, cognitive, and motor)
• Aims:
  – create successful interaction between people and computers
  – maximize performance of human and computer together as a system

How do these common models complement one another?
– ADDIE ISD model
– User-Centred Design model (UCD)
– Software engineering lifecycle


Types of products

• Paper-based
• Computer-based
• LAN-based
• Internet-based
• Electronic books on PDAs or information appliances


Usability

A usable application allows the user to focus on the task at hand, not on the application.

• To reach usability, the application must:
  – Match the way users work
  – Behave predictably
  – Support users' cognition and perception skills
• Usability evaluations
  – … confirm or dispute how well an application works for the users, not how well the users perform with the application
  – Results provide excellent input for design improvement


User experience

Everything felt, observed, and learned through awareness and interaction with a company’s space, products, services, and communication. (HannaHodge)

• Encompasses all potential user touch points
  – Application graphical user interfaces
  – Documentation
  – Training
  – Customer contact systems
• Customer Touch Points Strategy
• The heart of user experience design is reaching high user satisfaction AND strong human performance


Evaluation types in the software world

• Common confusion around types of evaluation activities
• Three broadly defined types of evaluations
  – Specification compliance – Does the software comply with its specification?
  – Software performance – Does the software meet business goals for operability and performance?
  – Usability and experience – Does the software meet the needs and desires of the direct and indirect users?

• Make sure your client understands the differences and has the right expectations


Reasons for evaluation

• Evaluate before the users do … they will, sooner or later
  – Suggest improvements to the design
  – Confirm that the product meets the usability specifications
  – Confirm acceptability of the interface and/or supporting materials
  – Ensure that it meets customer expectations
  – Compare alternative designs (depoliticise the comparison of designs)
  – Match or exceed the usability of competitors' products
  – Ensure that it complies with any statutory requirements such as ISO or accessibility


User needs drive requirements; huge impact if the user needs aren’t properly represented

1 user need → 10 features → 100 tech spec items (Alan Cooper)


User-centered design and usability

[Diagram: user-centred approach versus system-centred approach]


Common complaints about usability evals

• Insufficient audience sample
• Providing sufficient rationale for judgements
• Backing up your findings
• Test and analysis rigor
• Being “ad hoc” about test structure
• Communicating findings to IT, Marketing, and shareholders sympathetically
• Dogmatism and extremism


Some suggestions

• Define test objectives with the client
• Define recruitment criteria with the client
• Add more structure to the evaluations
• Keep track of task completion
• Improve test moderation skills
• Conduct more thorough analysis
• Provide meaningful recommendations


Usability as design and evaluation

Formative evaluations
• early stages of the development lifecycle
• iterative design refinement
• tend to be structured and informal, inexpensive, and rapid

Summative evaluations
• conclusion of a development effort
• quality control and standards compliance
• tend to be formal, statistical, expensive, and time-consuming

• Usability evaluation is commonly thought to be an evaluation of a product after it’s been developed
• Very powerful in the design phase, e.g. “I design to support usability”
• When you evaluate depends on the goals for evaluation and the state of the product


Different evaluations for different phases

Goal | Phase of development | Type of evaluation
– Identify usability issues very early in the process, at the first stage of prototyping | Paper mockup | Usability walkthrough
– Identify how well your product meets the usability guidelines and heuristics, not necessarily discrepancies between user needs and the design | Electronic mock-up, application prototype | Heuristic evaluation
– Directly observe how well your product works for users | Application prototype, coded application | Usability test
– Identify usability issues with regard to discrepancies between user needs and the design | Application prototype, coded application | Expert evaluation
– Inspect how well your product conforms to established standards | Coded application | Compliance audit


Discuss 4 of these types

• Usability walkthrough
  – Prototyping
  – Scenarios
• Heuristic evaluation
  – Heuristics
• Usability testing
  – Moderating the test
  – Test plans
• Expert evaluation
  – Analysis


Usability walkthrough

The Basics
• Early usability evaluation
• Paper prototype
• With users (1-6+)
• Facilitated by researcher
• Scenarios
• Conference room

Case: Australian Wheat Board – online trading of wheat

What’s needed
– Write scenarios
– Create prototype
– Make copies of the prototype for each user
– Dry run the walkthrough

During the evaluation
– Describe the process to participants
– Introduce the scenario
– Step through the scenario using the mockup
– Prompt for user feedback
– Instruct users to write down the actions they would take
– Record comments

After the evaluation
– Analyze all of the comments
– Prioritize the issues
– Make recommendations for prototype improvement
– Revise the prototype


Prototyping

Benefits
• Requirements capture
• Reveals problems / prevents gross mistakes
• Allows evaluation and discussion from designers and users
• Users feel involved
• Results in better usability
• Economical way of testing designs

Stages of prototyping
• Paper-based (low fidelity)
  – Sticky notes with labels arranged on a piece of paper
  – Users write all over the prototype, move sticky notes around, draw pictures
  – Printouts from a PowerPoint mockup can be used, but use a handwriting font
• Electronic mock-ups
  – PowerPoint
• High fidelity
  – In the delivery medium

Prototyping is central to design iteration and refinement; iterate 4-5 times.


Writing scenarios

• Narrative that is written by the researcher from information gained from SMEs and/or users

• Scenarios describe:
  – Users and their goals
  – Work practice
  – Actions the user will take to accomplish goals
  – Responses from the product
• Aspects
  – Paint a picture of the ideal usage
  – Keep them ‘technology agnostic’


Scenario example

• Suzanne and Greg are very excited about their move to Australia. They’ve never lived abroad and aren’t sure if they’ll be able to afford a nice place to rent. They decide to look on the Internet for places to rent to see what they can afford.

• After they connect to The Age’s website, it is obvious where to find the rentals section. It is clear and easy to understand.

• They find a house in a desirable area, but the house they have found is too expensive, so they look for other houses.

• They are pleased to find a number of different types of houses. They select all of the houses they are interested in, and print those.

• They decide to check out another newspaper’s listing. The experience there is frustrating by comparison. They cannot tell what neighborhoods the houses are in, and cannot tell where the neighborhoods are in relation to the CBD.

• Frustrated with this site, Suzanne and Greg return to The Age’s website to continue exploring houses in other neighborhoods. They are pleased to see that the site remembers their selections.


Heuristic evaluation

The Basics
• Midpoint usability evaluation
• Electronic mockup
• Without users
• Heuristics (rules of thumb for good design)
• Evaluator(s)
• Private office

Case: Australia Vinyl – public information website

What’s needed
– Know the evaluation goals
– Design and validate the heuristics
– Have the list of the heuristics and their definitions in hand
– Evaluator(s)
– Time

During the evaluation
– The evaluator familiarizes themselves with the product enough to know how to get around and what can be found
– Systematically go through each heuristic, noting any violations against it
– Assess the macro and the micro elements
– Identify the element that violates the heuristic
– Describe how the element violates the heuristic

After the evaluation
– Analyze all of the violations
– Draw conclusions about the larger impact of the violations
– Make design recommendations that solve the problems caused by the violations
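
One lightweight way to record violations while stepping through the heuristics is a simple structured log; the sketch below is only an assumed illustration (the field names, severity scale, and example entries are mine, not part of the method), but it makes the roll-up at analysis time easy:

```python
from collections import Counter

# Hypothetical violation log kept while stepping through each heuristic.
violations = [
    {"heuristic": "Visibility of system status", "element": "File upload page",
     "description": "No progress indicator while a file uploads", "severity": 3},
    {"heuristic": "Error prevention", "element": "Order form",
     "description": "Date field accepts free text and fails silently", "severity": 4},
    {"heuristic": "Visibility of system status", "element": "Search results",
     "description": "No feedback when a query returns nothing", "severity": 2},
]

# Roll up violations per heuristic to see where the design is weakest.
by_heuristic = Counter(v["heuristic"] for v in violations)
for heuristic, count in by_heuristic.most_common():
    print(f"{heuristic}: {count} violation(s)")
```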


Usability heuristics

Usability heuristics for software (easily repurposed for paper-based products)
• Match between the system and the real world
• Consistency and standards
• Visibility of system status
• Error prevention
• Error recovery
• User control and freedom
• Visual feedback
• Aesthetic and minimalist design
• Recognition rather than recall
• Make the user smart
• Flexibility and efficiency of use
• Help and documentation


Abbreviated heuristics

• Content
• Interaction and level of engagement
• Navigation and efficiency of use
• Orientation
• Presentation and visual integrity
• Structure and hierarchy of information


User experience heuristics

• Provides a rewarding experience
• Appropriately challenging
• Not anxiety- or phobia-producing
• Inviting; not intimidating
• Fosters curiosity; is inspirational
• Fosters a desire for accomplishment
• Worthy of exploration
• Browsing is rewarded
• Supports the user's sense of style
• Provides an interesting experience
• Effectively manages distance between the author and user personas (the voice of the product versus the voice of the product as it relates to the user)


Usability testing

The Basics
• Late usability evaluation
• Structured
• Coded application
• With users
• Test tasks
• Facilitator
• Researcher
• In the field or in the lab

Case: HP – public website

What’s needed
– Know the product’s purpose, evaluation goals, and target audience
– Decide the test environment
– Decide the location of the evaluator: next to the user or in a separate room
– Recruit users according to criteria
– Decide how much qualitative versus quantitative
– Decide on the think-aloud protocol
– Create the test structure
– Design the user test tasks
– Create a test script (play-by-play)
– Review good communication practices for facilitating a test
– Dry run the test (con’t …)


Usability testing

During the evaluation
– User is welcomed and fills out a permission to videotape/audiotape
– User pre-survey
– Moderator briefs the user and establishes rapport
– Moderator introduces the user to the test
– User performs tasks while thinking aloud
– Moderator keeps in close contact with the user
– Moderator performs measurements and takes notes
– Moderator moves the user to the next task
– Good moderation practices are followed
– User is thanked
– Notes are finalized, and the user leaves
– Next user is welcomed

After the evaluation
– Finalize all of the notes and artefacts
– Separate qualitative from quantitative
– Affinity-diagram all of the findings
– Draw conclusions about the larger impact of the issues
– Make design recommendations that solve the problems caused by the usability issues


Approaches to usability evaluation

Very rough!!

• Traditional HCI approach ………… 30 days, $40k+
  – Formal method, lab coat, stopwatch, metrics-based
• Discount usability (Nielsen) ………… 2-7 days, $1k-8k
  – Also known as “Guerilla HCI”, qualitative
• Structured quick approach ………… 7+ days, $15-20k+
  – Qualitative and some quantitative


Guerilla versus structured

Guerilla Test
• broad test objectives
• 5 participants
• 1-1.5 hour sessions
• run with user activity scenario
• probe as-you-go
• qualitative only
• minimal noting

Structured Test
• narrow test objectives
• 6-8+ participants
• 1.5-2 hour sessions
• run with appropriate test tasks
• conduct task benchmarking (no probing until the task is complete)
• qualitative and quantitative – capture times, success rate, etc.
• more questionnaires
• detailed noting


Planning the test

• Know your users
• Know the market and domain
• Craft test objectives
• Identify the factors to record
• Define product success criteria
• Choose types of test tasks
• Select test tasks
• Construct test tasks
• Choose the evaluation environment
• Sample test structure
• Conducting the test


Know your users

• Expertise level (novice, intermittent, frequent)
• Familiarity with specific hardware and software
• Information access needs, e.g. summary level or detailed level
• Information retrieval preferences, e.g. search, browse
• Motor skill level with regard to delivery medium
• General educational level
• Domain knowledge and related skill level
• Age, gender, other considerations


Know the market and the domain

• Market information
  – Identify all the markets
  – Select the markets that are germane to the evaluation
  – Identify the types of people within those markets to participate in the evaluation
• Domain knowledge
  – The domain, to a large extent, determines the context of use
  – Understand the domain and the human strategies and skills invoked by that domain


Crafting test objectives

• Broad objectives
  – “Let’s find all the problem spots”
• Narrow objectives
  – “Let’s identify what’s inhibiting the user’s productivity”

• When creating objectives, define them to the most narrow or detailed level possible


Identify the factors to record

• Speed of operation
• Completion rate
• Error-free rate
• Satisfaction rating
• Advanced feature usage
• Path analysis
• Probing
• Emotions
• User suggestions
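
To make the quantitative factors concrete, here is a minimal sketch (the field names and data are invented for illustration, not taken from the presentation) of how completion rate, error-free rate, and time on task might be tallied from per-task session records:

```python
# Illustrative only: tally a few of the quantitative factors
# (completion rate, error-free rate, mean time on task) from
# hypothetical per-task session records.

sessions = [
    {"user": "P1", "task": "Find a rental", "completed": True,  "errors": 0, "seconds": 95},
    {"user": "P2", "task": "Find a rental", "completed": True,  "errors": 2, "seconds": 140},
    {"user": "P3", "task": "Find a rental", "completed": False, "errors": 3, "seconds": 300},
]

n = len(sessions)
completion_rate = sum(s["completed"] for s in sessions) / n    # tasks finished
error_free_rate = sum(s["errors"] == 0 for s in sessions) / n  # sessions with no errors
mean_time = sum(s["seconds"] for s in sessions) / n            # speed of operation

print(f"Completion rate:   {completion_rate:.0%}")
print(f"Error-free rate:   {error_free_rate:.0%}")
print(f"Mean time on task: {mean_time:.0f} s")
```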


Define product success criteria

How good is the product? (Report these findings.)

Rating | Task completion | No. of users that like using the site
– Exceeds expectations | 0 errors | 90%+
– Meets expectations | 1-2 errors | 75-90%
– Is minimal | 2+ errors | 50-75%
– Is unacceptable | 0 complete | Under 50%
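
As a rough illustration of applying these criteria, the sketch below maps the two measures onto the ratings in the table; the function and its inputs are my own invention, and a real report would weigh the columns against each other rather than mechanically taking the worse of the two:

```python
def rate_product(errors_per_task: float, pct_users_who_like_site: float) -> str:
    """Map the two measures from the table above onto a rating (illustrative only)."""
    if errors_per_task == 0 and pct_users_who_like_site >= 90:
        return "Exceeds expectations"
    if errors_per_task <= 2 and pct_users_who_like_site >= 75:
        return "Meets expectations"
    if pct_users_who_like_site >= 50:
        return "Is minimal"
    return "Is unacceptable"

# Example: one error per task on average, 82% of users like the site.
print(rate_product(errors_per_task=1, pct_users_who_like_site=82))  # -> Meets expectations
```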


Choose type of test tasks

• Atomized tasks
  – discrete, small
  – unlike user activity scenario tasks, these don’t make up a single story
  – suitable for measurement and randomizing
• Exploratory tasks
  – non-directed
  – to capture users’ initial reactions
• User activity scenario tasks
  – tasks are suited to procedural applications
  – tasks “tell a story”
  – watch for this … users often get lost in the detail, or the user doesn’t identify with the scenario
• “User-designed” tasks
  – tasks are designed by the users themselves
• Systematic exploration of the site components
  – tasks test site components instead of user tasks


Selecting test tasks

• Typical tasks - 3
  – Tasks performed 80% of the time
• Critical tasks - 2
  – High-priority tasks which may be very expensive to the company
• Problematic tasks - 2
  – Known trouble spots
• Infrequent tasks - 1
  – Tasks that occur infrequently, but which may determine user satisfaction or may be expensive to the company


Constructing test tasks

• 2-4 sentences in length
• Indicate the typical context of use, including:
  – The characters and the situation
  – The environment where the task is likely to be performed
  – Appropriate time pressures
  – Appropriate level of detail
• “Sally is in a rush and suddenly realizes her mom’s birthday is fast approaching. She decides to send flowers. Mom loves gladiolas. Show me how Sally goes about sending her mom flowers.”
• “Robert needs to install an updated driver for his printer. Show me how he does this.”


Evaluation environments

• In the field
  – Immersed in context of use
  – Slightly removed from context of use
• In the conference room
• In the lab or focus group site
  – Researcher in the room with the participant
  – Researcher behind the one-way mirror
• Remote evaluation
  – Tester and observers are not in the same locale


Note on usability labs

• SCHIL usability lab at Swinburne Uni, Melbourne
• Hiser’s Experience Lab, Melbourne
• IDEA Lab, Melbourne Uni, Melbourne

[Floor plans shown on the slide: yellow marks the test room, blue the observation room]

Considerations
• Methods and techniques
• Equipment needs
• Flexible and reconfigurable
• Appropriate ambience: comfortable, warm environment


Note on the lab environment

• Audio
• Video
• Document camera
• Picture-in-picture
• Microphones
• Capture users’:
  – Facial expressions
  – Pen or mouse location (scan converter)


Sample test structure

• Pre-test questionnaire (in the waiting area)
• Greet participant
• Explain test to participant
• Optional start task
  – Participant takes 10 minutes to familiarize themselves with the product and provide first impressions
• Run tasks:
  1. Participant performs task
  2. Moderator determines success of completion
  3. After the task has ended, probing
  4. Optional task evaluation
  5. Repeat
• Post-test interview
• Post-test questionnaire
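
Purely as an illustration, the structure above can be turned into the play-by-play test script mentioned earlier by expanding each task into the same perform / judge / probe / evaluate steps; the task names below are hypothetical:

```python
# Hypothetical task names; each task expands into the same
# perform / judge / probe / evaluate steps from the structure above.
tasks = ["Find a rental listing", "Print the selected houses", "Return to saved selections"]

session_plan = (
    ["Pre-test questionnaire (waiting area)", "Greet participant", "Explain the test",
     "Optional start task (10-minute familiarization, first impressions)"]
    + [step for task in tasks
       for step in (f"Task: {task}", "Judge completion", "Probe", "Optional task evaluation")]
    + ["Post-test interview", "Post-test questionnaire"]
)

for i, step in enumerate(session_plan, 1):
    print(f"{i:2d}. {step}")
```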


Conducting the test

• Observe and record the user’s behaviour and comments
• Think-aloud protocol
• Establishing rapport
• Ask open-ended questions
• Dealing with difficult personalities
• Users blaming themselves
• Judging task success/failure
• Questioning users
  – Positive questions – all questions should be asked in a positive, supportive and non-threatening way
  – Open – encourage an individual to talk and provide maximum information
  – Closed – can be answered in a few words or sentences
  – Probing – usually to follow up on a response to ask for more details


Reminder

Testing is about observation … witnessing
• It’s important to keep chatting to an absolute minimum
• The participant should be speaking at least 80% of the time
• Use clear, direct language
• Have a neutral, accepting style … not harsh and not too familiar


Ethical treatment of users

• General ethics
  – Consider users first
  – Safeguard users’ rights, interests, and sensitivities
  – Communicate research objectives
  – Protect the privacy of users
  – Must not exploit users
  – Make privacy policies available to users
• Specific practices
  – Permission to audio/videotape
  – Stated purpose of audio/videotapes
  – Non-identifying artefacts
  – Non-identifying results in the report


Expert evaluation

The Basics
• Late usability evaluation
• Application prototype or coded app
• Without users, but with personas
• User tasks in hand
• Design principles, rules, heuristics, experience
• Expert evaluator
• Private office

Case: Scape – entertainment site with loyalty program attached

What’s needed
– Know the evaluation goals
– Know the intent of the product and the user tasks that need to be supported
– Have a list of sample user tasks and sample user profiles in hand
– Evaluator(s)
– Time

During the evaluation
– The evaluator familiarizes themselves with the product enough to know how to get around and what can be found
– Systematically go through each user task and try to accomplish it
– Note the usability issues that are likely to frustrate users or prevent them from accomplishing tasks
– Record the issue and its impact on task accomplishment


Expert evaluation

After the evaluation
– Analyze and categorize all of the issues
– Roll up the issues to comment on the weaknesses/strengths of the product
– Provide an indication of which user tasks are most at risk
– Create a long issues table containing:
  • Product component
  • Usability issue
  • Significance
  • Severity
  • Suggestions
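
A plain CSV is one convenient form for that long issues table, since clients can sort and filter it; the sketch below is an assumed example (the column names follow the list above, the findings themselves are invented):

```python
import csv

# Invented findings from an expert evaluation, one row per issue.
issues = [
    {"Product component": "Checkout", "Usability issue": "Loyalty points total is hidden below the fold",
     "Significance": "Undermines the loyalty-program goal", "Severity": "High",
     "Suggestions": "Show the running points total next to the order total"},
    {"Product component": "Event search", "Usability issue": "Date filter resets after every search",
     "Significance": "Forces rework on a frequent task", "Severity": "Medium",
     "Suggestions": "Persist filter settings for the session"},
]

# Write the long issues table as a CSV the client can sort and filter.
with open("expert_evaluation_issues.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(issues[0].keys()))
    writer.writeheader()
    writer.writerows(issues)
```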


After the evaluation

• Analysis and synthesis
• Reporting results


Analysis and synthesis

• Review observations or findings
• Rate findings
  – Severity
  – Priority to fix
  – Frequency, impact, and persistence
• Notice patterns
• Draw conclusions about tendencies
• Look for the “root cause” of usability or satisfaction issues
• Use analysis rooms and sticky notes
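
If findings are captured in a structured form, the rating and pattern-spotting steps can be supported with something as small as the sketch below; the severity and frequency scales and the example findings are assumptions, not a prescribed scheme:

```python
# Hypothetical findings rated on severity (1 = cosmetic, 4 = blocks the task),
# priority to fix, and how many participants hit the problem (frequency).
findings = [
    {"issue": "Rentals link buried in the footer", "severity": 4, "priority": 1, "frequency": 6},
    {"issue": "Jargon in the search form",          "severity": 2, "priority": 3, "frequency": 3},
    {"issue": "Print button mislabelled",           "severity": 3, "priority": 2, "frequency": 5},
]

# Sort so the most severe, most frequently seen issues surface first.
for finding in sorted(findings, key=lambda x: (x["severity"], x["frequency"]), reverse=True):
    print(f"{finding['issue']}: severity {finding['severity']}, "
          f"seen by {finding['frequency']} participants")
```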


Reporting results

• “In their words”
• Users’ artefacts
• Testimonials
• Highlights tapes
• Likes as well as dislikes
• Always point out the positive as well as the negative


Additional evaluation types

• User experience evaluation
• Activity-based focus groups


User Experience evaluation

• Assessing the user experience strategy
• Reviewing these strategies:
  – Positioning, brand, content strategies
• Measuring:
  – Brand perception
  – Task completion
  – Error rates
  – Patterns of behaviour

Case: Foxtel – call centre and CRM apps


Focus groups

• Discussion vs. activity-based focus groups

• Some activity types:
  – Gallery walk
  – Money spend
  – Bulls eye
  – Theatre and role-playing
  – Collages

Case: Pharmaceutical company

• Heart attack victim – You’re a 36-year-old man who has suffered a major heart attack, and are not health-conscious … yet.

• “My name is Martin. 36 year old successful investment banker. Story of my life, living life fully. Just had a major heart attack. I’m not feeling very healthy. I don’t believe in tablets and things. I think I need alternate therapies and take it easy. The heart attack helped me become aware of how my life has changed, how hectic it was. I’ve been researching alternative medicines. I don’t believe in doctors, so I’ve been looking on the web. [probe - why don’t you trust doctors?] it’s not that I don’t trust doctors, I’ve just never been sick.”

• Analysis: fears, frustrations, desires


Wrap up

• Design to support usability and a satisfying user experience
• Conduct formative evaluations early and often to iterate the design
• Choose an evaluation type that meets the research goals
• Craft the evaluation goals and metrics with the client
• “Sink in” to the role of analysis and results reporting
• Usability blessings
  – May you focus on what people actually do
  – May your product match the way your users work
  – May your product come into compliance with your users; not the other way ’round
  – May you not stray from studying activities in the “natural” setting in which they occur


Industry groups

• UPA, Usability Professionals’ Association
• ACM SIGCHI, Association for Computing Machinery’s Special Interest Group on Computer-Human Interaction
• HFES, Human Factors and Ergonomics Society
• AIGA, American Institute of Graphic Arts
• ISPI, International Society for Performance Improvement
• STC, Society for Technical Communication


Good books

• Usability Inspection Methods, Jakob Nielsen and Robert L. Mack, editors
• Designing Web Usability, Jakob Nielsen
• Flow: The Psychology of Optimal Experience, Mihaly Csikszentmihalyi
• Studying Those Who Study Us, Diana E. Forsythe
• The Visual Display of Quantitative Information, Edward R. Tufte
• The Art of Human-Computer Interface Design, Brenda Laurel, editor
• Object Modeling and User Interface Design: Designing Interactive Systems, Mark van Harmelen, editor