Introduction to User Experience Methods
Danielle Gobert Cooley @dgcooley
03 April 2013 #IUE2013
About me
• 14 years as user researcher/usability specialist
• BE, Biomedical & Electrical Engineering
• MS, Human Factors in Information Design
• Selected Employers & Clients
Important Things to Know About UX Methods
Please Remember
The purpose of these methods is to inform your design.
They are not validation methods.
Let Me Repeat That
The purpose of these methods is to inform your design.
They are not validation methods.
You Are Not Your User
YOU
NOT YOU
Why Do It? To Avoid Ending Up Here
One More Thing…
The purpose of these methods is to inform your design.
They are not validation methods.
Usability Study
Questions It Answers*
• How easy or difficult is it to use the product?
• How efficiently do people use the product?
• Do the users understand the product’s terminology?
• Do the controls make sense?
• Can people find the information they are seeking?
* Depends in part on prototype fidelity … more on that in a few moments.
How It’s Done
1. Recruit representative end users.
2. Observe impartially as they attempt to perform tasks with a prototype.
3. Typically, participants are asked to think aloud as they use the prototype to perform the tasks. This provides insight into WHY certain interface elements are confusing and what might work better.
Tips…
– Recruiting the right users is key!
– Avoid bias everywhere – in task phrasing, in your and your observers' body language, and in the verbal questions asked.
– Recordings are great, but huge time sucks.
– Quantitative studies often aren't worth it.
A Note About Prototype Fidelity
Advantages
• Controlled setting means easier logistics.
• Recording and observing is easier, too.
• For the rare quantitative study, lab-based testing makes it easier to use such tools as Morae or Ovo.
• Lab-based testing has fewer variables to control, which can be a factor for more rigid studies.
Disadvantages
• Lab setting provides no context of use.
• Labs can be expensive to rent or build (but they don't have to be).
• Participants are sometimes timid in a lab setting.
Field Study
Questions It Answers
• How do environmental circumstances affect the usability of the product?
• How have people worked around issues with the product?
How It’s Done
1. Recruit representative end users.
2. Observe impartially, in the environment in which the product will be used, as they attempt to perform tasks with a prototype.
3. Collect artifacts.
Advantages
• Gathers contextual data
  – Ambient light, noise
  – Distractions
• Participants usually less intimidated
• Much more convenient for participants, so recruiting can be easier
Contextual Inquiry?
Though the terms are often used interchangeably, Contextual Inquiry is actually a type of field study that follows a very specific format.
Disadvantages
• Logistics are more difficult for researchers.
• Observation is more challenging.
• Recording is more challenging.
• Security issues sometimes prohibit photographs or other recording.
Card Sort
Questions It Answers
• How would the users organize the product’s content and features?
• Do the users largely agree on how the content should be organized?
• Do the users agree with the categorizations proposed by the project team?
How It’s Done
1. Recruit representative end users.
2. Identify content items to be categorized.
3. Participants sort the content items into groupings that make sense to them.
Two types…
– In an OPEN card sort, participants create the categories.
– In a CLOSED card sort, the researcher establishes the categories.
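One common way to summarize open card sort results is a co-occurrence count: how many participants placed each pair of cards in the same group. High counts suggest content that belongs together in the final architecture. A minimal sketch in Python; the card names and participant data below are hypothetical:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards lands in the same group.

    `sorts` holds one entry per participant: a list of groups,
    each group a list of card names.
    """
    pairs = Counter()
    for groups in sorts:
        for group in groups:
            # Sort within each pair so (A, B) and (B, A) count together.
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Hypothetical data: three participants sorting four cards.
sorts = [
    [["Pricing", "Plans"], ["Docs", "API"]],
    [["Pricing", "Plans", "Docs"], ["API"]],
    [["Pricing", "Plans"], ["Docs"], ["API"]],
]
counts = cooccurrence(sorts)
print(counts[("Plans", "Pricing")])  # 3 — all participants grouped these together
```

With remote card sort tools the raw groupings are usually exportable, so a summary like this takes minutes to produce.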
Advantages
• Incredibly inexpensive
• Done very quickly with remote evaluation tools
• Asynchronous, so scheduling is not an issue; participants take part at their convenience
Disadvantages
• More complicated with large sets of cards.
• Really, there’s almost no reason NOT to do a card sort, unless you don’t plan to use the results.
Tree Test
Questions It Answers
• Can users find content in the proposed navigation?
• Do the proposed group labels correctly reflect the content within them?
How It’s Done
1. Recruit representative end users.
2. Set up the study with the information architecture (IA) to be evaluated.
3. Give participants specific content elements to find in that architecture.
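Tree test results are typically summarized per task as success (did the participant end on a correct node?) and directness (did they succeed without backtracking?). A minimal sketch of that scoring; the task name and paths below are made up for illustration:

```python
def score_tree_test(results, correct_paths):
    """Per-task success and directness rates.

    results: {task: [(final_path, backtracked), ...]}, one tuple per participant.
    correct_paths: {task: set of acceptable destination paths}.
    """
    scores = {}
    for task, attempts in results.items():
        n = len(attempts)
        success = sum(path in correct_paths[task] for path, _ in attempts)
        direct = sum(path in correct_paths[task] and not backtracked
                     for path, backtracked in attempts)
        scores[task] = {"success": success / n, "directness": direct / n}
    return scores

# Hypothetical data: where three participants ended up, and whether they backtracked.
results = {"find return policy": [
    ("Home > Support > Returns", False),   # direct success
    ("Home > Support > Returns", True),    # success after backtracking
    ("Home > Shop > Orders", False),       # failure
]}
correct = {"find return policy": {"Home > Support > Returns"}}
scores = score_tree_test(results, correct)  # success 2/3, directness 1/3
```

Remote tree testing tools report these same metrics; the point of the sketch is just to show what the numbers mean.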
Advantages
• Incredibly inexpensive
• Done very quickly with remote evaluation tools
• Asynchronous, so scheduling is not an issue; participants take part at their convenience
Yep. Just like card sorting!
Disadvantages
• The full IA and nav structure must be created in order to execute a tree test, so there is significant investment in the “prototype,” if you will.
Tree Test vs. Card Sort
– An OPEN Card Sort generates an information architecture.
– A CLOSED Card Sort usually evaluates high-level labeling.
– A Tree Test evaluates findability in an existing information architecture.
OK. This one IS a validation method.
Survey
Questions It Answers
• What is the users’ opinion about various facets of the product?
• How do users believe they use the product?
How It’s Done
1. Recruit participants.
2. Write survey.
3. Relax while the data rolls right in.
Advantages
• Cheap
• Fast
• Remote
• Easy data collection
• Large number of participants
Disadvantages
• Data are self-reported.
  – What people do is not the same as what people SAY they do.
• Good question curation is surprisingly challenging.
Expert Review
Questions It Answers
• Does the product comply with conventions and best practices?
• Has the expert seen issues in the past with any of the design elements or interaction techniques used in the product?
How It’s Done
• An experienced UX specialist analyzes the product, looking for common mistakes, or for interface elements or interactions that are not consistent with best practices.
Heuristic Evaluation?
Though this term is thrown around a lot, a Heuristic Evaluation is really a specialized type of Expert Review.
Advantages
• Considerably less expensive than lab or field studies
• Often relatively fast – again, as compared to lab or field studies
Disadvantages
• No actual end-user perspective.
• Experts vary. ☺
Other Techniques
In No Particular Order…
• Journaling Studies – Users keep a journal of their interactions (good and bad) with the product.
• A/B Testing – Two different versions of a product are placed online and success rates analyzed.
• Analytics – Web site or product metrics are analyzed to determine user success or failure.
• Personas – Descriptive profiles of representative end users. This is actually an output of field research.
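For the A/B testing bullet above: the two versions' success rates are commonly compared with a two-proportion z-test to check whether the difference is larger than chance would explain. A minimal sketch using only the Python standard library; the conversion counts are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: version A converted 120 of 1000 visitors, B 150 of 1000.
z, p = two_proportion_z(120, 1000, 150, 1000)
print(z, p)  # z is negative (A converts less than B), p just under 0.05
```

As with surveys, the statistics are the easy part; making sure the two versions differ in only one meaningful way is where A/B tests usually go wrong.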
Recap & Additional Resources
• User Experience is important. Really.
• These are NOT validation techniques!
• There are a lot of methods to choose from.