
Evaluating Design I: collecting data HCI Lecture 8

Dr Mark Wright

Key points:

• Goals of evaluation
• Ways of thinking about evaluation and data collection
• Evaluation by designers or usability experts
• Evaluating during use: cooperative, ethnographic, automated
• Evaluation after use: post-task walkthroughs, interviews, surveys
• Experimental design: dependent and independent variables; limitations

Friday, 19 October 12


Goals of evaluation

• Functionality: assess the extent and accessibility of the system's functionality (i.e. the design of the system should enable users to perform their intended tasks more easily).
• Experience: assess users' experience of the interaction and its impact upon the user (e.g. how easy the system is to learn, its usability, the user's satisfaction with it, and whether the experience is enjoyable).
• Problems: identify any specific problems with the system (e.g. unexpected results, confusion amongst users, bugs).

[Dix et al., Section 9.2]


Questions around evaluation

• Who is giving this feedback? A design expert? A fellow programmer? A typical user or member of a target user group? Is it just one person or a significant proportion of your users?
• When are you getting this feedback? On an early prototype or an established product?
• How has this evaluation been arrived at? Is it by comparison to some guidelines, or from a simulated walkthrough? Is it from use in a realistic context ('ecological validity')?
• Which method of data gathering is being used? Interview, observation, etc.
• What has been used as a measure? Quantitative (time to complete task, error rate) or qualitative (ease-of-use ratings)? Compared to recommendations or alternatives? Consistency?
• Resources: what time, effort and money do you have to conduct the evaluation and make changes to the design?


Data Gathering Methods

• Interviews
• Focus groups
• Questionnaires
• Surveys
• Observation in the field
• Indirect observation
• Experiment


Timing of the Evaluation

• With respect to the design cycle: pre-design, early prototype, pilot, product
• With respect to the user: before use, during use, after use
• Evaluation is iterative and adaptive: repeated or varied to suit the needs of each stage


Evaluation approaches

• Analytical evaluation (by experts)
  • Cognitive walkthrough (e.g. Polson et al., 1992)
  • Heuristic evaluation (e.g. Nielsen & Molich, 1990)
  • Model-based evaluation (e.g. GOMS)
• Usability testing (controlled conditions)
  • Hypothesis testing
• Field studies (natural conditions)
  • Ethnomethodology


How to Approach Evaluation

• Set your goals
• Decide on participants
• Decide your approach
• Consider relationships with participants
• Triangulate/mix methods
• Use pilot studies


Evaluation by designers: Guideline-based Evaluation (Heuristic Evaluation)

• A 'discount' method (Nielsen & Molich, 1990)
• Interface developers can apply a set of explicit design guidelines (cf. lecture 1), e.g. Nielsen's 10 heuristics:
  • provide feedback
  • speak the user's language
  • provide clearly marked exits
  • be consistent
  • prevent errors
  • minimize user memory load
  • provide shortcuts
  • use simple and natural dialogue design
  • provide error recovery
  • provide help
• Severity ratings may be given, e.g. on a 0-4 scale
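As a concrete illustration, severity ratings from several evaluators might be tallied like this. This is a minimal Python sketch: the findings data, the `rank_problems` function and the choice to aggregate by average severity are all invented for illustration, not part of Nielsen's method.

```python
# Hypothetical sketch: recording heuristic-evaluation findings from several
# evaluators and ranking problems by average severity (0-4 scale).
from collections import defaultdict

# (heuristic, problem description, severity 0-4), one entry per evaluator
# who reported the problem -- invented example data
findings = [
    ("provide feedback", "no progress bar during upload", 3),
    ("provide feedback", "no progress bar during upload", 4),
    ("prevent errors", "delete has no confirmation", 4),
]

def rank_problems(findings):
    """Average the severity each problem received across evaluators."""
    ratings = defaultdict(list)
    for heuristic, problem, severity in findings:
        ratings[(heuristic, problem)].append(severity)
    ranked = [(sum(v) / len(v), h, p) for (h, p), v in ratings.items()]
    return sorted(ranked, reverse=True)  # worst problems first

for avg, heuristic, problem in rank_problems(findings):
    print(f"{avg:.1f}  [{heuristic}] {problem}")
```

Averaging across evaluators is useful because, as the next slides show, individual evaluators find different subsets of problems.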


Example: Telephone Self-Service Banking (Nielsen, 1992)

1. S: Enter one for account, three for transfers between accounts ...
2. U: 3# (user interrupts system)
3. S: Enter account to transfer from.
4. U: 1234567890#
5. S: Enter account to transfer to.
6. U: # (abbreviation for current account)
7. S: Enter amount in cents.
8. U: 100000#
9. S: From account number twelve thirty four ... to account number primary account, a transfer of one thousand dollars. Press one to confirm.
10. U: 1#
11. S: You do not have access to use this function.


How Many Heuristic Evaluators and How Skilled? (Nielsen, 1992)

[Figure: proportion of usability problems found, as a function of the number of evaluators and their expertise]


Evaluation by designers: Cognitive walkthrough

• Cognitive walkthrough (Polson et al., 1992) involves stepping through an interaction sequence from the user's perspective
• Organised around Norman's description of task execution:
  1. Goal (questions 1, 9a)
  2. Planning (questions 2, 3)
  3. Action specification (questions 4, 5, 7)
  4. Action execution (question 6)
  5. Perception of outcome (question 8)
  6. Interpretation of the outcome (question 8b)
  7. Evaluation of the outcome (questions 8a, 9b)


Actions/choices should be ranked according to the percentage of users expected to have problems: 0 = none; 1 = some; 2 = more than half; 3 = most.

Cognitive Walkthrough Form
1. Description of user's immediate goal.
2. (First/next) action user should take:
   • Obvious that action is available? Why/why not?
   • Obvious that action is appropriate to goal? Why/why not?
3. How will user access description of action?
   • Problem with accessing? Why/why not?
4. How will user associate description with command?
   • Problem with associating? Why/why not?
5. All other available commands less appropriate?
   • For each, why/why not?
6. How will user execute the command?
   • Problems? Why/why not?
7. If there are time-outs, is there time for the user to decide before the time-out?
   • Why/why not?
8. Execute the action. Describe system response:
   • Obvious that progress has been made toward goal? Why/why not?
   • Can the user access needed info in the system response? Why/why not?
9. Describe appropriate modified goal, if any:
   • Obvious that goal should change? Why/why not?
   • If task completed, is it obvious? Why/why not?
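One step of such a form could be captured as a small data structure, so rankings can be tallied across steps. This is a hypothetical sketch: the `WalkthroughStep` class, its field names and the example answers are all invented.

```python
# Hypothetical sketch: one step of a cognitive-walkthrough form, with the
# problem rankings (0 = none ... 3 = most users affected) per question.
from dataclasses import dataclass, field

RANKS = {0: "none", 1: "some", 2: "more than half", 3: "most"}

@dataclass
class WalkthroughStep:
    goal: str                                    # question 1: immediate goal
    action: str                                  # question 2: first/next action
    answers: dict = field(default_factory=dict)  # question -> (rank, why)

    def worst_rank(self):
        """The highest (worst) problem ranking recorded for this step."""
        return max((rank for rank, _ in self.answers.values()), default=0)

step = WalkthroughStep(
    goal="transfer money to savings",
    action="press 3 for transfers",
    answers={
        "action available?": (0, "option read out by the menu"),
        "action appropriate?": (2, "label 'transfers' may not map to goal"),
    },
)
print(RANKS[step.worst_rank()])  # the worst ranking drives what gets fixed first
```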


Evaluation by designers

• Can occur at any stage, but particularly useful early in design (e.g. can apply to mock-ups, prototypes or not fully functional systems)
• Will identify major problems before users are involved
• Will flag potential areas to check in user evaluation
• Requires expertise (and experts may be expensive), or significant problems may be missed
• Can get large variability between different evaluators
• May also generate 'false positives', i.e. problems (according to the guidelines or the cognitive analysis) that would not really be problems in normal use


During use: Cooperative Evaluation

• User carries out a task with direct feedback to/from the evaluator, e.g.:
  • Get users to ask for help when they get stuck
  • Ask users what the commands mean
  • After users read the task, ask them how they might solve it
  • As users consider each command, ask what they think it does
  • When users have entered a command, ask what they think it has done and what the response indicates


During use: Cooperative Evaluation

• Creates dialogue between evaluators/designers and users
• Helps provide the intentional context necessary to interpret user behaviour
• May learn a lot from just a few users
• May affect normal behaviour
• Intensive effort for user and evaluator
• May be hard to analyse results

http://www.useit.com/papers/guerrilla_hci.html


During use: Ethnographic Studies

• Discreet observation: "hanging around"
• Can be simple or complex measurement (e.g. time to complete interaction, apparent degree of frustration or enjoyment)
• Ideally should be in the normal use situation (e.g. within the company, in the home)
• Excellent method for understanding the "mundane features" and "common sense logic" of everyday work
• May be hard to keep track of what is going on (record for later analysis?)
• Requires significant time commitment
• May not know what the user was trying to do, or what caused errors
• May be ethical issues if users are not aware of observation, or interference if they are aware


During use: Automated

• Analytics: large-scale behaviour, visits, pages, actions, timings, visitors
• Activity logs
  • Again, could be simple (number of 'undo' commands) or rich (times for every keystroke: when and for how long did they hesitate?)
• Eye tracking
• Video and audio recording with automated analysis (e.g. instrumented meeting room, CSTR/ILCC)
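A minimal sketch of what analysing such an activity log might look like. The log format, event names and hesitation threshold here are invented assumptions, not any particular tool's output.

```python
# Hypothetical sketch: mining a simple activity log for evaluation measures,
# e.g. counting 'undo' commands and finding hesitations between events.
# Invented log format: (timestamp_in_seconds, event_name)
log = [
    (0.0, "key"), (0.4, "key"), (3.9, "key"),   # long pause before 3rd key
    (4.2, "undo"), (4.5, "key"), (4.8, "undo"),
]

def undo_count(log):
    """Simple measure: how often did the user back out of an action?"""
    return sum(1 for _, event in log if event == "undo")

def hesitations(log, threshold=2.0):
    """Richer measure: gaps between events longer than `threshold` seconds."""
    times = [t for t, _ in log]
    return [b - a for a, b in zip(times, times[1:]) if b - a > threshold]

print(undo_count(log))
print(hesitations(log))  # where did the user hesitate, and for how long?
```

Note that a log like this shows *what* happened but not *why*; that is exactly the gap the post-task walkthrough below is designed to fill.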


Data recording

• Notes, audio, video, photographs
• Notes plus photographs
• Audio plus photographs
• Video


After use: Post-task walkthrough

• Combines the pros (and cons) of cooperative and ethnographic evaluation by not interfering during use, but instead recording the session and then playing it back to the user, asking about intentions and reasons
• E.g. the re-enactment protocol:
  • Subjects were videoed using the web
  • Later, they were shown the video and asked to comment on their actions


Re-enactment protocol - Example

• Subject A is downloading a web page from the Times newspaper web site.
• 25 seconds into the data transfer, subject A moves a hand to the mouse and starts to move the cursor.
• Stops for 10 seconds, then places the cursor over the stop button and waits a further 23 seconds.
• The web page then starts to render and is completed in 5 seconds.
• Afterwards, subject A is asked about their behaviour:


A: I had been waiting a while and so I thought something may be wrong and I was going to stop it, but then I thought it was about 10 o'clock so lots of people in their offices would be reading the Times.
I: So therefore, it would be slow?
A: Well yes, that is what I thought. I was afraid to stop it if it was just going to appear in a few seconds. When lots of people use a web page it's slow, right?
I: And if nobody is using it, it's fast?
A: Yes, that's what I think. When something is obscure I am probably the only person in the world looking at it, so it will be fast.
I: OK, so how long do you think you would have waited before stopping?
A: Well, I was very unsure whether to stop the thingy, didn't know whether it was the right thing to do. I was deliberating over the fact, but I really was unsure what was happening and why it was slow. Then it just appeared.


After use: Interviews

• Ask the user to reflect on their experience
• Can vary questions to suit the context, or follow up issues in detail
• Good for understanding preferences, impressions and attitudes
• Can reveal unanticipated problems
• N.B. interviews may be used to 'administer' a questionnaire, or may be more or less 'guided' by preset questions
  • This improves consistency but may reduce depth


After use: Surveys and questionnaires

• Questionnaire types:
  • Background
  • Attitudes
• Questionnaire design:
  • Open-ended, qualitative
    • Doesn't restrict answers to evaluators' expectations, but difficult to analyse
  • Rating scale
    • Easier to analyse, but needs careful design
    • Typically a 5- or 7-point scale measuring agreement/disagreement with statement(s)
  • Multiple choice: select one, or as many as apply
• There are several 'standard' usability questionnaires in the public domain
  • These can be adapted for specific needs


After use: Surveys and questionnaires (example)

Indicate your agreement or disagreement with the following statements by circling the appropriate number.

  The system tells me what to do at every point.   Disagree 1 2 3 4 5 Agree
  It is easy to recover from mistakes.             Disagree 1 2 3 4 5 Agree
  I always know what the system is doing.          Disagree 1 2 3 4 5 Agree
  Etc.

• Could be based on general usability heuristics
• Could be designed to suit specific system characteristics
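Responses on a rating scale like this are straightforward to summarise. A minimal sketch with invented ratings (the statements follow the example above; nothing here comes from a real study):

```python
# Hypothetical sketch: summarising 5-point agreement ratings
# (1 = disagree, 5 = agree). All response data is invented.
from statistics import mean, median

responses = {
    "The system tells me what to do at every point.": [4, 5, 3, 4, 4],
    "It is easy to recover from mistakes.":           [2, 1, 3, 2, 2],
    "I always know what the system is doing.":        [3, 4, 3, 3, 4],
}

for statement, ratings in responses.items():
    # Ratings are ordinal data, so the median is often the safer summary;
    # the mean is shown alongside for comparison.
    print(f"{statement}  median={median(ratings)}  mean={mean(ratings):.1f}")
```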


Questionnaires

• Questions can be closed or open
• Closed questions are easier to analyze, and analysis may be done by computer
• Can be administered to large populations
• Paper, email and the web are used for dissemination
• Sampling can be a problem when the size of a population is unknown, as is common online


Questionnaire design

• The impact of a question can be influenced by question order.

• Do you need different versions of the questionnaire for different populations?

• Provide clear instructions on how to complete the questionnaire.

• Strike a balance between using white space and keeping the questionnaire compact.

• Decide on whether phrases will all be positive, all negative or mixed.


Question and response format

• ‘Yes’ and ‘No’ checkboxes

• Checkboxes that offer many options

• Rating scales

– Likert scales

– semantic scales

– 3, 5, 7 or more points?

• Open-ended responses


Encouraging a good response

• Make sure the purpose of the study is clear
• Promise anonymity
• Ensure the questionnaire is well designed
• Offer a short version for those who do not have time to complete a long questionnaire
• If mailed, include a stamped addressed envelope
• Follow up with emails, phone calls, letters
• Provide an incentive
• A 40% response rate is high; 20% is often acceptable


Advantages of online questionnaires

• Responses are usually received quickly
• No copying and postage costs
• Data can be collected in a database for analysis
• Time required for data analysis is reduced
• Errors can be corrected easily


After use: Surveys and questionnaires

• Can administer to large numbers
• Potential to automate administration and analysis using web forms
• Limited quality of feedback
  • Depends on having the right questions
  • May only capture whether there were problems, rather than why
• Self-selection bias: those who fill it in may not be typical, e.g. they may provide feedback only if they experienced a problem
• For this (and all other methods) there remains the issue of analysing and interpreting the data


Observation vs. experiment

• So far we have discussed drawing conclusions from observational data, but it is not always clear we can use this to deduce the correct cause:
  • E.g. do people prefer your competitor's system because it works better, or just because of the use of pretty colours?
• To draw conclusions about causes we need to do an experiment:
  • Hypothesise the cause of some outcome, e.g. nice colour -> higher rating
  • Set up a situation in which you can:
    • Manipulate the hypothesised cause: the independent variable
    • Hold all other factors equal: controlled variables
    • Measure the outcome: the dependent variable
  • Evaluate whether the dependent variable changed significantly (statistically and practically) with the change of independent variable
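The nice-colour hypothesis can be sketched as a simple between-groups comparison. Everything below is invented (the ratings and group sizes); Welch's t statistic is computed by hand so the sketch needs only the standard library, and the resulting t would then be compared against a t distribution for significance.

```python
# Hypothetical sketch: a between-groups experiment on "nice colour ->
# higher rating". Independent variable: colour scheme; dependent variable:
# a rating out of 10. All ratings are invented example data.
from statistics import mean, variance

nice_colour  = [8, 7, 9, 8, 7, 8]   # group A ratings
nasty_colour = [5, 6, 4, 6, 5, 5]   # group B ratings

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se2 = variance(a) / len(a) + variance(b) / len(b)  # squared std. error
    return (mean(a) - mean(b)) / se2 ** 0.5

print(f"t = {welch_t(nice_colour, nasty_colour):.2f}")
```

Statistical significance alone is not enough: the slide's point about *practical* significance means the size of the rating difference must also matter for the design decision.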


Experiment: Issues

• Subjects in experiments might vary in many ways, e.g. some might be novices, others experts
• As an uncontrolled factor this could confound our result: e.g. worse scores due to more novices in group A
• Standard methods to control for subject differences:
  • Randomise assignment to the two groups
  • Match the experience levels between the groups
  • Use a repeated-measures design, i.e. each subject is tested under both conditions and we look at their change in score
    • May cause other issues, such as learning or fatigue
  • Use the subject difference as a second independent factor in a 2-way design:

                     Novice | Expert
       Nice colour          |
       Nasty colour         |
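The repeated-measures option looks at per-subject differences rather than group means, which removes between-subject variability (novice vs. expert) from the comparison. A minimal sketch with invented scores:

```python
# Hypothetical sketch: a repeated-measures (paired) comparison. Each subject
# is tested under both conditions; invented scores, one pair per subject.
from statistics import mean, stdev

scores = [(14, 11), (22, 20), (9, 7), (17, 13), (25, 24)]  # (cond A, cond B)

def paired_t(pairs):
    """t statistic on the per-subject differences (paired t-test)."""
    diffs = [a - b for a, b in pairs]
    return mean(diffs) / (stdev(diffs) / len(diffs) ** 0.5)

print(f"t = {paired_t(scores):.2f}")
```

In practice, as the slide notes, condition order should be counterbalanced across subjects so that learning or fatigue effects do not masquerade as a condition effect.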


Experiment

• Example: Brewster et al. (1995) using 'earcons' to address the 'slip-off' problem with graphical buttons
• Replace visual feedback (ambiguous) with auditory feedback (unambiguous):
  • hear a tone for mouse down
  • success chord if mouse up occurs on the button (otherwise the tone stops)
• 12 subjects, each performing a data entry task under auditory vs. visual conditions. Several dependent variables, including:
  • Average recovery time from error: 2.00 vs 4.21 (t=3.51, p=.0004)
  • Average number of codes entered: 64.5 vs 65.5 (t=0.401, p=0.696)


Live Evaluation Example: Palimpsest

• Celebrates 250 years of English literature at Edinburgh University
• Texts automatically appear on the device as users walk around the city
• Mobile and desktop app
  • Mobile app uses GPS
  • Desktop app uses a map interface
• Technically complete, and designer trails completed, but a pilot study is needed
• You will try out the desktop version now and give feedback via a survey
• You may try the mobile version today or on Monday, as there is no lecture


Palimpsest

• www.literarycities.org
• Experience the app for 3-5 minutes
• Then take the survey


Final comments

• Different methods have different strengths and weaknesses: choose the approach that will best answer your questions, given the resources available.
• Always do a pilot study before committing major resources: e.g. does the subject understand the task or the questionnaire? Is the observation method going to interfere too much?
• It is often valuable to use two or more contrasting methods.


References

Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. Proceedings of the ACM CHI'90 Conference on Human Factors in Computing Systems, Seattle, WA. (A handy summary by Nielsen of the main issues: http://www.useit.com/papers/guerilla_hci.html)
Nielsen, J. (1992). Finding usability problems through heuristic evaluation. Proceedings of CHI'92.
Polson, P., et al. (1992). Cognitive walkthroughs: a method for theory-based evaluation of user interfaces. International Journal of Man-Machine Studies, 36, 741-773.
Hughes, J., et al. (1995). The role of ethnography in interactive systems design. ACM Interactions, April 1995.

See also: course texts, Dix et al. chapter 9 and Rogers et al. chapter 7.
