
CS160 Discussion Section

Final review
David Sun

May 8, 2007

Design Patterns

Pattern Style (presented in class):
1. Pattern Title
2. Context
3. Forces
4. Problem Statement
5. Solution
   • Solution Sketch
6. Other Patterns to Consider

Tips:
1. Know the pattern format
2. We are not fussy on terminology, but make sure the description covers the major conceptual components.

Exercise: Design Pattern for…

• Pick an object and come up with a design pattern in 15 minutes, e.g.:
  – Bike
  – Coffee mug
  – Desk lamp

Object Action Model

• An interaction/cognitive model for how users interact with a system.

• Elements:
  – Task: the universe of objects the user works with and the actions they apply to those objects.
  – Interface: metaphoric representations of objects and actions.

OAI Example: the Calculator

• Task level:
  – Objects (universe): the reals (the first number and the second number) and the addition operation.
  – Actions (intention): add two numbers; pick the 2 numbers, then perform the addition operation.

• Interface level:
  – Objects (metaphor): the calculator, its buttons, and its display.
  – Actions (plan): operate the calculator; write out the equation by pressing 1, +, 2, =.
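A rough sketch (not from the slides; all names are illustrative) of how the interface-level button presses above can be mapped onto the task-level objects and actions:

```python
# Illustrative sketch only: maps interface-level button presses (the metaphor)
# onto task-level operations over real numbers (the universe).
class CalculatorUI:
    def __init__(self):
        self.display = ""          # interface object: the display
        self.pending = None        # task object: the first number
        self.operation = None      # task object: the pending operation

    def press(self, button):
        """Interface action: press a button such as '1', '+', '2', or '='."""
        if button.isdigit() or button == ".":
            self.display += button
        elif button == "+":
            self.pending = float(self.display)   # pick the first number
            self.operation = lambda a, b: a + b  # task action: addition
            self.display = ""
        elif button == "=":
            second = float(self.display)         # pick the second number
            result = self.operation(self.pending, second)  # perform the operation
            self.display = str(result)
        return self.display

calc = CalculatorUI()
for b in "1+2=":
    calc.press(b)
print(calc.display)  # "3.0"
```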

Infovis

• Information tasks
  – Specific fact finding
  – Extended fact finding
  – Open-ended browsing
  – Exploration of availability

• Info search 4-phase pattern
  1. Formulation
  2. Action
  3. Review of results
  4. Refinement

Infovis

• Tasks for a visualization system
  1. Overview: get an overview of the collection
  2. Zoom: zoom in on items of interest
  3. Filter: remove uninteresting items
  4. Details on demand: select items and get details
  5. Relate: view relationships between items
  6. History: keep a history of actions for undo, replay, refinement
  7. Extract: make subcollections
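A toy sketch of a few of these tasks applied to a small collection; the records and field names below are invented for illustration only:

```python
# Minimal sketch (illustrative data) of four of the visualization tasks
# applied to a small collection of records.
films = [
    {"title": "A", "year": 1995, "rating": 8.1},
    {"title": "B", "year": 2001, "rating": 6.4},
    {"title": "C", "year": 2003, "rating": 7.9},
]

# 1. Overview: summarize the whole collection.
print(len(films), "films, years", min(f["year"] for f in films),
      "to", max(f["year"] for f in films))

# 3. Filter: remove uninteresting items.
good = [f for f in films if f["rating"] >= 7.5]

# 4. Details on demand: select one item and get its details.
print(next(f for f in good if f["title"] == "C"))

# 7. Extract: make a subcollection for later use.
recent = [f for f in good if f["year"] >= 2000]
```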

Infovis

• Some key concepts
  – Query building: visual builders and QBE
  – Multidimensional scaling
  – Focus + context
     • Distortion
     • Fish-eye lenses
     • Overview + details
  – Network visualization
  – Animation
  – 3D

User Testing

Evaluation Methodologies

• Expert analysis
  – Cognitive Walkthrough
  – Heuristic evaluation
  – Model-based evaluation (GOMS)

• User participation
  – Lab studies
  – Field studies

Ethical Considerations

• Sometimes tests can be distressing
  – users have left in tears (embarrassed by mistakes)

• You have a responsibility to alleviate this [Gomoll]
  – make participation voluntary, with informed consent
  – avoid pressure to participate
  – let them know they can stop at any time
  – stress that you are testing the system, not them
  – make collected data as anonymous as possible

• Often must get human subjects approval

Measuring User Preference

• How much users like or dislike the system
  – can ask them to rate on a scale of 1 to 10
  – or have them choose among statements
     • "best UI I've ever…", "better than average", …
  – hard to be sure what the data will mean
     • novelty of UI, feelings, not a realistic setting, etc.

• If many give you low ratings -> trouble

• Can get some useful data by asking
  – what they liked, disliked, where they had trouble, best part, worst part, etc. (redundant questions)
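One possible way (not prescribed by the slides) to summarize such 1-to-10 ratings and spot the "many low ratings" warning sign; all numbers here are made up:

```python
# Illustrative sketch: summarizing invented 1-10 preference ratings and
# flagging a high share of low scores.
from statistics import mean, median

ratings = [8, 7, 9, 3, 8, 2, 7, 4]

print("mean:", round(mean(ratings), 2), "median:", median(ratings))

# Many low ratings -> trouble, per the slide's rule of thumb.
low = [r for r in ratings if r <= 4]
if len(low) / len(ratings) > 0.25:
    print("warning: more than a quarter of users rated the UI 4 or below")
```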


Comparing Two Alternatives

• Between groups experiment
  – two groups of test users
  – each group uses only 1 of the systems

• Within groups experiment
  – one group of test users
     • each person uses both systems
     • can't use the same tasks or order (learning)
  – best for low-level interaction techniques

• Between groups will require many more participants than a within groups experiment

• See if differences are statistically significant
  – assumes normal distribution & same std. dev.
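A hedged sketch of one common way to run that significance check, using two-sample t-tests; the task times and the scipy dependency are assumptions, not something the slides specify:

```python
# Sketch: is the task-time difference between the two designs significant?
# All times below are invented.
from scipy import stats

times_A = [52.1, 48.3, 61.0, 55.4, 49.9, 58.2]   # seconds, design A
times_B = [44.7, 47.2, 50.1, 42.8, 46.5, 45.9]   # seconds, design B

# Between groups: independent samples, different users in each group.
t, p = stats.ttest_ind(times_A, times_B)

# Within groups: the same users tried both designs (paired samples).
t_paired, p_paired = stats.ttest_rel(times_A, times_B)

print(f"between groups p = {p:.3f}, within groups p = {p_paired:.3f}")
# Both tests assume roughly normal data; ttest_ind as called here also
# assumes equal standard deviations (pass equal_var=False to relax that).
```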

Experimental Details

• Order of tasks
  – choose one simple order (simple -> complex)
     • unless doing a within groups experiment

• Training
  – depends on how the real system will be used

• What if someone doesn't finish?
  – assign a very large time & large # of errors

• Pilot study
  – helps you fix problems with the study
  – do 2: first with colleagues, then with real users
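A toy sketch of one way to vary which system comes first across participants in a within groups study, so order and learning effects roughly cancel; participant names and systems are invented:

```python
# Illustrative sketch: counterbalance which system each participant sees first.
participants = ["p1", "p2", "p3", "p4", "p5", "p6"]
systems = ("A", "B")

for i, p in enumerate(participants):
    order = systems if i % 2 == 0 else systems[::-1]
    print(p, "uses", " then ".join(order))
```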

Errors and Help

Types of errors

• Mistakes
  – User intended to do what they did, and it led to an error. The user would probably do the same thing again.

• Slips
  – User did not mean to do what they did. They can recover by doing it differently again.
  – Slips are not just for beginners. Experts often make them because they devote less conscious attention to the task.

Minimizing Error

• User errors:
  – Use intuitive command names (from the user's domain of knowledge).
  – Include short explanations as "tool tips".
  – Put longer explanations in the help system.

• Recognition over recall
  – Easier to select a file icon from a folder than to remember and type in the filename.
  – Auto-completion can help fix this (see the sketch after this list).

• Use appropriate representations
  – E.g. a graphical file selector is good for choosing individual files.
  – Textual file names support automation and richer organization (using command line options).
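A minimal sketch of the kind of prefix auto-completion the "recognition over recall" bullet refers to; the file names and helper function are illustrative:

```python
# Illustrative sketch of simple prefix auto-completion for file names.
def complete(prefix, names):
    """Return the names that start with the typed prefix."""
    return [n for n in names if n.startswith(prefix)]

files = ["report_final.txt", "report_draft.txt", "readme.md"]
print(complete("rep", files))   # ['report_final.txt', 'report_draft.txt']
```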


Description errors

• Description error:
  – The action is insufficiently specified by the user.
  – User may not know all the command line switches, or all the installation options for a program.

• Solution:
  – Warn the user that the command is ambiguous, or "unusual".
  – Provide help about options in several standard ways.
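A small illustrative sketch of warning the user about an ambiguous command; the option names and the resolve helper are invented for this example:

```python
# Illustrative sketch: warn when the typed option is an ambiguous prefix
# of more than one known option.
OPTIONS = ["--verbose", "--version", "--output"]

def resolve(typed):
    matches = [o for o in OPTIONS if o.startswith(typed)]
    if len(matches) == 1:
        return matches[0]
    if matches:
        return f"ambiguous option {typed!r}: could be {', '.join(matches)}"
    return f"unknown option {typed!r}"

print(resolve("--ver"))   # ambiguous option '--ver': could be --verbose, --version
print(resolve("--out"))   # --output
```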

Capture error

• Capture error (aka the tongue-twister error)
  – Command sequences overlap, and one is more common.
  – User reflexively does the common one when trying to do the unusual one.
  – E.g. try typing "soliton" very fast.

• Solution
  – Be aware of and test for this error. Try different command names.

Mode errors

• Mode errors:
  – User forgets what mode they're in, and does the command appropriate for another mode.
  – Digital watches, VCRs, etc.

• Several attributes:
  – There aren't enough command keys for all the operations, so the mode determines what each button does.
  – There isn't enough display space to provide strong feedback about the mode.

Mode errors

• Solutions:
  – Strive for consistent behavior of buttons across modes.
  – Provide display feedback about the behavior of keys in the current mode.
  – Provide an option for scrolling help tips if possible.
  – Allow the device to be programmed externally (e.g. from a PC with Bluetooth).
  – If you don't have a tiny screen, then make the context clear!
     • i.e. use color, tabs, navigation graphics, etc. to make clear to the user "where" they are in the interface.
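A minimal sketch (invented modes and key bindings) of giving display feedback about what each key does in the current mode:

```python
# Illustrative sketch: make each mode's key bindings explicit so the
# interface can show what a button will do right now.
MODES = {
    "clock": {"A": "toggle 12/24h", "B": "show date"},
    "alarm": {"A": "set hour",      "B": "set minute"},
}

def show_mode(mode):
    """Display feedback: current mode plus what each key does in it."""
    bindings = ", ".join(f"{k} = {v}" for k, v in MODES[mode].items())
    return f"[{mode.upper()}] {bindings}"

print(show_mode("alarm"))   # [ALARM] A = set hour, B = set minute
```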

Detecting Errors

• The earlier the better:
  – Check for consistency whenever possible ("asserts" for user input).
  – If there's a high risk of error, check for unusual input, or for common slips (spelling correction).
     • E.g. Google's "did you mean XX?" response
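A small sketch of a "did you mean …?" check using the standard library's close-match search; the command set is made up:

```python
# Illustrative sketch: catch a common slip (a misspelled command) and
# suggest the closest known command.
import difflib

COMMANDS = ["open", "close", "print", "export"]

def run(cmd):
    if cmd in COMMANDS:
        return f"running {cmd}"
    guess = difflib.get_close_matches(cmd, COMMANDS, n=1)
    return f"unknown command; did you mean '{guess[0]}'?" if guess else "unknown command"

print(run("pritn"))   # unknown command; did you mean 'print'?
```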

Help

• Types of help:
  – Task specific
  – Quick reference
  – Full explanation
  – Tutorial

• Key concepts:
  – Sandboxing
  – Context-sensitive help
  – Adaptive help