You Are Not Alone: How Authoring Tools Can Leverage Activity Traces to Help Users, Developers & Researchers
Bjoern Hartmann, Stanford HCI Lunch, 8/19/2009



Page 1

You Are Not Alone: How Authoring Tools Can Leverage Activity Traces to Help Users, Developers & Researchers

Bjoern Hartmann
Stanford HCI Lunch

8/19/2009

Page 2

The Idea (Not New)

• Record what users are doing while using an authoring tool. (At what level of detail? Privacy? Confidentiality?)

• Extract relevant patterns from these traces. (What patterns? Automatically or with user involvement?)

• Aggregate data from many users. (How? What is the right group boundary?)

• Present useful data back to either the users, or the developers. (What is useful? In what format? Feedback loop or canned answers?)
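Taken together, the four bullets describe a record / extract / aggregate / present pipeline. A minimal illustrative sketch of those stages (all names hypothetical, not taken from any of the systems discussed later):

```python
import time
from collections import Counter
from dataclasses import dataclass

@dataclass
class TraceEvent:
    """One recorded user action inside an authoring tool."""
    timestamp: float
    user: str
    command: str  # e.g. "blur", "compile", "undo"

def record(user: str, command: str) -> TraceEvent:
    # Record: capture what the user is doing (level of detail, privacy,
    # and confidentiality are open questions, per the slide).
    return TraceEvent(time.time(), user, command)

def extract(events: list[TraceEvent]) -> list[str]:
    # Extract: reduce raw traces to the pattern of interest;
    # here, simply the command stream.
    return [e.command for e in events]

def aggregate(streams: list[list[str]]) -> Counter:
    # Aggregate: pool patterns across many users.
    return Counter(cmd for stream in streams for cmd in stream)

def present(counts: Counter, top: int = 5) -> None:
    # Present: report useful data back to users or developers.
    for cmd, n in counts.most_common(top):
        print(f"{cmd}: used {n} times")
```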

Page 3
Page 4
Page 5

• Algorithms: recommender systems, data mining, PL

• Social perspective: crowdsourcing, user communities

• Domain: authoring tools

Page 6

Potential benefits

• For users:
– Gain expertise through tutorials (Grabler, SIGGRAPH 09) & tool suggestions (Matejka, UIST 09)
– Understand expert practices (2draw.net)
– Improved documentation (Stylos, VL/HCC 09)
– Help with debugging (Kim, SIGSOFT 06; Livshits, SIGSOFT 05)

• For tool developers & researchers:
– Understand user practices (Terry, CHI 08)
– Understand program behavior in the wild (Liblit, PLDI 05)
– Understand usability problems in the wild (Hilbert 2000)

Page 7

INSTRUMENTING IMAGE MANIPULATION APPLICATIONS

Page 8

Example: 2draw.net


Page 9

Examining 2draw

• Record: canvas state over time
• Extract: snapshots of drawing
• Aggregate: no aggregation across users
• Present: browse timeline of snapshots

• Benefit: understand technique behind drawings
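A minimal sketch of this record/present pair (hypothetical names; not 2draw.net's actual code):

```python
import time

class SnapshotTimeline:
    """Record canvas state over time; present it as a browsable timeline."""

    def __init__(self) -> None:
        self._snapshots: list[tuple[float, bytes]] = []

    def record(self, canvas_png: bytes) -> None:
        # Record: store the raw canvas image with a timestamp.
        self._snapshots.append((time.time(), canvas_png))

    def browse(self, index: int) -> bytes:
        # Present: step through the drawing's history frame by frame.
        _, png = self._snapshots[index]
        return png

    def __len__(self) -> int:
        return len(self._snapshots)
```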

Page 10

Terry et al., InGimp (CHI 2008)

http://www.ingimp.org/statsjam/index.php/Main_Page

Page 11

Examining InGimp

• Record: application state / command use
• Extract:
• Aggregate: send usage sessions to remote db
• Present: usage statistics

• Benefit: understand aggregate user profiles
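To illustrate the aggregate step (hypothetical code, not ingimp's): each session's command stream can be reduced to counts and serialized for upload to the remote database:

```python
import json
from collections import Counter

def summarize_session(commands: list[str]) -> str:
    # Reduce a session's command stream to counts: compact, and less
    # privacy-sensitive than shipping the raw event stream.
    return json.dumps({"session_commands": Counter(commands)})

# The JSON payload would then be sent to the remote stats database.
print(summarize_session(["blur", "crop", "blur", "undo"]))
# {"session_commands": {"blur": 2, "crop": 1, "undo": 1}}
```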

Page 12

Own Experiment: Instrumenting Processing

• Use a distributed version control system to record a new revision every time the user compiles/runs the program
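A minimal sketch of such a hook, assuming the sketch folder is already a git repository (the function name and its wiring into the Processing IDE are assumptions):

```python
import subprocess
import time

def commit_on_run(sketch_dir: str) -> None:
    # Snapshot the sketch every time the user hits compile/run.
    subprocess.run(["git", "add", "-A"], cwd=sketch_dir, check=True)
    # check=False: if nothing changed since the last run, git exits
    # nonzero with "nothing to commit", which is fine here.
    subprocess.run(
        ["git", "commit", "-m", f"run at {time.strftime('%Y-%m-%d %H:%M:%S')}"],
        cwd=sketch_dir,
        check=False,
    )
```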

Page 13

Grabler et al., Photo Manipulation Tutorials (SIGGRAPH 09)

Page 14
Page 15

Examining PMT

• Record: application state / command use / screenshots

• Extract: high-level commands
• Aggregate: ---
• Present: graphical, annotated tutorial

• Benefit: higher quality, lower cost tutorials
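A simplified sketch of the extract step, collapsing a low-level event stream into numbered tutorial steps (the grouping rule and names are assumptions, not the paper's actual method):

```python
def extract_steps(events: list[str]) -> list[str]:
    # Merge runs of the same low-level command into one high-level step
    # (e.g., many consecutive brush strokes become a single "brush" step).
    steps: list[str] = []
    last = None
    for cmd in events:
        if cmd != last:
            steps.append(f"Step {len(steps) + 1}: apply {cmd}")
            last = cmd
    return steps

print(extract_steps(["brush", "brush", "blur", "brush"]))
# ['Step 1: apply brush', 'Step 2: apply blur', 'Step 3: apply brush']
```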

Page 16

CommunityCommands (Matejka, UIST 09)

Page 17

IMPROVED DOCUMENTATION

Page 18

Stylos, Jadeite (VL/HCC 2009)

Page 19
Page 20
Page 21

Documentation Algorithm

• For each file in a source code corpus of Processing projects (existing documentation, forum posts, web search), count the calls to each known API function (using a hash table fn_name -> count)

• Rescale the font size of each function on the documentation page by its relative frequency of occurrence in the corpus
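A direct, simplified implementation; the regex-based call detection, the example API subset, and the linear font scale are assumptions:

```python
import re
from collections import Counter

API_FUNCTIONS = {"line", "ellipse", "fill", "bezier"}  # subset for illustration

def count_api_calls(corpus_files: list[str]) -> Counter:
    # The hash table fn_name -> count from the slide.
    counts: Counter = Counter()
    for source in corpus_files:
        for name in re.findall(r"\b(\w+)\s*\(", source):
            if name in API_FUNCTIONS:
                counts[name] += 1
    return counts

def font_size(counts: Counter, fn: str, min_px: int = 10, max_px: int = 36) -> int:
    # Rescale font size by frequency relative to the most-used function.
    top = max(counts.values(), default=1)
    return int(min_px + (max_px - min_px) * counts[fn] / top)

corpus = ["void draw() { line(0,0,10,10); ellipse(5,5,2,2); line(1,1,2,2); }"]
c = count_api_calls(corpus)
print(c, font_size(c, "line"), font_size(c, "ellipse"))  # line renders largest
```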

Page 22

DEBUGGING

Page 23

Cooperative Bug Isolation (Liblit, UCB)


Page 24

Examining CBI

• Record: sparse sampling of application state
• Extract: ---
• Aggregate: establish correspondence between different reports
• Present: priority list of runtime bugs to developer

• Benefit: understand real defects in released software
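CBI's key trick is that each run samples only a small random fraction of predicate observations, keeping overhead low. A greatly simplified sketch of that idea (CBI's real instrumentation is compiler-based):

```python
import random

SAMPLE_RATE = 0.01  # observe roughly 1% of predicate evaluations

def observe(report: dict, predicate: str, value: bool) -> None:
    # Record sparsely: most evaluations are skipped entirely.
    if random.random() < SAMPLE_RATE:
        key = (predicate, value)
        report[key] = report.get(key, 0) + 1

def rank(reports: list[tuple[dict, bool]]) -> list[tuple[str, float]]:
    # Aggregate reports from many runs, then rank predicates by how
    # strongly "observed true" co-occurs with crashing runs.
    scores: dict[str, tuple[int, int]] = {}
    for report, crashed in reports:
        for (pred, value), n in report.items():
            if value:
                fails, total = scores.get(pred, (0, 0))
                scores[pred] = (fails + n * int(crashed), total + n)
    return sorted(
        ((pred, fails / total) for pred, (fails, total) in scores.items()),
        key=lambda item: -item[1],
    )
```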

Page 25

BugMem (Kim, UCSC)

Page 26

Examining BugMem

• Record: --- (use existing source code repository)
• Extract: bug signature and fixes
• Aggregate: ?
• Present: list of bugs in repository that match fixes in the same repository

• Benefit: find bugs in existing code that your team has fixed in the past
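A toy sketch of the extract/present pairing, assuming exact-line bug signatures (BugMem's real signatures are more general than literal lines):

```python
def extract_signatures(fix_commits: list[tuple[str, str]]) -> dict[str, str]:
    # Extract: each past fix yields (buggy line -> fixed line).
    return {before.strip(): after.strip() for before, after in fix_commits}

def find_matches(source_lines: list[str],
                 signatures: dict[str, str]) -> list[tuple[int, str]]:
    # Present: lines in current code matching a previously fixed bug,
    # paired with the fix the team applied back then.
    return [
        (i, signatures[line.strip()])
        for i, line in enumerate(source_lines, start=1)
        if line.strip() in signatures
    ]

sigs = extract_signatures([("if (p = NULL)", "if (p == NULL)")])
print(find_matches(["x = 1;", "if (p = NULL)"], sigs))  # [(2, 'if (p == NULL)')]
```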

Page 27

DynaMine (Livshits @ Stanford)

Page 28


Page 29
Page 30
Page 31

Examining HelpMeOut

• Record: source code at every compilation step
• Extract: error messages and code diffs
• Aggregate: collect fixes from many users in db; explanations from experts
• Present: list of fixes in db that match user's error and code context; explanations when available

• Benefit: find fixes that others have used to correct similar problems in the past
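A minimal sketch of the present step: match the user's compiler error against stored fixes, then rank by code-context similarity (difflib stands in for HelpMeOut's actual similarity metric; the db layout is an assumption):

```python
import difflib

# Each entry: (error message, buggy line, fixed line, expert explanation).
FIX_DB = [
    ("';' expected", "int x = 1", "int x = 1;",
     "Statements in Processing/Java end with a semicolon."),
]

def suggest_fixes(error: str, code_line: str, top: int = 3):
    # Filter on the error message, then rank by similarity of the
    # user's failing line to each stored buggy line.
    candidates = [
        (difflib.SequenceMatcher(None, code_line, buggy).ratio(),
         buggy, fixed, explanation)
        for err, buggy, fixed, explanation in FIX_DB
        if err == error
    ]
    return sorted(candidates, reverse=True)[:top]

print(suggest_fixes("';' expected", "int y = 2"))
```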

Page 32

A Design Space for Finding Answers to Questions from Online Data

• How many answers are needed? 1 / 10 / 100
• When are answers available? Immediately (already published) / Near real-time / With latency
• Who publishes answers? Authority / Expert / Peer / Anyone
• What reporting format? Individual answers / Aggregate data
• Can the questioner seek clarification / detail? Yes / No
• How many answers are shown / available? 1 / 10 / 100
• How was the answer authored? Explicitly / Implicitly

Page 33

HelpMeOut

(Figure: the design-space chart above, with HelpMeOut's position marked on each dimension.)

Page 34

Stack Overflow

(Figure: the design-space chart above, with Stack Overflow's position marked on each dimension.)

Page 35

Non-Example: Scratch (MIT)


Scratch authoring environment with “Share” button

Scratch web site lists shared projects

Page 36