
Page 1

Using a simulated user to explore human-robot interfaces

Frank Ritter & Dirk Van Rooy, Applied Cognitive Science Lab at Penn State
Robert St. Amant, North Carolina State University

[email protected] - [email protected] - [email protected]

Page 2

Simulated user to test interfaces

• GLEAN, EPIC, ACT-R/PM, APEX
• Interact indirectly with interface

  – Abstract copy (GLEAN, EPIC)
  – Special UIMS (ACT-R/PM, APEX)

• Results are in part determined by the accuracy of the interface simulation


Page 3

Direct access to interface

• Cognitive model – ACT-R 5
• Eyes & hands – SEGMAN
• Allows direct interaction between the cognitive model and the interface (see the sketch below)
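To make this concrete, here is a minimal sketch in Common Lisp (the language ACT-R and Segman are implemented in) of the perceive-decide-act loop that direct access enables. The four helper functions are hypothetical placeholders for illustration, not the actual ACT-R/Segman API:

  ;; Hypothetical stand-ins for the real sensor, parser, model, and effector.
  (defun grab-screen () (make-array '(2 2) :initial-element 0))          ; capture raw pixels
  (defun parse-screen (pixels) (list :objects pixels))                   ; build a structured representation
  (defun decide (objects) (declare (ignore objects)) '(:press-key "d")) ; cognitive model picks an action
  (defun execute-gesture (action) (format t "~&doing ~s~%" action))     ; simulated keyboard/mouse output

  ;; The closed loop: the model sees whatever is actually on the screen and acts on it directly.
  (defun run-simulated-user (steps)
    (dotimes (i steps)
      (execute-gesture (decide (parse-screen (grab-screen))))))

In the real system, Segman plays the roles of the sensor, parser, and effector, and the ACT-R 5 model makes the decisions.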


Page 4

ACT-R 5 and Segman


Page 5

Segman v3.1

• Sensor module & Effector module
  – takes pixel-level input
  – runs the data through processing algorithms
  – builds a structured representation (a toy sketch follows)
  – generates mouse and keyboard gestures
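As a toy illustration of the pixel-to-structure step (not Segman's actual processing algorithms), the Common Lisp sketch below groups runs of identically colored pixels in each row into segments carrying a color, extent, and row; a real version would also merge segments across rows and classify the resulting regions as interface objects:

  ;; Toy segmentation: turn rows of pixel colors into (color x-start x-end y) segments.
  (defun row-segments (row y)
    (let ((segments '())
          (start 0))
      (loop for x from 1 to (length row)
            when (or (= x (length row))
                     (not (eql (elt row x) (elt row start))))
              do (push (list (elt row start) start (1- x) y) segments)
                 (setf start x))
      (nreverse segments)))

  (defun screen-segments (pixels)
    "PIXELS is a list of rows (lists of color codes) read off the screen."
    (loop for row in pixels
          for y from 0
          append (row-segments row y)))

  ;; (screen-segments '((0 0 1 1 1 0)
  ;;                    (0 2 2 1 1 0)))
  ;; => ((0 0 1 0) (1 2 4 0) (0 5 5 0) (0 0 0 1) (2 1 2 1) (1 3 4 1) (0 5 5 1))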


Page 6

Segman v3.1 Diagram


Page 7

ACT-R 5 and Segman demo 1


Page 8

ACT-R 5 and Segman demo 2


Page 9

Introducing a simulated user

Quantitative tool to guide the design process of human-robot interfaces


• Urban Search and Rescue robots (USR)

Page 10

Urban Search and Rescue - 1

Center for Robot-Assisted Search and Rescue

Teleoperated robots

Mixed-initiative HRI

Page 11

Urban Search and Rescue - 2

Page 12

3D Driving Game

• Direct interface
  – “Inside-out” driving
• Driving behavior
  – Real-time
  – Interactive environment
• Extensible code
  – Environment
  – Interface
• Works with unmodified 3D Driving Game


www.theebest.com/games/3ddriver/3ddriver.shtml

Page 13

Segman and ACT-R 5 integration

• Segman
  – position-in-lane information
  – left lane + right edge of the road
  – midpoint at 5.5 degrees below the horizon
• ACT-R 5
  – takes midpoint
  – steers right or left
  – brakes, accelerates
• Buffer stuffing galore (a simplified sketch of this loop follows)
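As a simplified, concrete reading of that loop (in the real model this decision is made by ACT-R productions, not by a single function), the Common Lisp sketch below picks a keyboard gesture from where the road midpoint sits relative to the screen center; the key bindings and the 10-pixel dead zone are assumptions for illustration:

  ;; Simplified steering decision: steer toward the road midpoint reported by the
  ;; vision code, otherwise keep accelerating.
  (defun choose-gesture (midpoint-x screen-center-x)
    (let ((offset (- midpoint-x screen-center-x)))
      (cond ((> offset 10)  '(:press-key "d"))    ; midpoint to the right -> steer right
            ((< offset -10) '(:press-key "a"))    ; midpoint to the left  -> steer left
            (t              '(:press-key "w")))))  ; roughly centered     -> accelerate

  ;; e.g. (choose-gesture 355 320) => (:PRESS-KEY "d")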


Page 14

Screenshot of desktop

GNU Emacs window
Allegro Debug

3D Driving Game

Page 15

DUMAS

• Driver User Model in ACT-R & Segman
• About 30 production rules (an illustrative example follows below)
• Restricted model of driving behavior
  – Does not use PM fully
  – Does not learn yet
  – …
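For readers unfamiliar with ACT-R, the block below shows what one production of this general kind looks like in ACT-R 5 syntax. It is purely illustrative: it is not one of the actual DUMAS rules, the chunk types and key binding are invented for the example, and it assumes ACT-R 5 with its perceptual-motor modules is already loaded:

  ;; Illustrative only -- not an actual DUMAS production.
  (chunk-type drive state)
  (chunk-type road-info midpoint-angle)

  (p steer-right
     =goal>
        isa    drive
        state  steer
     =visual>
        isa    road-info
        midpoint-angle =angle
     !eval! (> =angle 0)
  ==>
     +manual>
        isa    press-key
        key    "d"
     =goal>
        state  find-road)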


Page 16

DUMAS demo


Page 17

Two demonstrations

• Speed and multi-tasking
• Speed
  – Three sets of 10 runs
  – High, medium, and slow speed
• Multi-tasking
  – Standard condition = slow speed
  – Worried condition


Page 18

Speed Demonstration

[Figure 4: two bar-chart panels showing lane deviation (degrees) and total driving time (minutes) for the Slow, Medium, and Fast conditions.]

Figure 4. Speed Demonstration: Lane deviation (in degrees) and total driving time (in minutes) of DUMAS as a function of speed. Slow corresponds to a driving speed within the range of 15-20, medium 20-25, and fast 30-35, as measured on the speedometer in the simulation.


Page 19

Multi-tasking

[Figure 5: two bar-chart panels showing lane deviation (degrees) and total driving time (minutes) for the Standard and Worried conditions.]

Figure 5: Lane deviation (in degrees) and total driving time (in minutes) of DUMAS in the Standard and Worried conditions.


Page 20

Conclusions

• Quantitative tool for HRI
• USR tasks are difficult because of
  – Multi-tasking, interference
  – Hard vision problems
  – Noisy, ambiguous, poor-quality display
• Surprising parallels
  – Course corrections


Page 21

Future

• Extend DUMAS
  – To include more PM theory
  – To include more HRI subtasks
  – Multi-tasking (multiple robots)
• Apply to actual HRIs
• Develop a theory of HRI development


Page 22

Thank you

More on this can be found at ritter.ist.psu.edu/acs-lab/

[email protected] - [email protected] - [email protected]

This work was sponsored by the Space and Naval Warfare Systems Center San Diego, grant number N66001-1047-411F. The content of the information does not necessarily reflect the position or the policy of the Government, and no official endorsement should be assumed.