User Interface Agents
Roope Raisamo (rr@cs.uta.fi)
Department of Computer and Information Sciences
University of Tampere
http://www.cs.uta.fi/~rr/


User Interface Agents

A user interface agent guides and helps the user
– Many user interface agents observe the activities of the user and suggest better ways of carrying out the same operations
– They can also automate a series of operations based on observing the user
Many user interface agents are based on the principles of programming by example (PBE)


Two examples of user interface agents:

Eager
Letizia


Eager – automated macro generator

Allen Cypher, 1991, http://www.acypher.com/Eager/
Eager observes the activities of the user and tries to detect repeating sequences of actions. When such a sequence is detected, it offers the possibility to automate that task.
– works like an automated macro generator
– this kind of functionality is still not a part of common applications, even though it could be


Eager

Eager observes repeating sequences of actions

When Eager finds one, it pops up on the screen and suggests the next step


Eager

Once all the steps suggested by Eager have been shown and accepted, the user can give Eager permission to carry out the automated task.
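To make the idea concrete, the following is a minimal sketch of Eager-style repetition detection, not Cypher's actual algorithm; it assumes each user action is logged as a plain string and ignores the generalization over varying parameters (such as row numbers) that the real Eager performs.

# Hypothetical sketch of Eager-style repetition detection (not Cypher's code).
# Assumption: each user action is logged as a plain string.

from typing import List, Optional

def find_repeating_sequence(history: List[str]) -> Optional[List[str]]:
    """Return the shortest action sequence that the user has just performed
    twice in a row at the end of the history, or None if there is none."""
    n = len(history)
    for length in range(1, n // 2 + 1):
        if history[n - length:] == history[n - 2 * length:n - length]:
            return history[n - length:]
    return None

def suggest_next_action(history: List[str]) -> Optional[str]:
    """If a repetition was found, the agent can highlight the predicted next
    action on screen and offer to automate the rest of the sequence."""
    pattern = find_repeating_sequence(history)
    return pattern[0] if pattern else None

log = ["bold()", "increase_font()", "center()",
       "bold()", "increase_font()", "center()"]
print(find_repeating_sequence(log))  # ['bold()', 'increase_font()', 'center()']
print(suggest_next_action(log))      # 'bold()'

Once the user has accepted enough suggested steps, replaying the detected pattern in a loop is the "automated macro" referred to above.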


Letizia – a browser companion agent

Letizia observes the user and tries to preload interesting web pages while the user browses the web


Letizia

Traditional browsing leads the user into doing a depth-first search of the Web

Letizia conducts a concurrent breadth-first search rooted at the user's current position
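As an illustration of the breadth-first idea (not Letizia's actual implementation), the sketch below explores links outward from the user's current page one level at a time; fetch_links and interest_score are hypothetical placeholders for fetching a page and scoring it against the user's recent browsing.

# Hypothetical sketch of Letizia-style breadth-first prefetching.
# fetch_links() and interest_score() are placeholders supplied by the caller.

from collections import deque
from typing import Callable, Dict, List

def breadth_first_prefetch(current_url: str,
                           fetch_links: Callable[[str], List[str]],
                           interest_score: Callable[[str], float],
                           max_pages: int = 20) -> Dict[str, float]:
    """Explore pages reachable from the user's current position in
    breadth-first order and rank them by estimated interest."""
    scores: Dict[str, float] = {}
    queue = deque([current_url])
    visited = {current_url}
    while queue and len(scores) < max_pages:
        url = queue.popleft()
        for link in fetch_links(url):
            if link in visited:
                continue
            visited.add(link)
            scores[link] = interest_score(link)  # prefetch and evaluate the page
            queue.append(link)                   # its own links are explored later
    return dict(sorted(scores.items(), key=lambda item: item[1], reverse=True))

The breadth-first order means pages one link away from the current position are evaluated before deeper ones, which is exactly the contrast drawn above with the user's own depth-first browsing.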


The appearance of agents


The appearance of an agent

The appearance of an agent is a very important cue when a user tries to figure out what the agent can do.

It is a serious mistake to choose an appearance that makes the user believe the agent is more intelligent than it really is.

The appearance must not be disturbing.


Computer-generated talking head

one of the most demanding forms of agent presentation

a human head suggests that the agent is rather intelligent

a talking head is probably the most natural way to present an agent in a conversational user interface


Drawn or animated characters

the appearance has a great effect on the expectations of the user
– a paper clip vs. a dog vs. Merlin the Sorcerer

Continuously animated, slowly changing, or static presentation


Auditory presentation

An agent can also be presented only by voice or sound, using the auditory channel
– ambient sound
– beeps, signals
– melodies, music
– recorded speech
– synthetic speech


Haptic presentation

In addition to the auditory channel, or to replace it, an agent can present information through haptic feedback

Haptic simulation modalities
– force and position
– tactile
– vibration
– thermal
– electrical


Haptic output devices

Inexpensive devices:
– The most common haptic devices are still the different force-feedback controllers used in computer games, for example force-feedback joysticks and wheels.
– In 1999, Immersion Corporation’s force feedback mouse was introduced as the Logitech Wingman Force Feedback Gaming Mouse.
– In 2000, Immersion Corporation’s tactile feedback mouse was introduced as the Logitech iFeel Tactile Feedback Mouse.


Haptic output devices

More sophisticated devices:
– SensAble Technologies: PHANTOM
– Immersion Corporation: Impulse Engine
– often very expensive, and non-ergonomic

(Pictured: VTi CyberForce, Impulse Engine 2000, VTi CyberTouch, PHANTOM)


No direct presentation at all

An agent helps the user by carrying out different supporting actions
– e.g., prefetching needed information, automatic hard disk management, …
An indirectly controlled background agent
– question: how to implement this indirect control?
– multisensory input: the agent is observing a system, an environment, or the user
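One possible shape for such indirect control, sketched with invented names: the agent exposes no user interface at all, only rules that map observations of the system or the user to supporting actions.

# Hypothetical sketch of an indirectly controlled background agent.
# The observation sources and actions are invented for illustration.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Observation:
    source: str  # e.g. "filesystem", "network", "user_activity"
    event: str   # e.g. "disk_nearly_full", "idle"

class BackgroundAgent:
    """Has no visible presentation; it only reacts to what it observes."""

    def __init__(self) -> None:
        self._rules: List[Tuple[Callable[[Observation], bool], Callable[[], None]]] = []

    def when(self, condition: Callable[[Observation], bool],
             action: Callable[[], None]) -> None:
        self._rules.append((condition, action))

    def observe(self, observation: Observation) -> None:
        for condition, action in self._rules:
            if condition(observation):
                action()

agent = BackgroundAgent()
agent.when(lambda o: o.event == "disk_nearly_full",
           lambda: print("cleaning up temporary files"))
agent.when(lambda o: o.source == "user_activity" and o.event == "idle",
           lambda: print("prefetching information for the next task"))
agent.observe(Observation("filesystem", "disk_nearly_full"))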


Related user interface metaphors:

Conversational User Interface

Multimodal User Interface


Conversational User Interfaces

Why conversation?
– a natural way of communication
– learnt at quite a young age
– tries to fix the problems of a direct manipulation user interface
Conversation augments, but does not necessarily replace, a traditional user interface
– the failure of Microsoft Bob
– Microsoft Office Assistant


Microsoft Office Assistant

The Office Assistant tries to help in the use of Microsoft Office programs, with a variable rate of success.
The user can choose the appearance of the agent
– unfortunately, this has no effect on the capabilities of the agent
A paper clip is most likely a better presentation for the current assistant than a Merlin character.


Multimodal User Interfaces

”Multimodal interfaces combine many simultaneous input modalities and may present the information using synergistic representation of many different output modalities” [Raisamo, 1999]


Multimodal User Interfaces

An agent makes use of multimodality when observing the user:
– speech recognition: reacts to speech commands, or observes the user without requiring actual commands
– machine vision, pattern recognition: recognizing facial gestures, gaze direction, and other gestures


Multimodal User Interfaces

A specific problem in multimodal interaction is combining the simultaneous inputs
– this requires a certain amount of task knowledge and ”intelligence”
– in this way, every multimodal user interface is at least in some respect a user interface agent that tries to find out what the user wants based on the available information
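A minimal sketch of what combining simultaneous inputs can mean in practice, in the spirit of Put-That-There (see the later slide): a speech command is paired with the pointing gestures closest to it in time. The event format and the time window are assumptions made for this example.

# Hypothetical sketch of time-based fusion of speech and pointing input.
# The event classes and the 1.5-second window are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SpeechEvent:
    time: float       # seconds since start of session
    words: List[str]  # e.g. ["put", "that", "there"]

@dataclass
class PointEvent:
    time: float
    target: str       # object or location the user pointed at

def fuse(speech: SpeechEvent, points: List[PointEvent],
         window: float = 1.5) -> Optional[Tuple[str, str, str]]:
    """Resolve the deictic words 'that' and 'there' to the pointing events
    nearest in time, yielding (verb, object, location), or None on failure."""
    nearby = sorted((p for p in points if abs(p.time - speech.time) <= window),
                    key=lambda p: p.time)
    if "that" not in speech.words or "there" not in speech.words or len(nearby) < 2:
        return None
    return (speech.words[0], nearby[0].target, nearby[1].target)

command = fuse(SpeechEvent(10.0, ["put", "that", "there"]),
               [PointEvent(10.2, "blue_square"), PointEvent(10.9, "top_right_corner")])
print(command)  # ('put', 'blue_square', 'top_right_corner')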


A high-level architecture for multimodal user interfaces

(Figure: processing pipeline for a multimodal user interface)
Input processing: motor, speech, vision, …
Media analysis: language, recognition, gesture, …
Interaction management: media fusion, discourse modeling, plan recognition and generation, user modeling, presentation design
Media design: language, modality, gesture, …
Output generation: graphics, animation, speech, sound, …
Application interface

Adapted from [Maybury and Wahlster, 1998]
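Read as code, the diagram can be thought of as a chain of stages; the stubs below are named after the boxes in the figure and are purely illustrative, not an implementation from [Maybury and Wahlster, 1998].

# Purely illustrative stubs mirroring the boxes of the architecture diagram.

def input_processing(raw_events):   # motor, speech, vision, ...
    return [("speech", "put that there"), ("gesture", "pointing")]

def media_analysis(processed):      # language and gesture recognition
    return {"fragments": processed}

def interaction_management(analysed, dialogue_state):
    # media fusion, discourse modeling, plan recognition and generation,
    # user modeling, presentation design; a full system would also talk to
    # the application interface shown at the edge of the figure
    dialogue_state.append(analysed)
    return {"goal": "move_object"}

def media_design(plan):             # choose output language, modality, gesture
    return [("speech", "Done."), ("graphics", "highlight the moved object")]

def output_generation(designed):    # graphics, animation, speech, sound
    for modality, content in designed:
        print(f"{modality}: {content}")

def run_turn(raw_events, dialogue_state):
    """One interaction turn flowing through the pipeline of the figure."""
    processed = input_processing(raw_events)
    analysed = media_analysis(processed)
    plan = interaction_management(analysed, dialogue_state)
    output_generation(media_design(plan))

run_turn(raw_events=["..."], dialogue_state=[])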


Modeling

[Nigay and Coutaz, 1993]


Put-That-There [Bolt, 1980]


Example: Digital Smart Kiosk

Smart Kiosk was a research project at the Compaq-Digital Cambridge Research Laboratory in which an easy-to-use information kiosk was built to serve all kinds of users

Combines new technology:
– machine vision, pattern recognition
– speech synthesis (DECtalk)
– speech recognition
– animated talking head (DECface)

[Christian and Avery, 1998]


Example: Digital Smart Kiosk

(Figure: the kiosk combines a vision camera and an active vision zone with DECface, Netscape Navigator, and a touchscreen.)


References

[Bolt, 1980]  Richard A. Bolt, Put-that-there. SIGGRAPH ‘80 Conference Proceedings, ACM Press, 1980, 262-270.

[Christian and Avery, 1998] Andrew D. Christian and Brian L. Avery, Digital Smart Kiosk project. Human Factors in Computing Systems, CHI ’98 Conference Proceedings, ACM Press, 1998, 155-162.

[Nigay and Coutaz, 1993]  Laurence Nigay and Joëlle Coutaz, A design space for multimodal systems: concurrent processing and data fusion. Human Factors in Computing Systems, INTERCHI ’93 Conference Proceedings, ACM Press, 1993, 172-178.

[Raisamo, 1999]   Roope Raisamo, Multimodal Human-Computer Interaction: a constructive and empirical study. Ph.D. dissertation. Report A-1999-13, Department of Computer Science, University of Tampere. http://granum.uta.fi/pdf/951-44-4702-6.pdf

[Maybury and Wahlster, 1998]  Mark T. Maybury and Wolfgang Wahlster (Eds.), Readings in Intelligent User Interfaces. Morgan Kaufmann Publishers, 1998.