Embodied Conversational Agents and Affective Computing EASSS 2012 http://acai.lip6.fr/ Alexandre Pauchet [email protected] Nicolas Sabouret [email protected]

T7 Embodied conversational agents and affective computing


14th European Agent Systems Summer School


Page 1: T7 Embodied conversational agents and affective computing

Embodied Conversational Agents and

Affective Computing

EASSS 2012

http://acai.lip6.fr/

Alexandre Pauchet [email protected], Nicolas Sabouret [email protected]

Page 2: T7 Embodied conversational agents and affective computing

http://acai.lip6.fr/

French working group ACAI

Page 3: T7 Embodied conversational agents and affective computing

GT ACAI

2004: GT ACA (ECA); 2012: GT ACAI

GDR I3, SMA group of AFIA: Affects, Artificial Companions and Interaction

Thematic meetings: corpus, virtual agents, affective computing, ...

From 10 to 30 people

Page 4: T7 Embodied conversational agents and affective computing

GT ACAI

Workshops WACA 2005, 2006, 2008, 2010: chatbots, dialogical agents, assistant agents, virtual agents, emotions

WACAI 2012 in Grenoble: Affects, Artificial Companions and Interaction

Web Site: http://acai.lip6.fr/ GT ACA: http://www.limsi.fr/aca/

Mailing list: [email protected]

Page 5: T7 Embodied conversational agents and affective computing

Plan

Embodied Conversational Agents: introduction to ECAs, talking heads, gestural agents, deploying environments, assistant agents, usefulness of embodiment, human-ECA interaction

Affective Computing: introduction to affective computing, affective models, ECAs and emotions

Page 6: T7 Embodied conversational agents and affective computing

Embodied Conversational Agents

Page 7: T7 Embodied conversational agents and affective computing

Introduction

Page 8: T7 Embodied conversational agents and affective computing

Definitions

ECA: "Embodied Conversational Agent"
IVA: "Intelligent Virtual Agent"

Agent

Conversational

Embodied

Perception (awareness)

Decision (rationality, AI)

Expression (affects)

Agents La Cantoche™, Aibo Sony™

Rational Proactivity

Multimodal interaction Social

Personification Situated

Multimodality:
- Graphical User Interface (GUI)
- Natural Language Processing (NLP)...

Microsoft™

Icons Virtual Characters Robots

Page 9: T7 Embodied conversational agents and affective computing

Roles of ECAs

ECAs are “Interactive Virtual Characters” that are situated in mediated (often distributed) environments.

They can play four main roles:

Assistants to welcome users and to assist them in understanding and using the structure and the functioning of applications

Tutors for students in human-learning mediated environments, or for patients in psychological/pathological monitoring systems

Partners for actors in virtual environments: partner or adversary in a game, participant in creative design groups, member of a mixed-initiative community, …

Companions as a friend in a long-term relationship

Page 10: T7 Embodied conversational agents and affective computing

Talking heads

Fixed – Realistic

Expressions – Lip-sync

Emotions

Gestural Agents

Fixed/floating – moving arms

Dialogue – Deictic – Sign language

Tutoring – Assistance

Situated Agents

Complete mobile characters

Virtual/Augmented reality

Training – Action

GRETA (Pélachaud LTCI) STEVE (Rickel et al USC) VTE (Gratch et al USC)

Categories of ECAs

Page 11: T7 Embodied conversational agents and affective computing

© CANAL+

APPLICATIONS WEB PAGES AMBIENT

Education – Assistance – Mixed Communities – Welcoming – Virtual Reality – Smart Objects

Deploying environments

Page 12: T7 Embodied conversational agents and affective computing

Agent models: interaction performance/efficiency of models
Interaction models with users: multimodality, H/M dialogue

Models of cognitive agents: BDI logics and planning, affective logics, ...

Task models and user models: Symbolic Representation and Reasoning

Human modelling: ecological relevance of models
Pipeline: capture (of physiology), representation (of expressions), reproduction (of behaviours), evaluation (of humans), application

Scientific issues

Page 13: T7 Embodied conversational agents and affective computing

Computer Science
Natural Language Processing, Multi-Agent Systems

Human-Machine Interaction/Interfaces

Human-Machine Dialogue

Humanities and Social Sciences
Psychology of Language Interaction

Ergonomic psychology

Scientific communities

Page 14: T7 Embodied conversational agents and affective computing

Projects and teams

European project SEMAINE

(http://www.semaine-project.eu/)

LTCI (France): Greta

LIMSI (France): MARC

Bielefeld University (Germany): MAX

USC (USA): STEVE, VTE

Page 15: T7 Embodied conversational agents and affective computing

Independent of the embodiment level

Non verbal behaviours (postures, gestures, ...):

Theoretical approach (related works) Empirical approach (corpus analysis)

Generated animations

From an intention (triggering action) Automatically and cyclically

Design of animated characters(ECAs or avatars)

<configuration> <timecode>10</timecode> <body>front</body> <head>front</head> <eyes>open happy</eyes>… … … <leftArm>hello</leftArm> <rightArm>hip</rightArm> <speech>Hello!</speech></configuration>

Corpora Representation Reproduction

Evaluation

User feeling?

Page 16: T7 Embodied conversational agents and affective computing

Nikita by Kozaburo © 2002

Done with Poser 5 / no postwork. Model: DAZ-3D Victoria 3. Skin texture: Mec4D

Beauty contest for embodied conversational agents: www.missdigitalworld.com

Talking heads

Page 17: T7 Embodied conversational agents and affective computing

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/

Viseme: elementary unit of the positions of the muscles of the human face (3D model)

Experimental settings: small red balls manually placed on the face; 3 to 5 cameras; training corpus

Models based on visemes

Page 18: T7 Embodied conversational agents and affective computing

Torsion model of the face and the neck

The positions of the red balls are set manually

Pattern recognition is used to monitor ball movements

Extraction of key frames automatic and manual

Transitions between expressions

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/

Page 19: T7 Embodied conversational agents and affective computing

Analysis and synthesis

Visemes + 3D model are used to (re-)generate expressions

6 to 11 parameters

Validation by comparing to real expressions

G. Bailly and F. Elisei, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
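The regeneration of expressions between key frames can be sketched as linear interpolation over viseme parameter vectors. The vectors below are invented placeholders, not the 6-to-11 real articulatory parameters of the models described above:

```python
# Linear interpolation between two viseme key frames; the two
# parameter vectors are hypothetical placeholders for illustration.
def blend(viseme_a, viseme_b, t):
    """Return the in-between frame at transition position t in [0, 1]."""
    return [a + t * (b - a) for a, b in zip(viseme_a, viseme_b)]

neutral = [0.0, 0.0, 0.0]   # hypothetical rest position
open_u  = [0.6, 0.2, 0.8]   # hypothetical /u/ viseme
```

A real system would interpolate all validated parameters and drive the 3D face model with the result.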

Page 20: T7 Embodied conversational agents and affective computing

‘Talking head’ project (LIMSI)

Based on visemes. Video corpus of 21 utterances:

« C'est /phoneme/ ici » (e.g. « C'est u ici »)

Page 21: T7 Embodied conversational agents and affective computing

Greta (Prudence, Obadiah, Poppy and Spike) – LTCI, Telecom ParisTech

MARC – LIMSI

Realistic talking heads

Page 22: T7 Embodied conversational agents and affective computing

http://marc.limsi.fr/

Gestural agents

Page 23: T7 Embodied conversational agents and affective computing

Design of realistic gestural agents and avatars

Annotated corpus of videos (ex: Anvil)

Psychological knowledge (Related works)

MARC (LIMSI)

GRETA (LTCI)

Page 24: T7 Embodied conversational agents and affective computing

LEA Agents (LIMSI)
• 2D cartoon characters (110 GIF files)
• Easily integrated into Java applications and JavaScript
• Description language for high-level behaviours

Lea Marco Jules Genius

2D cartoon agents: Lea

Page 25: T7 Embodied conversational agents and affective computing

Deictic gestures, iconic gestures, adapters, emblematic gestures

transparent GIF files

Lea: gestural expressiveness

Page 26: T7 Embodied conversational agents and affective computing

<?xml version='1.0' encoding='utf-8'?>
<configurationsequence nbrconfig="1">
  <configuration>
    <timecode>10</timecode>
    <body>front</body>
    <head>front</head>
    <eyes>open happy</eyes>
    <gaze>middle</gaze>
    <facialExpression>close</facialExpression>
    <bothArms>null</bothArms>
    <leftArm>hello</leftArm>
    <rightArm>hip</rightArm>
    <speech>Hello!</speech>
  </configuration>
</configurationsequence>

Number of configurations = attitudes

An attitude

Specifying attitudes and animations in XML
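Such a specification can be loaded generically; here is a minimal sketch using Python's standard XML parser (the loader is illustrative, not the actual LEA toolkit API):

```python
import xml.etree.ElementTree as ET

# Load an attitude specification into plain dictionaries.
# Element names follow the slide; the loader is a hypothetical sketch.
SPEC = """<configurationsequence nbrconfig="1">
  <configuration>
    <timecode>10</timecode>
    <body>front</body>
    <eyes>open happy</eyes>
    <leftArm>hello</leftArm>
    <speech>Hello!</speech>
  </configuration>
</configurationsequence>"""

def load_attitudes(xml_text):
    root = ET.fromstring(xml_text)
    # one dictionary per <configuration>, i.e. per attitude
    return [{child.tag: child.text for child in conf}
            for conf in root.findall("configuration")]
```

An animation player would then step through the list, firing each attitude at its timecode.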

Page 27: T7 Embodied conversational agents and affective computing

Table: Frédéric Vernier, LIMSI-CNRS

Dynamic Deictic

Page 28: T7 Embodied conversational agents and affective computing

GLA-TEK-ZIM

KUB-OLO-ZIM

Designation of the DOM objects

Page 29: T7 Embodied conversational agents and affective computing

APPLICATIONS

WEB PAGES

AMBIENT

Deploying environments

Page 30: T7 Embodied conversational agents and affective computing

Chat bots in the Internet

Chat bot = “chatterbox” + “robot”: a program capable of NL dialogue

Usually keyword spotting; no dialogue session or context; no embodiment; goal: only chat

ELIZA (J. Weizenbaum, 1965)
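This keyword-spotting behaviour fits in a few lines; the rules below are invented for illustration, not Weizenbaum's original ELIZA script:

```python
import re

# Keyword spotting in the ELIZA style: scan the utterance for a
# known pattern and emit a canned reply; no dialogue state is kept.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bmother\b", re.I), "Tell me more about your family."),
    (re.compile(r"\bhello\b", re.I), "Hello. What is on your mind?"),
]
DEFAULT = "Please go on."

def reply(utterance: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT
```

The absence of any session state is exactly why such bots cannot track context across turns.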

Page 31: T7 Embodied conversational agents and affective computing

Assistant agents (application/web)

Page 32: T7 Embodied conversational agents and affective computing

Social avatars on the web

Social avatars: the recent evolution of communication over the web has prompted the development of so-called “avatars”, graphical entities dedicated to mediating between humans in distributed virtual environments (e.g. “meeting services”)

Contrary to an ECA, which is the personification of a computer program (an artificial agent), an avatar is the personification of a real human, who controls it.

(Avatar = Mask) ≠ Agent

Second Life: presented as a « metaverse » by its developer, Linden Lab, it is a web-based virtual simulation of the social life of ordinary people.

5,000,000 avatars have been created; ~200,000 visitors are logged in at any time

Page 33: T7 Embodied conversational agents and affective computing

Mixed communities

Project : « Le deuxième Monde » 1997-2001• Company: Canal +™• Head: Sylvie Pesty (LIG – Grenoble)• PhD: Guillaume Chicoisne (LIG – Grenoble)

Mixed Community• People chatting and purchasing in a Virtual World• ECA interacting with the people

Embodied Conversational agent

Avatars of users

© CANAL+

© CANAL+

Navigation and chat interface

Page 34: T7 Embodied conversational agents and affective computing

Metaphor: meetings

Definition: computerised environments that transparently integrate humans and artificial agents within meetings

Mixed initiative communities: brainstorm, design, decision, …

Virtual Training Environments (VTE) : training, simulation, …

Deploying environments

Smart Room: physical place with augmented reality

Web-based : distributed and mediated environments

Animated characters

Avatars: virtual characters driven by humans

Agents: virtual characters which embody artificial agents

Mixed communities andHeterogeneous MAS

Page 35: T7 Embodied conversational agents and affective computing

Mixed training environments Avatars ECAs

Task: virtual firemen (fire recognition, activity reports)

Interaction: emotion management, gestures and facial expressions, NL dialogue management

M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/

+

Social interaction in virtual environments

Page 36: T7 Embodied conversational agents and affective computing

The study of the videos highlighted how deictic gestures and gazes towards items are used concomitantly with natural language words ("here", "it", "now", ...)

M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 15 nov 2005 — www.irit.fr/GRIC/VR/

Realistic behaviour of agents

Page 37: T7 Embodied conversational agents and affective computing

From assistant agents to ambient environments

Transporting the Function of Assistance from stand-alone applications to room-based ambient environments

Application taken from the DIVA toolkit (LIMSI)

Page 38: T7 Embodied conversational agents and affective computing

DIVA display devices: HD TV screen, large touch screen, portable touch screen

Presentation of any Internet content Presentation of the IRoom controls

Diva Toolkit + IRoom (LIMSI)

Page 39: T7 Embodied conversational agents and affective computing

DIVA 3D-pointing gestures: 9 loci

Diva Toolkit + IRoom (LIMSI)

Page 40: T7 Embodied conversational agents and affective computing

Pointing objects in the physical world

“Which of the two remote controllers should I use?”

Body zooming for close objects

Diva Toolkit + IRoom (LIMSI)

Page 41: T7 Embodied conversational agents and affective computing

Assistant agents

Page 42: T7 Embodied conversational agents and affective computing

Examples from professional web sites

Question

ECA

EDF Site

P. Suignard, GT ACA, 2004.10.26, http://www.limsi.fr/aca/

Lucie, virtual assistant of SFR

Page 43: T7 Embodied conversational agents and affective computing

A counter-example: the « Clippy effect »

Microsoft Agents™
― Free technology, no support

― 2D cartoon agents

― Can be used on web pages

― Third-party characters (La Cantoche™)

― API for JavaScript & VBScript

Limitations
― Inputs limited to clicks and menus

― No proper dialogue model

― No application model

― No synchronization speech/movements

― Emotions = cartoon ‘emotes’

Intuitive request expressed in NL

2 results ranked too low but quite relevant

Help text corresponding to the first result line

Page 44: T7 Embodied conversational agents and affective computing

« An Assistant Agent is a software tool with the capacity to resolve help requests, issued by novice users, about the static structure and the dynamic functioning of software components or services »

User: person with poor knowledge about the component (novice)

Request: help demand in natural language (speech/text)

Component: computer application, web service, ambient appliance

Agent: rational, assistant, conversational (can be embodied)

Mediator: symbolic model of the structure and the functioning

InterViews Project, February 1999, following Pattie Maes (MIT, 1994)

Assistant agents: definition

Page 45: T7 Embodied conversational agents and affective computing

Assistant agent

Novice user

Symbolic model

Software component

The assistant metaphor

Request

???

Page 46: T7 Embodied conversational agents and affective computing

Human Agent

Software Agent

Modeling Agent

Rational Agent

Linguistic Agent

Assistant ECA

Example: a noun phrase

§Gbox analysis "le tres petit bouton rouge"
GRASP PHASE 1 RULELIST length 74
GRASP PHASE 2 RULELIST length 9

FLEX le  LEM le  POS DD  KEY THE  RTYPE  LEMID 54252
FLEX tres  LEM tres  POS RR  KEY INTLARGE  RTYPE  LEMID 54253
FLEX petit  LEM petit  POS JJ  KEY SIZESMALL  RTYPE  LEMID 54254
FLEX bouton  LEM bouton  POS NN  KEY BUTTON  RTYPE  LEMID 54255
FLEX rouge  LEM rouge  POS JJ  KEY RED  RTYPE  LEMID 54256

HIT: RR: R J tres, petit
  FLEX le  LEM le  POS DD  KEY THE  RTYPE  LEMID 54252
  FLEX petit  LEM petit  POS JJ  KEY SIZESMALL  RMODE:
    FLEX tres  LEM tres  POS RR  KEY INTLARGE  RTYPE  LEMID 54253
    SETK 1  TYPE LEMID 54254
  FLEX bouton  LEM bouton  POS NN  KEY BUTTON  RTYPE  LEMID 54255
  FLEX rouge  LEM rouge  POS JJ  KEY RED  RTYPE  LEMID 54256

HIT: NN: D J NE J le, petit, bouton, rouge
  FLEX bouton  LEM bouton  POS NN  KEY BUTTON  RDET:
    FLEX le  LEM le  POS DD  KEY THE  RTYPE  LEMID 54252
  ATTR:
    FLEX petit  LEM petit  POS JJ  KEY SIZESMALL  RMODE:
      FLEX tres  LEM tres  POS RR  KEY INTLARGE  RTYPE  LEMID 54253
      SETK 1  TYPE LEMID 54254
    FLEX rouge  LEM rouge  POS JJ  KEY RED  RTYPE  LEMID 54256
    SETK 2  TYPE LEMID 54255

Σ Hi

How can I …

Hello !

MODEL[
  ID[main]
  PARTS[ TITLE, FIELD,
    GROUP[ ID[command], PARTS[ BUTTON, BUTTON, BUTTON ] ] ],
  USAGE["This component is …"]
]

Dialogue Session

Profile

The agents of the Daft project

Page 47: T7 Embodied conversational agents and affective computing

Formal answer language

Interaction model

- Session
- Profile

- Task

Rational agent

The agent reasons,

modifies the models

Σ Hi

ASK[COUNT[REF(BUTTON)]]

« How many buttons are there? »

TELL[EQUAL[COUNT[REF(BUTTON)],3]]

« There are three buttons »

Loop:
1. Read a formal request;

2. Contextualize the formal request:
   - Explore the interaction model;
   - Explore the component model.

3. Process the formal request:
   - Update the interaction model;
   - Update the component model.

4. Produce the formal answer.

Component model

- Structure

- Action- Process

Formal request language

The Rational Assisting Agent
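The loop can be sketched as a toy interpreter for the formal request language above; the component model contents are invented for illustration:

```python
import re

# Toy interpreter for the formal request language on the slide:
# ASK[COUNT[REF(BUTTON)]] -> TELL[EQUAL[COUNT[REF(BUTTON)],3]]
# The component model is a hypothetical stand-in for the symbolic model.
COMPONENT_MODEL = {"BUTTON": ["ok", "cancel", "help"], "FIELD": ["name"]}

def answer(request: str) -> str:
    # 1. read the formal request
    m = re.fullmatch(r"ASK\[COUNT\[REF\((\w+)\)\]\]", request)
    if m is None:
        return "TELL[UNKNOWN]"
    kind = m.group(1)
    # 2-3. contextualize and process: explore the component model
    count = len(COMPONENT_MODEL.get(kind, []))
    # 4. produce the formal answer
    return f"TELL[EQUAL[COUNT[REF({kind})],{count}]]"
```

The real agent also updates interaction and component models as a side effect of step 3, which this sketch omits.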

Page 48: T7 Embodied conversational agents and affective computing

Here is …

Standard GUI that can contain Java applets

“Counter” component

“Hanoi” component

“ AMI web site” component

...

LEA

counter

Speech synthesis: Elan Speech
ACA LEA (Java)

J-C Martin & S. Abrilian

Java Interface for ECA: DaftLea (K. LeGuern 2004)

Dialogue with components

Page 49: T7 Embodied conversational agents and affective computing

«restart the counter »

When the user enters « restart the counter » in the chatbox, he/she can see:

- The number increments: 8, 10, 12, 14, …
- LEA says: « Done! »
- LEA expresses the emote: ACKNOWLEDGE

Done!

1) The utterance is analyzed to build a formal request: RESTART[REF[“COUNTER”]]
2) The reference REF is resolved within the model (large red rectangle)
3) The restart action is applied on the boolean of the counter, thus producing the model variable INT[8] to be incremented
4) An update is sent to the boolean variable of the application
5) The behavioural answer « ACKNOWLEDGE » is sent to the virtual agent

MEDIATING MODEL

Processing of a Daft request
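The five steps can be walked through in miniature; the utterance pattern and counter component below are simplified stand-ins for the Daft mediating model:

```python
import re

# Toy walk-through of the five Daft steps; the pattern, the counter
# component and the reset semantics are hypothetical simplifications.
counter = {"running": False, "value": 8}   # toy "Counter" component state

def handle(utterance: str) -> str:
    # 1. analyse the utterance into a formal request
    m = re.match(r"restart the (\w+)", utterance.lower())
    if m is None:
        return "CONFUSED"
    request = ("RESTART", m.group(1).upper())
    # 2. resolve the reference within the mediating model
    target = counter if request[1] == "COUNTER" else None
    if target is None:
        return "CONFUSED"
    # 3-4. apply the action and propagate the update to the component
    target["running"] = True
    # 5. return the behavioural answer for the virtual agent
    return "ACKNOWLEDGE"
```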

Page 50: T7 Embodied conversational agents and affective computing

LIMSI : Diva/DivaLite

Signing agents

Page 51: T7 Embodied conversational agents and affective computing

Usefulness of embodiment

3D realistic agents

2D cartoon

Eyeballs

Humans

Page 52: T7 Embodied conversational agents and affective computing

Efficiency Measures the actual performance of the user-agent pair when carrying out a given task.

Usability Measures the capacity of the user to have a good comprehension of the functioning of the system and the fluency of the control.

Friendliness Measures the “feeling” of the user about the features of the system (attraction, commitment, aesthetics, comfort, …)

Believability Measures the “feeling” of the user about the fact that the agent can understand the user’s problems and has the capacity to help with real competence.

Trust Measures the “feeling” of the user about the fact that the agent behaves as a trustable and cooperative entity.

Objective / Subjective

Evaluation of ECAs

Page 53: T7 Embodied conversational agents and affective computing

Axes: % friendliness vs. % human similarity; the "uncanny valley" lies between the toy and the Bunraku puppet

Masahiro Mori, Tokyo Robotics Institute, in Jasia Reichardt, “Robots are coming”, Thames and Hudson Ltd, 1978

3D realistic agents – 2D cartoon – Eyeballs – Humans

Ratio realism / friendliness

Page 54: T7 Embodied conversational agents and affective computing

WITH AGENT WITHOUT AGENT

30 subjects: 15 M & 15 F, age ≈ 28 years

Presentation with agent

Presentation without agent

Measured improvement of:
- comprehension performance
- recall performance (memory)
- subjective satisfaction questions

• Lester et al. (1997). The Persona Effect: Affective impact of Animated Pedagogical Agents. CHI’97. • van Mulken et al. (1998). The Persona Effect: How substantial is it? HCI’98.

Results

The Persona Effect of Lester

ECA

An arrow

Page 55: T7 Embodied conversational agents and affective computing

61 subjects: 39 M, 22 F

Measured improvement of story recall performance

Subjective evaluation: anthropomorphic questions

Example: “Did you feel the agent was sensitive?”

Experiment: stories are told to subjects in three different situations (without agent, realistic agent, cartoon agent). Recall questions are then asked to the subjects, and the three situations are compared.

Result: an ECA improves the memorization of information.

Ik ben Annita, de assistent bij dit experiment (“I am Annita, the assistant in this experiment”)

Realistic agent

Without agent

Cartoon agent

Beun et al. (2003). Embodied conversational agents: Effects on memory performance and anthropomorphisation. IVA’03.

Persona effect and memorizing

Page 56: T7 Embodied conversational agents and affective computing

Experiments on Functional description

Three different agents:
― 2 men (Marco, Jules) with different clothing
― 1 woman (Lea)

Three strategies of cooperation: ― Redundancy ― Complementarity ― Specialization

Three objects to describe:1. Video recorder remote controller 2. Photocopier control panel3. Application for designing graphic documents

3³ = 27 experiments


Stéphanie Buisine, LIMSI 2004

Page 57: T7 Embodied conversational agents and affective computing

Functional description: evaluation results

Quality of explanation
― No noticeable effect of the appearance
― Noticeable effect of the multimodal behaviour: redundant explanations are preferred

Sympathy
― No noticeable effect of the multimodal behaviour
― Noticeable effect of the appearance:

Performance
― Same effect of the appearance
― But no correlation between sympathy and performance

Impact of the user’s gender
― Men react differentially and prefer redundant explanations
― Women do not react differentially


Stéphanie Buisine, LIMSI 2004

Page 58: T7 Embodied conversational agents and affective computing

Human-ECA Interaction

Page 59: T7 Embodied conversational agents and affective computing

Turing test: conversational intelligence

Turing Test, Alan M. Turing (1912-1954) : "the imitation game" in “Computing Machinery and Intelligence”, Mind, Vol. 59, No. 236, pp. 433-460, 1950.

Can be replaced by a Turing-like test, on output data instead of during direct interaction

Page 60: T7 Embodied conversational agents and affective computing

Wizard of Oz methodology

Wizard of Oz: an experimenter pilots an avatar; the user does not know he/she is in front of a human (he/she thinks he/she faces a virtual agent)

Page 61: T7 Embodied conversational agents and affective computing

Human-Agent interaction

Interaction

Multiple inputs: textual (text boxes), audio (microphone), video (camera), sensors (AmI environment), ...

Multiple outputs: textual, audio, facial expressions, postures, gestures, lighting, ...

Multiple models: conversational, emotional, of the environment, of the user, ...

Page 62: T7 Embodied conversational agents and affective computing

European project SEMAINE:an ECA architecture

http://www.semaine-project.eu/

Page 63: T7 Embodied conversational agents and affective computing

Example of ECA architecture

http://semaine.opendfki.de/wiki/ComponentArchitecture

Semaine Project

Spike

Page 64: T7 Embodied conversational agents and affective computing

Interaction inputs

Camera

Microphone

Sensors

Text box

Microphone + transcript / text box: dialogue input (semantic)

Microphone: emotion

Camera: gestures (semantic), facial expressions / postures (emotions)

Sensors: context, physiological signals (emotions)

Page 65: T7 Embodied conversational agents and affective computing

Human-ECA Dialogue

S. Pesty, GT ACA, 2005.11.15, http://www.limsi.fr/aca/

Page 66: T7 Embodied conversational agents and affective computing

Textual / GUI Embodied

Non-linguistic
• Contextual help: pop-ups, ...
• Online manuals: hypertext navigation
• Assistants using a decision tree

• State printer: emotion, emotes

Unidirectional, linguistic

INPUT
• Query fields embedded in an application window: keywords, free/open questions

OUTPUT
• Feature presenter

Conversational (bidirectional, linguistic)

• Chat bots: Eliza
• Dialogical assistants: InterViews, Daft, Artemis, …

• Mediators: act as representatives of an application or a service
• Avatars

Conversational agents

Page 67: T7 Embodied conversational agents and affective computing

The reference among chatbots: A.L.I.C.E.

Author: Richard S. Wallace
Website: http://www.alicebot.org
Documentation: http://alicebot.org/TR/2001/WD-aiml-1.0.1-20011025-006.html

Page 68: T7 Embodied conversational agents and affective computing

Jabberwacky wins the Loebner Prize 2005

…
PROGRAM: Hello, my name is George and I am the human.
JUDGE: Hi George, cslkjslw weicmslw weos, right?
PROGRAM: Don't repeat your name, csthiang!
JUDGE: If lsjkel lwehlx wouldn't you say?
PROGRAM: No, he's still good.
JUDGE: Who.
PROGRAM: You and me.
JUDGE: If you are human, tell me which is larger, a salt crystal or the Empire State Building.
PROGRAM: It's caring about another person's well being more than your own.
JUDGE: What if I care about another person's well being more than I should?
PROGRAM: How would you do that?
JUDGE: Which is further away from us, Queens or Inner Mongolia?
PROGRAM: HA! HA! That was stupid. Tell me a GOOD joke this time.
JUDGE: You say you are human but you can't answer a simple question.
PROGRAM: Because I am alive.
JUDGE: You are not following the rules. You are supposed to convince me that you are a human, not convince me that you are mysterious.
PROGRAM: I can't very well convince one lacking a mind, can I?
JUDGE: It isn't my mind that is in question, it is yours!
PROGRAM: No my name is not katie.
JUDGE: Which is larger, your brain or Belgium?
PROGRAM: Oh the one on the far left without a doubt.
JUDGE: Francais.
PROGRAM: Yes do you speak it?
JUDGE: Francais.
PROGRAM: Francais?
JUDGE: Francais.
PROGRAM: O well I don't speak french.
JUDGE: Who.
PROGRAM: Mine.
JUDGE: The ljwli is what we want, right?
PROGRAM: I can not do it right now because I m talking to you a million miles away.
JUDGE: What do you think the ljwli.
PROGRAM: Depends on the person.
JUDGE: But the ljwli.
PROGRAM: So they can chuck more wood?
…

Page 69: T7 Embodied conversational agents and affective computing

Dialogue systems

Feedback loop not taken into account: knowing the state of the dialogue (in the dialogue manager, DM) can improve the following stages:

Automatic speech recognizer

Natural language understanding

The DM should manage multimodality

As input (ex: gestures, facial expressions, emotions)

As output (ex: emphasis, voice modulating, utterance+gesture)

Page 70: T7 Embodied conversational agents and affective computing

Utterance understanding

Speech act theory (Searle & Vanderveken, 1985), in accordance with Bratman's theory of intention (Bratman, 1987)

Illocutionary acts (Vanderveken, 1990); formalisation: F(P)

5 categories of speech acts: assertive, commissive, directive, declarative, expressive

Transposition to MAS:

KQML, FIPA ACL
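The F(P) formalisation reduces to a small data structure; here is a sketch, where the proposition syntax is an illustrative placeholder rather than a fixed content language:

```python
# F(P): an illocutionary force F applied to a propositional content P.
# The five forces come from the slide; the proposition is hypothetical.
FORCES = {"assertive", "commissive", "directive", "declarative", "expressive"}

def speech_act(force, proposition):
    if force not in FORCES:
        raise ValueError("unknown illocutionary force: " + force)
    return {"F": force, "P": proposition}

request = speech_act("directive", "close(window)")
```

Agent communication languages such as FIPA ACL follow the same shape: a performative (the force) wrapping a content expression.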

Page 71: T7 Embodied conversational agents and affective computing

Existing dialogue systems

Case-based systems using keyword spotting: chatbots

Communication protocols (FIPA ACL): MAS

Planning systems (Allen 80, Traum 96)

POMDP: call centres (Frampton & Lemon 09)

Dialogue games (Maudet 00)

Information-State update (Larsson 00) : Trindikit, GoDiS, IbiS

A scientific challenge, still unsolved! (see the Mission Rehearsal Exercise, Swartout et al. 2006)

Page 72: T7 Embodied conversational agents and affective computing

Ratio effort/performance

Effort = code and resources (x-axis: 1, 10, 100, 1000); actual performance (y-axis: 10% to 100%)

Chat bots: low effort, ~10% performance (games, socialization, affects, …)

H/M dialogue systems: high effort, higher performance (control, command, assistance, …)

Page 73: T7 Embodied conversational agents and affective computing

Affective Computing

Page 74: T7 Embodied conversational agents and affective computing

What is it? Affective Computing

● Book by R. Picard (1997)
● Detect, interpret, process and simulate human emotions

How is it connected to ECAs?
● (Krämer et al., 2003) People, when interacting with an ECA, tend to be more polite, more nervous, and to behave socially
● (Hess et al., 1999) Emotions play a crucial role in social interactions

→ Affective Conversational Agents!

Page 75: T7 Embodied conversational agents and affective computing

The story so far...

H-M Dialog

Virtual Agents

Eliza ('70s), TRAINS ('90s), online ECAs everywhere ('10s)

Emotions

Video games, serious games, simulation

R. Picard Humaine

IVA

ACII

Agents MAS

CHI

AAMAS

Page 76: T7 Embodied conversational agents and affective computing

Links from the community

IVA (Intelligent Virtual Agents)

http://iva2012.soe.ucsc.edu/

ACII (Affective Computing & Intelligent Interaction)

http://www.acii2011.org/

Humaine
● FP6 project (1/1/2004 – 31/12/2007)
● Association
● http://emotion-research.net/

Page 77: T7 Embodied conversational agents and affective computing

HUMAINE

http://emotion-research.net

Page 78: T7 Embodied conversational agents and affective computing

Research questions

Detect, interpret, process and simulate human emotions → affects

Cognitive Sciences
● AI and psychology (for emotions)
● AI and sociology (for social relations)

3 domains :

Recognition: pattern recognition, image AI, signal processing, fusion, machine learning

Models: formal models of interactions, KR & reasoning, behaviour models

Synthesis: synchronisation, turn taking

Page 79: T7 Embodied conversational agents and affective computing

Examples

Recognition
● Affectiva Q sensor:

http://www.youtube.com/watch?v=1fW3hEvxGUU

Synthesis
● Agent:

http://www.youtube.com/watch?v=CiuoBiJjGG4

● Robot (Hanson) : http://www.youtube.com/watch?v=pkpWCu1k0ZI

Models
● Fatima (LIREC):

http://www.youtube.com/watch?v=ZiPlE80Xiv4

Page 80: T7 Embodied conversational agents and affective computing

Emotion recognition: general principle
● Corpus tagging
● Supervised learning (classifier)

Pipeline: source corpus (images, films, ...) → annotations (e.g. with Anvil) → tagged corpus (tags = emotion categories) → feature extraction (durations, pitches, frequencies, colors, etc.) → supervised learning (e.g. ID3) → decision tree (criteria → classes) → emotion
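A toy stand-in for this pipeline, with invented prosodic features and a nearest-centroid classifier in place of an ID3 decision tree:

```python
# Tagged corpus of (feature vector, emotion) pairs; the features
# (mean pitch in Hz, speech rate) and values are hypothetical.
TAGGED_CORPUS = [
    ((310.0, 6.0), "joy"), ((290.0, 5.5), "joy"),
    ((140.0, 2.5), "sadness"), ((120.0, 2.0), "sadness"),
]

def train(corpus):
    # supervised learning step: one centroid per emotion tag
    by_tag = {}
    for features, tag in corpus:
        by_tag.setdefault(tag, []).append(features)
    return {tag: tuple(sum(col) / len(col) for col in zip(*vecs))
            for tag, vecs in by_tag.items()}

def classify(model, features):
    # label a new signal by its nearest emotion centroid
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda tag: sqdist(model[tag], features))

model = train(TAGGED_CORPUS)
```

A real system would learn an interpretable decision tree over far richer annotated features, but the train-then-classify structure is the same.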

Page 81: T7 Embodied conversational agents and affective computing

Multi-modal behaviour annotation: ANVIL

The ANVIL tool

Page 82: T7 Embodied conversational agents and affective computing

Emotion synthesis

From Emotion to Parameters

Evaluation
● Human recognition of expressed emotions is not reliable, even in H-H interaction (with spontaneous emotions)

Some examples:
● Virtual agents: Greta/Semaine
● Robots: Kismet, iCat, Nao

Pipeline: annotated corpus → parametrization of emotions → expression (state transitions, dynamics, generalization)

Page 83: T7 Embodied conversational agents and affective computing

From annotations to animations

Original video

Anger – Desperation

C. Pélachaud LTCI and J-C Martin LIMSI

GRETA

Joy – Anger – Fear – Surprise – Distress

Page 84: T7 Embodied conversational agents and affective computing

Affective Models

Page 85: T7 Embodied conversational agents and affective computing

A bit of Human Science

Darwin, 1872 (ed. 2001): The Expression of the Emotions in Man and Animals

→Adaptation mechanism

+ Role in communication (Ekman, 1973)

2 schools:
● James, 1887: organism changes → emotion
● Lazarus, 1984; Scherer, 1984: appraisal theory

« The question is not whether intelligent machines can have any emotion, but whether machines can be intelligent without any emotions. »

Marvin Minsky, 1986

Page 86: T7 Embodied conversational agents and affective computing

Appraisal theory

Environment

Individual

Perceptions = stimuli

Goals, preferences, personality

Emotion

Adaptation (aka coping)

Actions

Evaluation in context
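The loop above can be caricatured in a few lines, assuming a toy goal structure; the appraisal rules are illustrative and far simpler than Lazarus's or Scherer's actual models:

```python
# Minimal appraisal sketch: a stimulus is evaluated in the context
# of the individual's goals; rules and labels are hypothetical.
def appraise(stimulus, goals):
    if stimulus["event"] in goals["desired"]:
        return "joy"
    if stimulus["event"] in goals["feared"]:
        # an uncertain threat elicits fear, a certain one distress
        return "fear" if stimulus["certainty"] < 1.0 else "distress"
    return "neutral"

agent_goals = {"desired": {"exam_passed"}, "feared": {"exam_failed"}}
```

A coping step would then follow: act on the environment (active coping) or reinterpret the stimulus (passive coping).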

Page 87: T7 Embodied conversational agents and affective computing

Appraisal theory (2)

Coping (adaptation strategy) = the process that one puts between oneself and the punishing event
● Active coping (i.e. problem-focused) → act upon one's environment
● Passive coping (i.e. emotion-focused) → act upon one's emotion

Rousseau & Hayes-Roth, 1997
● Personality → emotions (Watson & Clark, 1992)

● Emotions → Social relations (Walker, 1997)

Page 88: T7 Embodied conversational agents and affective computing

Architecture

Events

Emotions

Personality

Social relations

Actions

Social role

Mood

Page 89: T7 Embodied conversational agents and affective computing

Architecture

Events

Emotions

Personality

Social relations

Actions

Social role

Mood

1

2

3

Page 90: T7 Embodied conversational agents and affective computing

Models of personality

Personality = the stable component of an individual's behaviour, attitudes, reactions...

Computational models:
● Define variables
● Define their value domains
● Define the characteristic values

Eysenck, 1967
● Extraversion → higher sensitivity to positive emotions (and more expressive)
● Neuroticism → higher sensitivity to negative emotions (and more frequent changes)
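Eysenck's two traits can be read computationally as intensity gains on felt emotions. The formula below is an illustrative assumption, not Eysenck's own model: extraversion amplifies positive emotions, neuroticism amplifies negative ones.

```python
# Illustrative sketch: personality traits as gains on emotion
# intensity (all values assumed in [0, 1]).

def felt_intensity(raw, valence, extraversion, neuroticism):
    """Scale a raw emotion intensity by the relevant trait:
    extraversion for positive valence, neuroticism for negative."""
    gain = 1.0 + (extraversion if valence > 0 else neuroticism)
    return min(1.0, raw * gain)

# The same negative event is felt more strongly by a highly
# neurotic individual than by an emotionally stable one.
calm = felt_intensity(0.4, valence=-1, extraversion=0.5, neuroticism=0.1)
anxious = felt_intensity(0.4, valence=-1, extraversion=0.5, neuroticism=0.9)
assert anxious > calm
```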

Page 91: T7 Embodied conversational agents and affective computing

Models of personality (2)

McCrae & Costa, 1987 (a.k.a. OCEAN or the « Big Five » model)

● Openness → curiosity
● Conscientiousness → organization
● Extraversion → exteriorization
● Agreeableness → cooperation
● Neuroticism → emotional instability

Myers-Briggs Type Indicator (MBTI), based on Jung

● Energy → introverted or extraverted
● Information collection → sensing or intuitive
● Decision → thinking or feeling
● Action → judging or perceiving

Page 92: T7 Embodied conversational agents and affective computing

One example: ALMA (Gebhard, 2005)

OCEAN

Events

Emotions

Mood
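ALMA derives an agent's *default mood* by projecting its OCEAN profile into PAD mood space using Mehrabian's trait-to-PAD regression. The sketch below shows the structure of that computation; the coefficients are quoted from memory and should be checked against Gebhard (2005) before reuse.

```python
# Sketch of ALMA-style default-mood computation: a linear projection
# of the Big Five (OCEAN, each trait in [-1, 1]) into PAD space.
# Coefficients are approximate reconstructions of Mehrabian's
# regression as cited by ALMA; treat them as illustrative.

def default_mood(o, c, e, a, n):
    pleasure  = 0.21 * e + 0.59 * a + 0.19 * n
    arousal   = 0.15 * o + 0.30 * a - 0.57 * n
    dominance = 0.25 * o + 0.17 * c + 0.60 * e - 0.32 * a
    return pleasure, arousal, dominance

# An extraverted, agreeable, emotionally stable profile lands in a
# positive-pleasure, dominant region of mood space.
p, ar, d = default_mood(o=0.4, c=0.8, e=0.6, a=0.5, n=-0.4)
assert p > 0 and d > 0
```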

Page 93: T7 Embodied conversational agents and affective computing

Emotions models

What is an emotion?
● Darwin, James, Ekman, Scherer, …

Characterizing an emotion:
● Category-based approaches → there are N emotions (N being subject to discussion)
● Dimension-based approaches → an emotion is a point in an N-dimensional space (the dimensions must form a basis)

Page 94: T7 Embodied conversational agents and affective computing

The PAD model

Mehrabian, 1980: emotion = 3 dimensions
● Pleasure (or « valence »)
● Arousal (or « activation degree »)
● Dominance

Examples
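Since PAD treats an emotion as a point in a 3-dimensional space, categorizing an emotional state reduces to a nearest-neighbour lookup against reference points. The reference coordinates below are illustrative placements, not Mehrabian's published values.

```python
import math

# Dimensional view of emotion: a point in the PAD cube
# (pleasure, arousal, dominance), each coordinate in [-1, 1].
# Reference placements are illustrative assumptions.
REFERENCE = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.5,  0.6,  0.3),
    "fear":    (-0.6,  0.6, -0.4),
    "sadness": (-0.6, -0.3, -0.3),
}

def nearest_emotion(p, a, d):
    """Label a PAD point with the closest reference emotion."""
    return min(REFERENCE, key=lambda e: math.dist(REFERENCE[e], (p, a, d)))

assert nearest_emotion(0.7, 0.4, 0.5) == "joy"
assert nearest_emotion(-0.5, 0.5, -0.5) == "fear"
```

Note how dominance is what separates fear from anger here: both are unpleasant and highly aroused, but they differ on the third axis, which is the usual argument for needing all three dimensions.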

Page 95: T7 Embodied conversational agents and affective computing

PAD (cont.)

Plutchik's wheel

Page 96: T7 Embodied conversational agents and affective computing

The OCC model

Ortony, Clore & Collins, 1988: 22 emotion categories

● Joy – Distress
● Fear – Hope
● Relief – Disappointment, Fear-confirmed – Satisfaction
● Pride – Shame, Admiration – Reproach
● Love – Hate
● Happy-for – Resentment, Gloating – Pity
● Remorse – Gratification, Gratitude – Anger

Page 97: T7 Embodied conversational agents and affective computing

OCC (cont.)

Appraisal model based on individuals' goals & preferences:

Page 98: T7 Embodied conversational agents and affective computing

One example: EMA

EMotion and Adaptation
→ Appraisal
→ Coping (action selection)

Based on SOAR

Evaluation criteria:
● Relevance
● Point of view
● Desirability (expected utility)
● Probability & expectedness
● Cause
● Control (by me and by others)

(Gratch & Marsella, 2010)

Page 99: T7 Embodied conversational agents and affective computing

EMA (cont.)

Rule-based system
E.g.: Desirability(self, p) > 0 & Likelihood(self, p) < 1.0 → hope
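The hope rule above can be written directly as a small rule-based appraisal function. Only the hope rule comes from the slide; the other three branches are analogous assumptions added to make the sketch complete, and the flat numeric arguments are a simplification of EMA's per-proposition appraisal frames.

```python
# Tiny rule-based appraisal step in the spirit of EMA: map the
# appraisal variables of a proposition p to an emotion label.
# Only the "hope" rule is from the slide; the rest are assumed
# analogues for illustration.

def appraise(desirability, likelihood):
    if desirability > 0 and likelihood < 1.0:
        return "hope"          # desired and still uncertain
    if desirability > 0 and likelihood == 1.0:
        return "joy"           # desired and certain
    if desirability < 0 and likelihood < 1.0:
        return "fear"          # undesired and still uncertain
    return "distress"          # undesired and certain

# "I want p, and p is possible but not certain" -> hope
assert appraise(desirability=0.6, likelihood=0.7) == "hope"
assert appraise(desirability=-0.6, likelihood=0.7) == "fear"
```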

Coping strategies:
● Perceptions
● Belief revision (including about other agents' intentions and responsibilities)
● Changing goals

Simulation model
Application to serious games

Page 100: T7 Embodied conversational agents and affective computing

Social relations

Social behaviour model

Static models: Walker, Rousseau et al., Gratch...

Dynamic model: Ochs et al.

Several traits:
● Liking
● Dominance
● Solidarity (or social distance)
● Familiarity (subject to discussion)

→ A social relation is unidirectional and not always reciprocal!
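The unidirectional nature of these traits is naturally modelled as a *directed* edge between agents. The field semantics and the example values below are illustrative assumptions.

```python
from dataclasses import dataclass

# A social relation as a directed edge: A's relation towards B
# need not equal B's relation towards A.

@dataclass
class SocialRelation:
    liking: float       # affective appreciation of the other
    dominance: float    # power the holder ascribes to themselves
    solidarity: float   # closeness (inverse of social distance)
    familiarity: float  # amount of shared private knowledge

relations = {
    ("employee", "boss"): SocialRelation(0.3, -0.6, 0.2, 0.4),
    ("boss", "employee"): SocialRelation(0.1,  0.6, 0.2, 0.4),
}

# Dominance is not mirrored symmetrically by construction: the two
# directed edges carry independent values.
assert (relations[("employee", "boss")].dominance
        != relations[("boss", "employee")].dominance)
```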

Page 101: T7 Embodied conversational agents and affective computing

Emotions and Social Relations

Influence of emotions on social relations (examples):
● Ortony, 91: causing positive emotions → increased liking
● Pride, Reproach → increased dominance
● Fear, Distress, Admiration → decreased dominance

Influence of the social relation on action and communication

Action selection guided by the target social relation

Emotional contagion → neighbour computation based on social relations
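Emotional contagion weighted by social relations can be sketched as a relaxation step: an agent's emotion intensity drifts towards the intensities of the neighbours it likes most. The update rule below is an illustrative assumption, not a specific published model.

```python
# One contagion step: the agent perceives a liking-weighted average
# of its neighbours' emotion intensities and moves part of the way
# towards it.

def contagion_step(own, neighbours, rate=0.5):
    """neighbours: list of (liking in [0, 1], emotion intensity)."""
    total = sum(liking for liking, _ in neighbours)
    if total == 0:
        return own  # no (liked) neighbours: no contagion
    perceived = sum(liking * e for liking, e in neighbours) / total
    return own + rate * (perceived - own)

# Surrounded by liked, emotionally aroused neighbours, a calm
# agent's intensity rises.
after = contagion_step(0.2, [(0.9, 0.8), (0.5, 0.6)])
assert after > 0.2
```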

Page 102: T7 Embodied conversational agents and affective computing

One example: OSSE (Ochs et al., 2008)

Scenario = set of events

Attitudes

Events → Emotions → Social relation dynamics

Social roles → initial relation

Page 103: T7 Embodied conversational agents and affective computing

Learn more!

Books:
● Rosalind Picard, Affective Computing, MIT Press
● Scherer, Bänziger & Roesch (Eds.), A Blueprint for an Affectively Competent Agent: Cross-fertilization between Emotion Psychology, Affective Neuroscience, and Affective Computing, Oxford University Press

Next IVA conference → next September

humaine-news mailing list!

ACII in 2013

Page 104: T7 Embodied conversational agents and affective computing

Acknowledgements

Jean-Paul Sansonnet (LIMSI) for his presentation materials

All members of GT ACAI

Page 105: T7 Embodied conversational agents and affective computing

Join the GT ACAI!

Conclusion

http://acai.lip6.fr/