Genetic Programming
• John Koza, 1992
• Evolves programs instead of bitstrings
• Lisp program structure is best suited
  – Genetic operators can do simple replacements of subtrees
  – All generated programs can be treated as legal (no syntax errors)
Genetic Programming
• A specialized form of GA
• Manipulates a very specific type of solution using modified genetic operators
• The original application was to design computer programs
• Now applied in alternative areas, e.g. analog circuits
• Does not make a distinction between search space and solution space
• Solutions are represented in a very specific hierarchical manner
Background/History
• John R. Koza, Stanford University
• 1992: “Genetic Programming: On the Programming of Computers by Means of Natural Selection”, the treatise that originated GP
• Combines the idea of machine learning with evolved tree structures
Why Genetic Programming?
• It saves time by freeing the human from having to design complex algorithms.
• Not only designing the algorithms but creating ones that give optimal solutions.
• Again, Artificial Intelligence.
What Constitutes a Genetic Program?
• Starts with “What needs to be done”
• The agent figures out “How to do it”
• Produces a computer program: “breeding” programs
• Fitness testing
• Code reuse
• Architecture design: hierarchies
• Produces results that are competitive with human-produced results
How are Genetic Principles Applied?
• “Breeding” computer programs
• Crossovers
• Mutations
• Fitness testing
Computer Programs as Trees
• Infix/Postfix
• (2 + a) * (4 - num) as a tree:

         *
       /   \
      +     -
     / \   / \
    2   a 4   num
“Breeding” Computer Programs
“Breeding” Computer Programs
• Start off with a large “pool” of random computer programs
• Need a way of arriving at the best solution to the problem using the programs in the “pool”
• Based on the definition of the problem and the criteria specified in the fitness test, mutations and crossovers are used to produce new programs that solve the problem
Mutations
Mutations in Nature
• Ultimate source of genetic variation
• Radiation and chemicals change genetic information
• Cause new genes to be created
• Affect one chromosome
• Asexual
• Very rare
Before:
acgtactggctaa
After:
acatactggctaa
Properties of Mutations
Genetic Programming
1. Randomly generate a combinatorial set of computer programs.
2. Perform the following steps iteratively until a termination criterion is satisfied:
   a. Execute each program and assign a fitness value to each individual.
   b. Create a new population with the following steps:
      i.   Reproduction: Copy the selected program unchanged to the new population.
      ii.  Crossover: Create a new program by recombining two selected programs at a random crossover point.
      iii. Mutation: Create a new program by randomly changing a selected program.
3. Upon termination, the best individuals are deemed the optimal solution.
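The iterative scheme above can be sketched as a toy Python loop over arithmetic trees stored as nested tuples. The target function, survivor fraction, and mutation rate are illustrative assumptions, not settings from Koza's work.

```python
import random

def evaluate(tree, x):
    """Recursively evaluate an arithmetic tree such as ('+', 2, ('*', 'x', 7))."""
    if not isinstance(tree, tuple):
        return x if tree == 'x' else tree
    op, a, b = tree
    left, right = evaluate(a, x), evaluate(b, x)
    if op == '+': return left + right
    if op == '-': return left - right
    if op == '*': return left * right
    return left / right if right != 0 else 1.0   # protected division

def fitness(tree):
    """Toy fitness: negative squared error against the target f(x) = 2x + 1."""
    return -sum((evaluate(tree, x) - (2 * x + 1)) ** 2 for x in range(-5, 6))

def evolve(population, generations, mutate, crossover):
    """Reproduction, crossover and sparing mutation, per step 2 above."""
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:len(population) // 2]   # reproduction (copy)
        children = []
        while len(children) < len(population) - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            child = crossover(p1, p2)                   # predominant operation
            if random.random() < 0.1:                   # mutation, low probability
                child = mutate(child)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)
```

Concrete `mutate` and `crossover` operators are sketched in the mutation and crossover sections; any pair of tree-to-tree functions can be plugged in.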
Mutations in Programs
• A single parental program is probabilistically selected from the population based on fitness.
• A mutation point is randomly chosen:
  – the subtree rooted at that point is deleted, and
  – a new subtree is grown there using the same random growth process that was used to generate the initial population.
• Asexual operations (mutation) are typically performed sparingly, with a low probability.
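A sketch of this subtree-mutation operator on trees stored as nested tuples; `random_tree` stands in for the random growth process mentioned above, and its depth limit and probabilities are assumptions.

```python
import random

# Subtree mutation on trees stored as nested tuples, e.g. ('+', 2, ('*', 'x', 7)).
# random_tree is a stand-in for the growth process used for the initial population.

def random_tree(depth=2):
    """Grow a small random arithmetic tree (assumed terminal/function sets)."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(['x', 1, 2, 3, 4, 5])
    op = random.choice(['+', '-', '*'])
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def mutate(tree, p_descend=0.7):
    """Replace a randomly chosen subtree with a freshly grown one."""
    if not isinstance(tree, tuple) or random.random() > p_descend:
        return random_tree()                 # regrow at the chosen mutation point
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, p_descend), right)
    return (op, left, mutate(right, p_descend))
```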
Crossovers in Programs
Crossovers in Programs
1. Two parental programs are selected from the population based on fitness.
2. A crossover point is randomly chosen in the first and second parent:
   – the first parent is called the receiving parent,
   – the second parent is called the contributing parent.
3. The subtree rooted at the crossover point of the first parent is deleted.
4. It is replaced by the subtree from the second parent.
5. Crossover is the predominant operation in genetic programming (and genetic algorithm) research.
6. It is performed with a high probability (say, 85% to 90%).
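The numbered steps above can be sketched on the same nested-tuple representation; the descent probabilities that pick the crossover points are illustrative assumptions.

```python
import random

# Subtree crossover: a random subtree of the receiving parent is replaced
# by a random subtree of the contributing parent.

def random_subtree(tree):
    """Pick a random subtree (possibly the whole tree)."""
    if isinstance(tree, tuple) and random.random() < 0.7:
        return random_subtree(random.choice(tree[1:]))
    return tree

def crossover(receiving, contributing):
    """Replace one random subtree of `receiving` with one from `contributing`."""
    donor = random_subtree(contributing)
    def replace(tree):
        if not isinstance(tree, tuple) or random.random() < 0.3:
            return donor                     # crossover point reached
        op, left, right = tree
        if random.random() < 0.5:
            return (op, replace(left), right)
        return (op, left, replace(right))
    return replace(receiving)
```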
Using Trees To Represent Computer Programs

(+ 2 3 (* X 7) (/ Y 5))

         +
      /  |  |  \
     2   3  *    /
           / \  / \
          X   7 Y   5

Internal nodes are functions (+, *, /); leaves are terminals (2, 3, X, 7, Y, 5).
Randomly Generating Programs
• Randomly generate a program that takes two arguments and uses basic arithmetic to return an answer
  – Function set = {+, -, *, /}
  – Terminal set = {integers, X, Y}
• Randomly select either a function or a terminal to represent our program
• If a function was selected, recursively generate random programs to act as arguments
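A minimal sketch of this generation procedure, using the function and terminal sets from the slide; the depth limit and the function/terminal choice probability are assumptions, and the sketch uses binary functions for simplicity.

```python
import random

# "Grow"-style generation: pick a function or a terminal; if a function
# was picked, recursively generate its arguments.

FUNCTIONS = ['+', '-', '*', '/']
TERMINALS = ['x', 'y'] + list(range(10))     # integers, X, Y

def random_program(max_depth=3):
    """Randomly select a function or terminal; recurse for arguments."""
    if max_depth == 0 or random.random() < 0.4:
        return random.choice(TERMINALS)
    op = random.choice(FUNCTIONS)
    # binary functions here; Lisp-style variadic ones work the same way
    return (op, random_program(max_depth - 1), random_program(max_depth - 1))
```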
Randomly Generating Programs (step by step)

(+ …)
(+ 2 …)
(+ 2 3 …)
(+ 2 3 (* …) …)
(+ 2 3 (* X 7) (/ …))
(+ 2 3 (* X 7) (/ Y 5))

At each step the tree grows: the root + is chosen first, then its arguments 2, 3, (* X 7) and (/ Y 5) are generated recursively.
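The finished tree uses a four-argument +, so a faithful evaluator must handle Lisp-style variadic operators. A small sketch, with trees written as nested Python lists:

```python
import functools
import operator

# Evaluates Lisp-style variadic prefix trees such as (+ 2 3 (* X 7) (/ Y 5)),
# written here as nested Python lists.

OPS = {
    '+': lambda args: sum(args),
    '*': lambda args: functools.reduce(operator.mul, args, 1),
    '-': lambda args: args[0] - sum(args[1:]),
    '/': lambda args: functools.reduce(operator.truediv, args),
}

def evaluate(tree, env):
    """Leaves are numbers or variable names; internal nodes apply OPS."""
    if isinstance(tree, list):
        op, *args = tree
        return OPS[op]([evaluate(a, env) for a in args])
    return env.get(tree, tree)   # look up variables, pass numbers through

# (+ 2 3 (* X 7) (/ Y 5)) with X=2, Y=10  ->  2 + 3 + 14 + 2 = 21
program = ['+', 2, 3, ['*', 'X', 7], ['/', 'Y', 5]]
```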
Mutation

Start: (+ 2 3 (* X 7) (/ Y 5))

First pick a random node, e.g. the subtree (* X 7). Delete the node and its children, and replace them with a randomly generated program:

After: (+ 2 3 (+ (* 4 2) 3) (/ Y 5))
Crossover

Parent 1: (+ X (* 3 Y))        Parent 2: (- (/ 25 X) 7)

Pick a random node in each program, e.g. the subtree 3 in parent 1 and the subtree (/ 25 X) in parent 2, then swap the two nodes:

Child 1: (+ X (* (/ 25 X) Y))  Child 2: (- 3 7)
What About Just Randomly Generating Programs?
• Is Genetic Programming really better than just randomly creating new functions?
• Yes!– Pete Angeline compared the result of evolving a tic-tac-toe algorithm
for 200 generations, with a population size of 1000 per generation, against 200,000 randomly generated algorithms
– The best evolved program was found to be significantly superior to the best randomly generated program [Genetic Programming FAQ, 2002]
• The key lies in using a fitness measure to determine which functions survive to reproduce in each generation
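Tournament selection is one common way to let the fitness measure decide who reproduces; a minimal sketch (tournament size 3 is an assumed, typical value):

```python
import random

# Fitness-based survival is what separates GP from blind random search.
# Tournament selection: draw k individuals at random, keep the fittest.

def tournament_select(population, fitness, k=3):
    """Return the fittest of k randomly drawn individuals."""
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)
```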
Applications of GP in Robotics
1. Wall-following robot (Koza)
   – Behaviors of Brooks’ subsumption architecture; evolved a new behavior
2. Box-moving robot (Mahadevan)
3. Evolving behavior primitives and arbitrators for the subsumption architecture
4. Motion planning for a hexapod (Fukuda, Hoshino; Levy, PSU)
5. Evolving communication agents (Iba, Ueda)
6. Mobile robot motion control for object tracking (Walker)
7. Soccer
8. Car racing

Population sizes from 100 to 2000
Real World Applications
• Lockheed Martin Missiles and Space Co.: Near-Minimum-Time Spacecraft Maneuvers [Howley, 96]
• GP applied to the problem of rest-to-rest reorientation maneuvers for satellites
• The optimal-time solution is a vector of nonlinear differential equations, which are difficult to solve
• An approximate solution is necessary for a real-time controller
• Results: rest-to-rest maneuver times (8 test cases)
  – Optimal solution: 287.93 seconds
  – Expert solution: 300.3 seconds
  – GP solution: 292.8 seconds
Real World Applications
• Symbolic Regression
• Problem: Given a set of data points, find a mathematical model
http://alphard.ethz.ch/gerber/approx/default.html
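A symbolic-regression fitness can be as simple as the sum of squared residuals over the data points. A sketch, assuming candidate models are Python callables (the tree-to-callable step is omitted, and the sample data is invented):

```python
# Score a candidate model by its error over the data points.
# Lower is better; an exact fit scores 0.

def regression_fitness(model, points):
    """Sum of squared residuals of model(x) against the observed y."""
    return sum((model(x) - y) ** 2 for x, y in points)

# Hypothetical data sampled from the (unknown to GP) target y = 3x + 2
data = [(x, 3 * x + 2) for x in range(-5, 6)]
```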
Real World Applications
• Neural Network Optimization [Zhang, Mühlenbein, 1993]
• Image Analysis [Poli, 1996a]
• Generation of a knowledge base for expert systems [Bojarczuk, Lopes, Freitas, 2000]
• Fuzzy Logic Control [Akbarzadeh, Kumbla, Tunstel, Jamshidi, 2000]
• Hardware Evolution (Field-Programmable Gate Arrays) [Thompson, 1997]
What’s Next?
• Parallel Distributed Genetic Programming [Poli, 1996b]
  – Operates on graphs rather than parse trees
• Finite State Automata
• Asymmetric Recurrent Neural Networks [Pujol, Poli, 1997]
References
• Genetic Programming FAQ, 2002. http://www.cs.ucl.ac.uk/research/genprog/gp2faq/gp2faq.html
• Akbarzadeh, M.R., Kumbla, K., Tunstel, E., Jamshidi, M., “Soft Computing for Autonomous Robotic Systems”, Computers and Electrical Engineering, 2000: 26, pp. 5-32.
• Bojarczuk, C.C., Lopes, H.S., Freitas, A.A., “Genetic Programming for Knowledge Discovery in Chest-Pain Diagnosis”, IEEE Engineering in Medicine and Biology Magazine, 2000: 19, v. 4, pp. 38-44.
• Howley, B., “Genetic Programming of Near-Minimum-Time Spacecraft Attitude Maneuvers”, Proceedings of Genetic Programming 1996, Koza, J.R. et al. (Eds), MIT Press, 1996, pp. 98-109.
• Koza, J., “Genetic Programming as a Means for Programming Computers by Natural Selection”, 1994.
• Poli, R., “Genetic Programming for Image Analysis”, Proceedings of Genetic Programming 1996, Koza, J.R. et al. (Eds), MIT Press, 1996, pp. 363-368.
• Poli, R., “Parallel Distributed Genetic Programming”, Technical report, The University of Birmingham, School of Computer Science, 1996.
• Pujol, J.C.F., Poli, R., “Efficient Evolution of Asymmetric Recurrent Neural Networks Using a Two-dimensional Representation”, 1997.
• Thompson, A., “Artificial Evolution in the Physical World”, Evolutionary Robotics: From Intelligent Robots to Artificial Life, Gomi, T. (Ed.), AAI Books, 1997, pp. 101-125.
• Zhang, B.T., Mühlenbein, H., “Genetic Programming of Minimal Neural Nets Using Occam’s Razor”, Proc. of 5th Int. Conf. on Genetic Algorithms, 1993, pp. 342-349.
Subset of LISP for Genetic Programming

This is a very small subset of Lisp.

LISP: famous first language of Artificial Intelligence and Robotics
• More functions: atom, list, cons, car, cdr, numberp, arithmetic, relations, cond
• Copying example
• A very important concept: Lisp does not distinguish between data and programs
• Programs in Lisp are trees that are evaluated (calculated)
• Tree-reduction semantics
• Lisp allows you to define special languages within itself
Robot-Lisp, cont.
• This subset of Lisp was defined for a mobile robot
• You can define similar subsets for a humanoid robot, robot hand, robot head or robot theatre
• As an exercise, think about the behaviors of all Braitenberg Vehicles described by such programs
• You can define much more sophisticated mutations that are based on constraints and, in general, on your knowledge and good guesses (heuristics)
1. You can define your own languages in Lisp
2. You can write your own Lisp-like language interpreter in C++
3. Many are available on the Web
The EyeSim simulator allows robots, environments and learning strategies to be simulated together, with no real robot.
• Evolution here takes feedback from the simulated environment (simulated)
• In our project with teens, the feedback is from humans who evaluate the robot theatre performance (subjective)
• In our previous project with a hexapod, the feedback was from real measurements in the environment (objective)
• Evolution principles are the same for all evolutionary algorithms.
• Each individual (Lisp program) is executed on the EyeSim simulator for a limited number of program steps.
• The program performance is rated continually or when finished.
Contents
• Introduction
• Affective Computing Research
• Affection Detection and Recognition
• Applications
• Future Research Directions
• Ideas
• Issues
• Conclusion
Current Status
• The Web is developed for traditional data and computer I/O: text, keyboard, mouse
• This is simple and effective, but not a natural way for humans to interact with the world
• Humans interact via the perceptual system
Human Perceptual System
• The human perceptual system has multiple senses (visual, acoustic, haptic: touch, body position, temperature) and actuators (vocal tract, muscles, motor system)
• The perceptual system is intrinsically MULTIMODAL: multiple senses and actuators operate in a perfectly coordinated way
Perceptual Information Technology
• Information technology is evolving towards natural MULTIMODAL human interaction:
  – Touch gestures revolutionized mobile devices
  – Intelligent speech input is available
  – There is more to come: new sensors, cameras and intelligence
Signal Processing Role
• Perceptual Information Technology requires sophisticated signal processing, which is hard due to:
  – complex input signals
  – complex information encoding
  – complex databases of knowledge
• Highly sophisticated algorithms and huge processing power are required
Multimodal Web
• The trend towards perceptual information is noted at the W3C: extending the Web to allow multiple modes of interaction: GUI, speech, vision, pen, gestures, haptic interfaces, ...
• Multimodal Interaction Activity:
  – Multimodal Architecture and Interfaces
  – EMMA
  – InkML
  – EmotionML
Multimodal Architecture
EMMA
• Extensible MultiModal Annotation markup language
  – contains and annotates the interpretation of user input
  – transcription into words of a raw signal, for instance derived from speech or pen
  – the interpretation is to be generated by signal interpretation processes, such as speech and ink recognition, and semantic interpreters
Ink Markup Language (InkML)
• a data format for representing ink
• input and processing of handwriting, gestures, sketches, and music using traces of pen; trace attributes
What is Affective Computing?

Dr. Rosalind Picard of the MIT Media Laboratory coined the term Affective Computing in 1994 and published the first book on Affective Computing in 1997.

According to Picard: “…computing that relates to, arises from, or deliberately influences emotions”

Picard, R. 1995. Affective Computing. M.I.T. Media Laboratory Perceptual Computing Section Technical Report
Picard, R. 1995. Affective Computing. The MIT Press
Affective Computing Motivations and Goals

• Research shows that human intelligence is not independent of emotion. Emotion and cognitive functions are inextricably integrated in the human brain.
• Automatic assessment of the human emotional/affective state.
• Creating a bridge between highly emotional humans and emotionally challenged computer systems/electronic devices: systems capable of responding emotionally.
• The central issues in affective computing are the representation, detection, and classification of users’ emotions.

Norman, D.A. (1981). ‘Twelve issues for cognitive science’
Picard, R., & Klein, J. (2002). Computers that recognize and respond to user emotion: Theoretical and practical implications.
Taleb, T.; Bottazzi, D.; Nasser, N., “A Novel Middleware Solution to Improve Ubiquitous Healthcare Systems Aided by Affective Information”
Affective Computing Research

The research areas of affective computing as visualized by MIT (2001)

Affective computing is related to other computing disciplines such as Artificial Intelligence (AI), Virtual Reality (VR) and Human-Computer Interaction (HCI).
Questions That Need to Be Answered
• What is an affective state (typically feelings, moods, sentiments, etc.)?
• Which human communicative signals convey information about affective state?
• How can various kinds of affective information be combined to optimize inferences about affective states?
• How can affective information be applied to designing systems?

M. Pantic, N. Sebe, J. F. Cohn, and T. Huang, 2005. Affective multimodal human-computer interaction. In ACM International Conference on Multimedia (MM).
Affective Computing Research: Steps Towards Affective Computing
• First, we need to define what we mean when we use the word emotion.
• Second, we need an emotion model that makes it possible to differentiate between emotional states.
• In addition, we need a classification scheme that uses specific features from an underlying (input) signal to recognize the user’s emotions.
• The emotion model has to fit together with the classification scheme used by the emotion recognizer.

R. Sharma, V. Pavlovic, and T. Huang. Toward multimodal human-computer interface. In Proceedings of the IEEE, 1998.
How Is Emotion/Affection Modeled?

According to Boehner et al., in affective computing affect is often seen as another kind of information: discrete units or states internal to an individual that can be transmitted in a loss-free manner from people to computational systems and back.

Affection description perspectives:
• Discrete emotion description: happiness, fear, sadness, hostility, guilt, surprise, interest
• Dimensional description: pleasure, arousal, dominance

Boehner, K., DePaula, R., Dourish, P. & Sengers, P. 2005. Affect: From Information to Interaction
Taleb, T.; Bottazzi, D.; Nasser, N., “A Novel Middleware Solution to Improve Ubiquitous Healthcare Systems Aided by Affective Information”
Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
Burkhardt, F.; van Ballegooy, M.; Engelbrecht, K.-P.; Polzehl, T.; Stegmann, J., “Emotion detection in dialog systems: Applications, strategies and challenges”
Affection Detection and Recognition: Techniques and Methodologies

Affection detection sources:
• Bio-signals (physiological sensors, wearable sensors)
  – brain signals, skin temperature, blood pressure, heart rate, respiration rate
• Facial expression
• Speech/vocal expression
• Gesture
  – limb movements
• Text

Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
Leon, E.; Clarke, G.; Sepulveda, F.; Callaghan, V., “Optimised attribute selection for emotion classification using physiological signals”
Affection Detection and Recognition: Techniques and Methodologies

Affection recognition modalities:
• Unimodal
  – primitive technique
• Multimodal
  – provides a more natural style of communication

Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
Zhihong Zeng; Pantic, M.; Roisman, G.I.; Huang, T.S., “A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions”
Affection Recognition Method: Voice / Speech

Paralinguistic features of speech (how is it said?):
• Prosodic features (e.g., pitch-related features, energy-related features, and speech rate)
• Spectral features (e.g., MFCC, Mel-frequency cepstral coefficients, and cepstral features)
• Spectral tilt, LFPC (Log Frequency Power Coefficients)
• F0 (fundamental frequency of speech), long-term spectrum
• Studies show that pitch and energy contribute the most to affect recognition
• Speech disfluencies (e.g., filler and silence pauses)
• Context information (e.g., subject, gender, and turn-level features representing local and global aspects of the dialogue)
• Nonlinguistic vocalizations (e.g., laughs and cries; decode other affective signals such as stress, depression, boredom, and excitement)

Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
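Two of the simplest cues above, short-time energy and a crude pitch/voicing correlate (zero-crossing rate), can be computed directly. This is a toy sketch on a synthetic tone, not an MFCC pipeline; the sample rate and frequencies are assumed.

```python
import math

# Short-time energy and zero-crossing rate: two elementary prosodic cues.
# Real systems use MFCCs and robust pitch trackers.

def short_time_energy(frame):
    """Mean squared amplitude of one frame."""
    return sum(s * s for s in frame) / len(frame)

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs that change sign."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / (len(frame) - 1)

# Synthetic 100 Hz tone sampled at 8 kHz (assumed parameters)
frame = [math.sin(2 * math.pi * 100 * n / 8000) for n in range(800)]
```

A higher-pitched frame crosses zero more often, which is why ZCR works as a rough voicing/pitch indicator.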
Affection Recognition Method: Speech Recognition Architecture

Speech Signal → Pre-processing → Feature Extraction → Classification → Classified Result

Audio recordings are collected in call centers, meetings, Wizard-of-Oz scenarios, interviews and other dialogue systems.

• Accuracy rates from speech are somewhat lower (35%) than from facial expressions for the basic emotions.
• Sadness, anger, and fear are the emotions best recognized through voice, while disgust is the worst.

M. Pantic, N. Sebe, J. F. Cohn, and T. Huang. Affective multimodal human-computer interaction. In ACM International Conference on Multimedia (MM), 2005.
Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
Affection Recognition Method: Facial Expression

(Example images from [25] and [27])
Affection Recognition Method: Facial Expression

Example: Active Appearance Model (AAM) [26]

An AAM-based system uses AAMs to track the face and extract visual features. Support vector machines (SVMs) are used to classify the facial expressions and emotions.
Affection Recognition Method: Physiological/Bio-Signals

• Physiological signals derive from the Autonomic Nervous System (ANS) of the human body.
  – Fear, for example, increases heartbeat and respiration rates, causes palm sweating, etc. [8]
• Physiological metrics used are [23]:
  – GSR: Galvanic Skin Resistance
  – RESP: Respiration
  – BVP: Blood Volume Pulse
  – Skin temperature
• Electroencephalogram (EEG), Electrocardiography (ECG), Electrodermal Activity (EDA), Electromyogram (EMG) [8][9][23]
• Skin conductivity sensors, blood volume sensors, and respiration sensors may be integrated with shoes, earrings or watches, and T-shirts [8][9]

(Figure from [24])
Affection Recognition Method: Gesture / Body Motion

Pantic et al.’s survey shows that gesture and body motion information is an important modality for human affect recognition. The combination of face and gesture is 35% more accurate than facial expression alone [21].

Two categories of body-motion-based affect recognition [22]:
• Stylized: the entirety of the movement encodes a particular emotion.
• Non-stylized: more natural movements such as knocking on a door, lifting a hand, walking, etc.

Example: applying the SOSPDF (shape of signal probability density function) feature description framework to captured 3D human motion data [22]
Frequently Used Modeling Techniques
• Fuzzy Logic
• Neural Networks (NN)
• Hybrid: Fuzzy + NN
• Tree augmented Naïve Bayes
• Hidden Markov Models (HMM)
• K-Nearest Neighbors (KNN)
• Linear Discriminant Analysis (LDA)
• Support Vector Machines (SVM)
• Gaussian Mixture Models (GMM)
• Discriminant Function Analysis (DFA)
• Sequential Forward Floating Search (SFFS)
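Several of these classifiers are only a few lines in their simplest form. A toy K-Nearest-Neighbors sketch over made-up physiological feature vectors; the features (GSR, heart rate, skin temperature), values, and labels are all invented for illustration.

```python
import math
from collections import Counter

# Toy KNN affect classifier: label a query feature vector by majority
# vote among its k nearest labeled neighbors (Euclidean distance).

def knn_classify(samples, query, k=3):
    """Majority label among the k nearest labeled feature vectors."""
    by_distance = sorted(samples, key=lambda s: math.dist(s[0], query))
    labels = [label for _, label in by_distance[:k]]
    return Counter(labels).most_common(1)[0][0]

# Hypothetical training data: (GSR, heart rate, skin temperature) -> label
training = [
    ((0.8, 95, 36.9), 'stressed'), ((0.7, 90, 37.0), 'stressed'),
    ((0.9, 99, 36.8), 'stressed'), ((0.2, 62, 36.5), 'calm'),
    ((0.3, 65, 36.4), 'calm'),     ((0.1, 60, 36.6), 'calm'),
]
```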
Emotion Markup Language
• Annotation of material involving emotionality
• Automatic recognition of emotions from sensors
• Generation of emotion-related system responses: speech, music, colors, gestures, synthetic faces
• Emotion vocabularies and representations:

<emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <category name="surprise" confidence="0.9"/>
</emotion>
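An EmotionML fragment like the one above can be produced and consumed with standard XML tooling; a sketch using Python's standard library, with namespace handling simplified for illustration:

```python
import xml.etree.ElementTree as ET

# Parse the slide's EmotionML fragment and read its attributes back.
emotionml = """\
<emotion category-set="http://www.w3.org/TR/emotion-voc/xml#big6">
  <category name="surprise" confidence="0.9"/>
</emotion>"""

root = ET.fromstring(emotionml)
category = root.find('category')
```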
Emotion Representation: Computing and Communication

W3C standard for emotion representation: Emotion Markup Language (EmotionML) 1.0 [20]
Applications
• In the security sector, affective behavioural cues play a crucial role in establishing or detracting from credibility
• In the medical sector, affective behavioural cues are a direct means to identify when specific mental processes are occurring
• Neurology (studies on the dependence between emotion dysfunction or impairment and brain lesions)
• Psychiatry (studies on schizophrenia and mood disorders)
• Dialog/automatic call center environments: to reduce user/customer frustration
• Academic learning
• Human-Computer Interaction (HCI)

Zhihong Zeng; Pantic, M.; Roisman, G.I.; Huang, T.S., “A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions”, Pattern Analysis and Machine Intelligence
Future Research Directions
• So far, context has been overlooked in most affective computing research
• Collaboration among affect researchers from different disciplines
• Fast real-time processing
• Multimodal detection and recognition to achieve higher accuracy
• On/off focus
• Systems that can model conscious and subconscious user behaviour

Rafael A. Calvo, Sidney D'Mello, “Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications”
Context Aware Multimodal Affection Analysis Based Smart Learning Environment

Application System Architecture (components):
• Input analysis modules: Face Analysis, Voice Analysis, Posture Analysis, Physiology Analysis
• System Controller and Hardware Calibration Manager
• Decision Support System
• Multimodal affect input parameter adjustment
• Output: Multimedia Note, Reading Behavior Report, Lesson Length Suggestion, Class Efficiency Report
Driver Emotion Aware Multiple Agent Controlled Automatic Vehicle

Inputs (feature detectors and estimators):
• Audio: linguistic / non-linguistic
• Facial expression
• Seat pressure
• Actions: steering movement, interaction with gas/brake pedal
• Bio-signals
• Estimated states: stress level, basic emotions, complex emotions

Agents (with inter-agent communication to aid decision making):
• Navigation Agent: route selection
• Safety Agent: notify in case of emergency; speed, ABS, traction control; alert the driver
• Driving Aid Agent
• Affective Multimedia Agent: music, climate control
Affective Computing: Concerned Issues

• Privacy concerns [4][5]
  – I do not want the outside world to know what goes through my mind… Twitter is the limit
• Ethical concerns [5]
  – Robot nurses or caregivers capable of effective feedback
  – Risk of misuse of the technology in the hands of impostors
  – Computers may start to make emotionally distorted, harmful decisions [18]
• Complex technology
  – Effectiveness is still questionable; risk of false interpretation
Conclusion

Strategic Business Insight (SBI):
“Ultimately, affective-computing technology could eliminate the need for devices that today stymie and frustrate users…
Affective computing is an important development in computing, because as pervasive or ubiquitous computing becomes mainstream, computers will be far more invisible and natural in their interactions with humans.” [4]

Toyota’s thought-controlled wheelchair [19]
References
[1] Picard, R. 1995. Affective Computing. M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 321
[2] Picard, R. 1995. Affective Computing. The MIT Press. ISBN-10: 0-262-66115-2.
[3] Picard, R., & Klein, J. (2002). Computers that recognize and respond to user emotion: Theoretical and practical implications. Interacting with Computers, 14, 141-169.
[4] http://www.sric-bi.com/
[5] Bullington, J. 2005. ‘Affective’ computing and emotion recognition systems: The future of biometric surveillance? Information Security Curriculum Development (InfoSecCD) Conference '05, September 23-24, 2005, Kennesaw, GA, USA.
[6] Boehner, K., DePaula, R., Dourish, P. & Sengers, P. 2005. Affect: From Information to Interaction. AARHUS’05, 8/21-8/25/05, Århus, Denmark.
[7] Zeng, Z. et al. 2004. Bimodal HCI-related Affect Recognition. ICMI’04, October 13-15, 2004, State College, Pennsylvania, USA.
[8] Taleb, T.; Bottazzi, D.; Nasser, N., "A Novel Middleware Solution to Improve Ubiquitous Healthcare Systems Aided by Affective Information," IEEE Transactions on Information Technology in Biomedicine, vol.14, no.2, pp.335-349, March 2010
[9] Khosrowabadi, R. et al. 2010. EEG-based emotion recognition using self-organizing map for boundary detection. International Conference on Pattern Recognition, 2010.
[10] R. Cowie, E. Douglas, N. Tsapatsoulis, G. Votsis, S. Kollias, W. Fellenz and J. G. Taylor, Emotion Recognition in Human-Computer Interaction. IEEE Signal Processing Magazine, vol. 18, pp. 32-80, 2001.
[11] Rafael A. Calvo, Sidney D'Mello, "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications," IEEE Transactions on Affective Computing, pp. 18-37, January-June 2010
[12] Zhihong Zeng; Pantic, M.; Roisman, G.I.; Huang, T.S., "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.31, no.1, pp.39-58, Jan. 2009
[13] Norman, D.A. (1981). ‘Twelve issues for cognitive science’, Perspectives on Cognitive Science, Hillsdale, NJ: Erlbaum, pp.265-295.
[14] R. Sharma, V. Pavlovic, and T. Huang. Toward multimodal human-computer interface. In Proceedings of the IEEE, 1998.
[15] Vesterinen, E. (2001). Affective Computing. Digital media research seminar, spring 2001: “Space Odyssey 2001”.
[16] Burkhardt, F.; van Ballegooy, M.; Engelbrecht, K.-P.; Polzehl, T.; Stegmann, J., "Emotion detection in dialog systems: Applications, strategies and challenges," Affective Computing and Intelligent Interaction and Workshops, 2009. ACII 2009. 3rd International Conference on, pp.1-6, 10-12 Sept. 2009
[17] Leon, E.; Clarke, G.; Sepulveda, F.; Callaghan, V., "Optimised attribute selection for emotion classification using physiological signals," Engineering in Medicine and Biology Society, 2004. IEMBS '04. 26th Annual International Conference of the IEEE, vol.1, pp.184-187, 1-5 Sept. 2004
[19] http://www.engadget.com/2009/06/30/toyotas-mind-controlled-wheelchair-boast-fastest-brainwave-anal/
[20] http://www.w3.org/TR/2009/WD-emotionml-20091029/
[21] M. Pantic, N. Sebe, J. F. Cohn, and T. Huang. Affective multimodal human-computer interaction. In ACM International Conference on Multimedia (MM), 2005.
[22] Gong, L., Wang, T., Wang, C., Liu, F., Zhang, F., and Yu, X. 2010. Recognizing affect from non-stylized body motion using shape of Gaussian descriptors. In Proceedings of the 2010 ACM Symposium on Applied Computing (SAC '10), Sierre, Switzerland, March 22-26, 2010. ACM, New York, NY, 1203-1206.
[23] Khalili, Z.; Moradi, M.H., "Emotion recognition system using brain and peripheral signals: Using correlation dimension to improve the results of EEG," Neural Networks, 2009. IJCNN 2009. International Joint Conference on, pp.1571-1575, 14-19 June 2009
[24] Huaming Li and Jindong Tan. 2007. Heartbeat driven medium access control for body sensor networks. In Proceedings of the 1st ACM SIGMOBILE international workshop on Systems and networking support for healthcare and assisted living environments (HealthNet '07). ACM, New York, NY, USA, 25-30.
[25] Ghandi, B.M.; Nagarajan, R.; Desa, H., "Facial emotion detection using GPSO and Lucas-Kanade algorithms," Computer and Communication Engineering (ICCCE), 2010 International Conference on, pp.1-6, 11-12 May 2010
[26] Lucey, P.; Cohn, J.F.; Kanade, T.; Saragih, J.; Ambadar, Z.; Matthews, I., "The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression," Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference on, pp.94-101, 13-18 June 2010
[27] Ruihu Wang; Bin Fang, "Affective Computing and Biometrics Based HCI Surveillance System," Information Science and Engineering, 2008. ISISE '08. International Symposium on, vol.1, pp.192-195, 20-22 Dec. 2008