luis-anjos
[MEME] Framework [Mind Experiences & Models Experimenter]
[Abstract]
The Mind Experiences and Models Experimenter [MEME] framework uses noninvasive electroencephalography through commercial and open brain-computer interface devices to record information about the user's brain activity in the context of any specific experiment, analyzing it with machine learning models to search for patterns and singular events according to the objectives of the test. [Experiment] is the conceptual module of the framework that allows the design, recording and playback of test cases based on visual, audio and other types of internal or external stimuli. [Emotions] was the first experience recorded; the target was predicting "liking" and "disliking" valence and arousal reactions to affective pictures. [Music] is the module that sends notes over the Musical Instrument Digital Interface, sequencing brain-wave values with the tempo of an editable pad sequencer; with auto-frequency mode on, the software automatically translates microvolt signals into sounds according to a frequency equivalence table.
Thinking about thinking…
https://prezi.com/c4fygfpd1ssg/research-project-master-um/
http://en.wikipedia.org/wiki/List_of_cognitive_biases
…everybody is singular!
• Reason (make judgments under uncertainty)
• Consciousness
• Represent knowledge (also commonsense)
• Learn (critical to human intelligence)
• Communicate (natural language)
• Self-awareness, Sentience, Sapience...
How do I think?
http://en.wikipedia.org/wiki/Cognitive_science
Cognitive
Affective
Conative
Natural tendency, impulse… ?
Emotions
Mind loop = (sensation + perception + action) * emotion
• Sensation: The transformation of external events into neural activity;
• Perception: Processing of sensory information; we believe that the end result is a useful representation in terms of the external objects that produced the sensations;
• Action: Organisms use the representation of the world in order to act on it, optimizing rewards and minimizing punishments;
• Emotion is often the driving force behind motivation, positive or negative.
Neural processing mechanism
Emotion
Somatic marker hypothesis (SMH)
Emotions, as defined by Damasio, are changes in both body and brain states in response to different stimuli. … the somatic marker hypothesis proposes that emotions play a critical role in the ability to make fast, rational decisions in complex and uncertain situations.
http://en.wikipedia.org/wiki/Somatic_marker_hypothesis
Ventromedial
Prefrontal
Cortex
Pattern Recognition Theory of Mind
• Kurzweil describes a series of thought experiments which suggest to him that the brain contains a hierarchy of pattern recognizers. Based on this he introduces his Pattern Recognition Theory of Mind. He says the neocortex contains 300 million very general pattern recognition circuits and argues that they are responsible for most aspects of human thought. He also suggests that the brain is a "recursive probabilistic fractal"…
http://en.wikipedia.org/wiki/How_to_Create_a_Mind
EEG Devices
http://www.openbci.com/
http://emotiv.co/
http://www.neurosky.com/
???
Brain Computer Interface
• Any BCI has four components:
– Signal Acquisition: getting information from the brain; the user performs a task that produces a distinct EEG signature for that BCI;
– Signal Processing: translating information to messages or commands;
• Feature Extraction: salient features are extracted from the EEG;
– Translation Algorithm: a pattern classification system uses these EEG features to determine which task the user performed;
– Operating Environment: the BCI presents feedback to the user, and forms a message or command;
• Devices: robotic devices; raise events or commands in other systems;
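The four components above can be sketched as a minimal pipeline. This is an illustrative sketch only: the function names and the trivial threshold classifier are hypothetical stand-ins, not the framework's actual API, and the "device read" is simulated with random data.

```python
import numpy as np

def acquire_signal(n_channels=14, n_samples=128):
    """Signal Acquisition: stand-in for a device read (random data here)."""
    return np.random.randn(n_channels, n_samples)

def extract_features(eeg):
    """Signal Processing / Feature Extraction: mean spectral energy per channel."""
    spectrum = np.abs(np.fft.rfft(eeg, axis=1))
    return spectrum.mean(axis=1)          # one salient feature per channel

def classify(features, threshold=0.0):
    """Translation Algorithm: trivial rule standing in for a trained classifier."""
    return "task_A" if features.mean() > threshold else "task_B"

def feedback(command):
    """Operating Environment: present the result / raise an event."""
    return f"BCI command: {command}"

print(feedback(classify(extract_features(acquire_signal()))))
```

In a real BCI the classifier would be trained on labeled EEG signatures; here the pipeline only shows how the four stages chain together.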
[Framework]
Problem Statement
• Is it possible to make experiments based on sensorial action/reaction stimuli, searching EEG datasets for singular events or features related to the specific objective of the experience?
• Is it possible to detect human emotions from brain signals?
• Is it possible to hear and see quantified representations of our thoughts?
Front End
Framework · Experiment · Emotion · Music
Back End
Languages: C#, Python, R, Java
[Emotion]
Challenge
– Objective
• Design and execute an experiment to predict a basic human emotion, applying ML algorithms and measuring their confidence through scores; identify basic valences through one source stimulus to record the datasets required for training and testing the models;
– Given
• Mind Experience Dataset = Spatial + Energy + Time = inputs by sensors, live or recorded
– Return
• Emotion (Like/Dislike)
– Solution space
• (EPOC, max) 14 electrodes x 128 Hz/sec, -70 mVolts to 6000
All sensor locations of the 10-20 system
• Default experiment of the framework; visual stimuli resource type; using the Geneva affective picture emotion database (GAPED) to predict attraction emotions Liking/Disliking;
• The mind experience:
– Collect EEG data from 13 subjects;
– Using the Emotiv device with 14 electrodes located at AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4;
– Using 223 pictures with associated valence and arousal marks; an experience of ~4 min = 30720 records x electrode, 1 frame/sec;
– The 3 most important channels are AF3, F4 and FC6 (O. Sourina, Y. Liu);
– (Jones and Fox, 1992; Canli et al., 1998) showed that the left hemisphere was more active during positive emotions, and the right hemisphere was more active during negative emotions.
• To test this binary hypothesis, we collected data from the AF3 electrode, located on the left hemisphere, and from the F4 electrode, located on the right hemisphere.
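The lateralization hypothesis can be checked with a simple comparison of per-trial activity on the two electrodes. The values below are synthetic stand-ins chosen only to illustrate the test; the real experiment would use the recorded AF3/F4 data.

```python
import numpy as np

# Hypothetical per-trial mean amplitudes for the two electrodes during
# positive pictures; synthetic values for illustration, not recorded data.
rng = np.random.default_rng(0)
af3_positive = rng.normal(5.0, 1.0, 100)   # AF3, left hemisphere
f4_positive = rng.normal(4.0, 1.0, 100)    # F4, right hemisphere

# The hypothesis predicts higher left-hemisphere (AF3) activity
# during positive emotions than right-hemisphere (F4) activity:
left_dominant = af3_positive.mean() > f4_positive.mean()
print("Left hemisphere more active for positive stimuli:", left_dominant)
```

A real analysis would also run a significance test rather than compare raw means, but the structure of the comparison is the same.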
Experimental procedure and resources
Recording the mind experience
Boxplot with the raw data summary of all subjects;
the values of the sensors F7, FC5, P7, O2, T8, F4 and AF4 are invalid;
Approach
• Align mind experiences (correction);
• Select dataset filters according to the implicit marks to identify:
– Test data;
– Train data;
• For any ML model tested
– For any singular subject:
• Create ML Model (random forest);
• Calculate score;
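The per-subject loop above can be sketched with scikit-learn's random forest. The subject data here is synthetic and the split/parameters are assumptions for illustration, not the dissertation's actual settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the per-subject EEG feature datasets; in the real
# experiment the features come from the recorded sensor streams.
rng = np.random.default_rng(42)
subjects = {
    f"subject_{i}": (rng.normal(size=(200, 14)),       # 14 electrode features
                     rng.integers(0, 2, 200))          # like/dislike labels
    for i in range(3)
}

scores = {}
for name, (X, y) in subjects.items():
    # Split according to the marks into train and test data
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    # Create the ML model (random forest) and calculate its score
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_tr, y_tr)
    scores[name] = model.score(X_te, y_te)

print(scores)
```

With random features and labels, the scores hover around chance (0.5), which mirrors the low-score outcome the conclusions report for the real experiment.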
Plot with the mean of the raw data summary of all the subjects, and boxplot with the means of all the subjects classified by positive and negative emotions (raw data summary);
Plot of positive and negative pictures by subject with the maximums, minimums and means of
the AF3 sensor raw data;
Plot of recorded values of all the valid sensors for the one subject with a well-defined emotional transition peak;
And plot of recorded values only for the AF3 sensor for the same subject;
To build the random forest model, and following best practices in time-series analysis of brain waves, the dimensionality of the raw training data was reduced; the strategy was to apply an FFT.
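One common way to apply this strategy is to keep only the first few FFT magnitude coefficients of each window of raw samples. The sketch below assumes a window of one second at the EPOC's 128 Hz sampling rate; the number of coefficients kept is an illustrative choice, not taken from the dissertation.

```python
import numpy as np

def fft_features(eeg_window, n_coeffs=8):
    """Reduce a raw EEG window to its first FFT magnitude coefficients.

    eeg_window: 1-D array of raw sensor values.
    n_coeffs: how many low-frequency coefficients to keep (assumed value).
    """
    spectrum = np.abs(np.fft.rfft(eeg_window))
    return spectrum[:n_coeffs]

# One second of data at the EPOC sampling rate of 128 Hz:
window = np.sin(2 * np.pi * 10 * np.arange(128) / 128)  # synthetic 10 Hz wave
features = fft_features(window)
print(features.shape)   # 128 raw samples reduced to 8 features
```

Each window of 128 raw values becomes an 8-dimensional feature vector, which is what keeps the random forest's input tractable.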
Heatmap of the sensor correlations; the sensor variables AF3, F3, FC6 and F8 are the most correlated
Final result of this classifier:
[Experiment]
• By allowing the configuration of specific test cases (experiments) based on visual, audio and other external stimuli, through sequences of images, symbols, sounds and language, it is possible to search for singular events (marks), applying machine learning algorithms to build models that search for patterns in the datasets recorded with the EEG devices.
Hypothesis
[MEME] loop
Mind Experiences & Models Experimenter
• Sensation: Signal acquisition from EEG sensors (live or recorded in EDF format) with "event marks" (M) regarding the parameters of the configured experience (implicit) or manually sent by the user (explicit);
• Perception: Run machine learning models using the inputs to predict the output (M);
• Action: Using an event manager, any time the model predicts input values related to a specific mark associated with the experience, a command is triggered to interact with other systems;
• Emotion: Implementation of a simple valence emotion model (inspired by the OCC Model).
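The sensation/perception/action part of the loop amounts to streaming samples through a model and firing events on predicted marks. This sketch uses hypothetical names and a toy threshold "model"; it is not the framework's real event manager.

```python
# Minimal sketch of the [MEME] loop; names and the trivial "model"
# are hypothetical stand-ins for the framework's components.

def run_meme_loop(samples, model, mark, on_event):
    """Sensation -> Perception -> Action over a stream of EEG samples."""
    for sample in samples:             # Sensation: live or recorded stream
        prediction = model(sample)     # Perception: ML model predicts a mark
        if prediction == mark:         # Action: event manager fires a command
            on_event(sample)

events = []
threshold_model = lambda s: "M" if s > 100 else None   # toy classifier
run_meme_loop([42, 150, 7, 230], threshold_model, "M", events.append)
print(events)   # -> [150, 230], the samples that triggered the mark
```

The `on_event` callback is where a real deployment would raise commands toward other systems, as the Action bullet describes.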
Framework Components
BCI Sensors Layer: Emotiv EPOC SDK/Insight
Core Layer
Model Experimenter
Mind Experiences
SignalAcquisitionAdapter
SignalProcessingManager
Application Interface Layer: FramesUI
UserProfileManager
ExperiencesManager
EventsManager
ModelManager
Neurosky MindWave
TemplatesFactory
MEME = State of Mind (Score%)
Components description
• User Profile Manager:
– CRUD of logins related to the citizen scientist;
• Signal Processing Manager:
– Allows the dynamic configuration of the input EEG dataset, setting the columns (features) and rows (time) that will be used to train the model;
– Applies FFT to the features expressed in raw data, reducing the dimensionality of the input EEG dataset;
• Experience Manager:
– Frames UI:
• Design and edit the parameters of the experiment:
– Name and description of the mind experience;
– Type of stimuli or task to analyze;
– Main sense stimuli;
– Total of frame tasks;
– Time of any frame task;
– BCI device;
– Sensors output (.csv, .edf, NoSQL cloud DB);
– Edit frame task: window form with customized image, audio, video and text, also setting how to catch and record specific mouse and keyboard events sent by the user;
– Template Factory:
• Presets with a library of templates from Frames UI saved experiences;
Components description
• Model Manager:
– Library of ML algorithms linked with IronPython and R: nearest neighbor classifiers, linear classifiers, nonlinear Bayesian classifiers, support vector machine classification, hidden Markov models and neural networks;
– Select and set up the algorithm to validate and compare scores;
– Use the recorded input EEG dataset to train the model selected from the library, automatically scoring all the possible chunks of data according to a specific sampling window related to the objective of the experiment;
• Events Manager:
– Record mode:
• Run the selected Frames UI sequence, recording the EEG stream from the BCI device;
– Play mode:
• Run the Frames UI sequence with the recorded EEG stream content and predict the target of the mind experience in real time, according to the selected ML model;
– Live mode:
• Run the Frames UI sequence with the live EEG stream from the BCI device and predict the target of the mind experience in real time, according to the selected ML model;
– Manually add marks while recording the experiences to measure stimuli from other senses (e.g. taste, external events).
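The Model Manager's "score all possible chunks under a sampling window" step is essentially a sliding-window sweep. The sketch below is hypothetical: `score_fn` stands in for training/evaluating the selected ML model on one chunk, and the toy data and window size are illustrative only.

```python
import numpy as np

def score_windows(signal, labels, window, score_fn):
    """Score every possible chunk of the stream under a sampling window."""
    scores = []
    for start in range(0, len(signal) - window + 1):
        chunk = signal[start:start + window]
        chunk_labels = labels[start:start + window]
        scores.append(score_fn(chunk, chunk_labels))
    return scores

signal = np.arange(10, dtype=float)
labels = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
# Toy score: fraction of positive marks inside each window
scores = score_windows(signal, labels, window=4,
                       score_fn=lambda x, y: y.mean())
print(len(scores))   # 7 overlapping windows over 10 samples
```

Sweeping every window and keeping the best-scoring chunk is one simple way to locate the part of the recording most related to the experiment's objective.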
[Music]
[MEME] music
Part of the temperament table created for the auto-frequency mode of [Music], which automatically translates the EEG signal into music by synchronizing the natural value (Hz) of the brain waves with the note and the octave, using two different models based on the difference of the distance;
Exponential difference between the musical notes based on Hz distance;
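One standard way to build such a frequency equivalence table is the equal-temperament mapping from frequency to MIDI note number, where each semitone multiplies frequency by 2^(1/12) (the exponential difference mentioned above). This is an illustrative sketch; the dissertation's actual temperament table may differ.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_note(freq_hz):
    """Map a frequency in Hz to the nearest equal-temperament note and octave.

    Uses the standard MIDI convention: A4 = 440 Hz = note number 69, and
    each semitone step multiplies frequency by 2**(1/12).
    """
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1
    return f"{name}{octave}"

# Brain-wave bands (roughly 0.5-40 Hz) lie far below audible notes, so a
# real mapping would transpose them upward; shown here with audible pitches:
print(hz_to_note(440.0))   # A4
print(hz_to_note(261.63))  # C4 (middle C)
```

Once the note number is known, sending it over MIDI is a single note-on message, which is how the module could sequence brain-wave values into an instrument.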
[Music] next steps:
Conclusions
The framework uses a simple and effective approach to record and analyze brain-activity information in the context of practically any action/reaction experiment with a well-defined and specific target. Finding patterns in the datasets recorded in the context of the emotional likes/dislikes experiment was a hard task, as demonstrated by the low score of the machine learning algorithm applied (random forest). The artistic module implemented for the creation of music opens many possibilities for musicians searching for a more natural expression in their live performances.
The [MEME] framework is in a continuous process of development that could be accelerated with the help of more developers once the source code is published in an open software repository. Future work and improvements for the next versions: finish the development of the [Experiment] module, including the implementation of an automatic method for selecting the best part of the dataset to train the models; use a cycle that compares scores automatically and avoids overfitting the model; promote a new public dataset-recording session for the [Emotion] experiment with more than 100 subjects; improve the user interface of [Music]; and start developing the [Dream] experience.
Vision
Software technologies that mix virtual and augmented reality with brain-computer interfaces represent the user interfaces of the future; detecting human emotions will be the best input for complex affective-computing systems that can, for example, regulate the speed of an autonomous car considering the stress level of the passenger, or change the environment of an entire home according to the user's state of mind.
peoplesingularity.com
https://www.facebook.com/MEMEFramework
[bonus]
[trAIck]
Types of Artificial Intelligence
MEME Dissertation Review
Strong AI definition