[MEME] Framework [Mind Experiences & Models Experimenter]


Page 1: Mind Experiences Models Experimenter Framework

[MEME] Framework [Mind Experiences & Models Experimenter]

Page 2: Mind Experiences Models Experimenter Framework

[Abstract]

Page 3: Mind Experiences Models Experimenter Framework

The Mind Experiences and Models Experimenter [MEME] framework uses noninvasive electroencephalography (EEG), through commercial and open brain-computer interface devices, to record the user's brain activity in the context of a specific experiment, analyzing it with machine learning models to search for patterns and singular events according to the objectives of the test. [Experiment] is the conceptual module of the framework that allows the design, recording, and playback of test cases based on visual, audio, and other types of internal or external stimuli. [Emotions] was the first experience recorded; the target was predicting "liking" and "disliking" valence and arousal reactions to affective pictures. [Music] is the module that sends notes over the Musical Instrument Digital Interface (MIDI), sequencing brain-wave values to the tempo of an editable pad sequencer; additionally, with auto-frequency mode on, the software automatically translates microvolt signals into sounds according to a frequency equivalence table.

Page 4: Mind Experiences Models Experimenter Framework

Thinking about thinking…

https://prezi.com/c4fygfpd1ssg/research-project-master-um/

Page 5: Mind Experiences Models Experimenter Framework

http://en.wikipedia.org/wiki/List_of_cognitive_biases

Page 6: Mind Experiences Models Experimenter Framework
Page 7: Mind Experiences Models Experimenter Framework
Page 8: Mind Experiences Models Experimenter Framework
Page 9: Mind Experiences Models Experimenter Framework
Page 10: Mind Experiences Models Experimenter Framework

…everybody is singular!

Page 11: Mind Experiences Models Experimenter Framework

• Reason (make judgments under uncertainty)

• Consciousness

• Represent knowledge (also common sense)

• Learn (critical to human intelligence)

• Communicate (natural language)

• Self-awareness, sentience, sapience...

How do I think?

http://en.wikipedia.org/wiki/Cognitive_science

Cognitive

Affective

Conative

Natural tendency, impulse… ?

Emotions

Page 12: Mind Experiences Models Experimenter Framework

Mind loop = (sensation + perception + action) * emotion

Page 13: Mind Experiences Models Experimenter Framework

• Sensation: The transformation of external events into neural activity;

• Perception: Processing of sensory information; we believe that the end result is a useful representation in terms of the external objects that produced the sensations;

• Action: Organisms use the representation of the world in order to act on it, optimizing rewards and minimizing punishments;

• Emotion: Often the driving force behind motivation, positive or negative.

Neural processing mechanism

Emotion

Page 14: Mind Experiences Models Experimenter Framework

Somatic marker hypothesis (SMH)

Emotions, as defined by Damasio, are changes in both body and brain states in response to different stimuli. … The somatic marker hypothesis proposes that emotions play a critical role in the ability to make fast, rational decisions in complex and uncertain situations.

http://en.wikipedia.org/wiki/Somatic_marker_hypothesis

Ventromedial Prefrontal Cortex

Page 15: Mind Experiences Models Experimenter Framework

Pattern Recognition Theory of Mind

• Kurzweil describes a series of thought experiments which suggest to him that the brain contains a hierarchy of pattern recognizers. Based on this he introduces his Pattern Recognition Theory of Mind. He says the neocortex contains 300 million very general pattern recognition circuits and argues that they are responsible for most aspects of human thought. He also suggests that the brain is a "recursive probabilistic fractal"…

http://en.wikipedia.org/wiki/How_to_Create_a_Mind

Page 16: Mind Experiences Models Experimenter Framework

EEG Devices

http://www.openbci.com/

http://emotiv.co/

http://www.neurosky.com/

???

Page 17: Mind Experiences Models Experimenter Framework
Page 18: Mind Experiences Models Experimenter Framework
Page 19: Mind Experiences Models Experimenter Framework
Page 20: Mind Experiences Models Experimenter Framework

Brain Computer Interface

• Any BCI has four components:

– Signal Acquisition: getting information from the brain; the user performs a task that produces a distinct EEG signature for that BCI;

– Signal Processing: translating information into messages or commands;

• Feature Extraction: salient features are extracted from the EEG;

– Translation Algorithm: a pattern classification system uses these EEG features to determine which task the user performed;

– Operating Environment: the BCI presents feedback to the user, and forms a message or command;

• Devices: robotic devices; raise events or commands in other systems;
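The four components above can be sketched as a minimal, hypothetical pipeline; the function names and the trivial thresholding "classifier" are illustrative stand-ins, not part of any real BCI SDK:

```python
import numpy as np

def acquire_signal(n_channels=14, n_samples=128):
    """Signal Acquisition: stand-in for one second of EEG from the device."""
    return np.random.randn(n_channels, n_samples)

def extract_features(eeg):
    """Feature Extraction: one salient feature per channel via the FFT magnitude."""
    spectrum = np.abs(np.fft.rfft(eeg, axis=1))
    return spectrum.mean(axis=1)

def translate(features, threshold):
    """Translation Algorithm: a trivial stand-in for the pattern
    classification system that decides which task the user performed."""
    return "task_A" if features.mean() > threshold else "task_B"

eeg = acquire_signal()
command = translate(extract_features(eeg), threshold=10.0)
print(command)  # the Operating Environment would turn this into feedback
```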

Page 21: Mind Experiences Models Experimenter Framework

[Framework]

Page 22: Mind Experiences Models Experimenter Framework

Problem Statement

• Is it possible to make experiments based on sensorial action/reaction stimuli, searching EEG datasets for singular events or features related to the specific objective of the experience?

• Is it possible to detect human emotions from brain signals?

• Is it possible to hear and see quantified representations of our thoughts?

Page 23: Mind Experiences Models Experimenter Framework

Front End

Framework: Experiment, Emotion, Music

Back End
Languages: C#, Python, R, Java

Page 24: Mind Experiences Models Experimenter Framework

[Emotion]

Page 25: Mind Experiences Models Experimenter Framework

Challenge

– Objective
• Design and execute an experiment to predict a basic human emotion, applying ML algorithms and measuring their confidence through scores; identify basic valences through a single-source stimulus to record the datasets required for training and testing the models;

– Given
• Mind Experience Dataset = Spatial + Energy + Time = inputs by sensors, live or recorded

– Return
• Emotion (Like/Dislike)

– Solution space
• (EPOC, max) 14 electrodes x 128 Hz/sec, -70 mVolts to 6000
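A quick arithmetic check of the solution space, assuming the stated EPOC maximums and a ~4-minute experience:

```python
# Estimate the size of one EPOC recording session from the specs above.
SAMPLE_RATE_HZ = 128      # samples per second per electrode
N_ELECTRODES = 14
DURATION_S = 4 * 60       # ~4-minute experience

records_per_electrode = SAMPLE_RATE_HZ * DURATION_S
total_records = records_per_electrode * N_ELECTRODES

print(records_per_electrode)  # 30720
print(total_records)          # 430080
```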

Page 26: Mind Experiences Models Experimenter Framework

All sensor locations of the 10-20 system

Page 27: Mind Experiences Models Experimenter Framework

• Default experiment of the framework; visual stimuli resource type; using the Geneva affective picture database (GAPED) to predict attraction emotions (Liking/Disliking);

• The mind experience:
– Collect EEG data from 13 subjects;
– Using the Emotiv device with 14 electrodes located at AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4;
– Using 223 pictures with associated valence and arousal marks; an experience of ~4 min = 30720 records per electrode, 1 frame/sec;
– The 3 most important channels are AF3, F4, and FC6 (O. Sourina, Y. Liu);
– It was shown (Jones and Fox, 1992; Canli et al., 1998) that the left hemisphere was more active during positive emotions, and the right hemisphere was more active during negative emotions.

• To test this binary hypothesis, we collected data from the AF3 electrode, which is located on the left hemisphere, and from the F4 electrode, which is located on the right hemisphere.

Experimental procedure and resources
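The left/right hemisphere hypothesis above can be sketched as a simple comparison of mean activity; the data here is synthetic, standing in for the recorded AF3 and F4 channels:

```python
import numpy as np

# Synthetic stand-ins for mean-shifted AF3 (left) and F4 (right) activity
# during positive and negative pictures; not the recorded dataset.
rng = np.random.default_rng(2)
af3_pos, af3_neg = rng.normal(5, 1, 1000), rng.normal(4, 1, 1000)
f4_pos, f4_neg = rng.normal(4, 1, 1000), rng.normal(5, 1, 1000)

# The hypothesis: left more active for positive, right for negative.
left_when_positive = af3_pos.mean() > af3_neg.mean()
right_when_negative = f4_neg.mean() > f4_pos.mean()
print(left_when_positive, right_when_negative)
```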

Page 28: Mind Experiences Models Experimenter Framework

Recording the mind experience

Page 29: Mind Experiences Models Experimenter Framework

Boxplot with the raw data summary of all subjects; the values of the sensors F7, FC5, P7, O2, T8, F4, and AF4 are invalid.

Page 30: Mind Experiences Models Experimenter Framework

Approach

• Align mind experiences (correction);

• Select dataset filters according to the implicit marks to identify:

– Test data;

– Train data;

• For each ML model tested:

– For each subject:

• Create the ML model (random forest);

• Calculate the score;
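The per-subject loop above could look roughly like this, assuming scikit-learn's RandomForestClassifier and synthetic features standing in for each subject's recording:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data standing in for the FFT features of each subject's EEG.
rng = np.random.default_rng(0)
subjects = {f"subject_{i}": (rng.normal(size=(200, 14)),      # features
                             rng.integers(0, 2, size=200))    # like/dislike
            for i in range(3)}

scores = {}
for name, (X, y) in subjects.items():
    # In practice the implicit marks would define the train/test split.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    scores[name] = model.score(X_test, y_test)   # accuracy per subject

print(scores)
```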

Page 31: Mind Experiences Models Experimenter Framework

Plot with the mean of the raw data summary of all the subjects, and boxplot with the means of all the subjects classified by positive and negative emotions, raw data summary.

Page 32: Mind Experiences Models Experimenter Framework

Plot of positive and negative pictures by subject, with the maximums, minimums, and means of the AF3 sensor raw data.

Page 33: Mind Experiences Models Experimenter Framework

Plot of the recorded values of all the valid sensors for the one subject with a well-defined emotional transition peak; and plot of the recorded values of only the AF3 sensor for the same subject.

Page 34: Mind Experiences Models Experimenter Framework

To build the random forest model, and following best practices in the time-series analysis of brain waves, the dimensionality of the raw training data was reduced; the strategy was to apply an FFT.
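A minimal sketch of this kind of FFT dimensionality reduction, assuming the EPOC's 128 Hz sampling rate; the band boundaries below are common EEG conventions, not values from the dissertation:

```python
import numpy as np

SAMPLE_RATE = 128                       # EPOC samples per second
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(window):
    """Collapse one 1-D raw EEG window into five band-power features."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    power = np.abs(np.fft.rfft(window)) ** 2
    return np.array([power[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in BANDS.values()])

# A pure 10 Hz tone should land entirely in the alpha band.
window = np.sin(2 * np.pi * 10 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
features = band_powers(window)
print(features.argmax())                # 2, the index of "alpha"
```

One second of raw data (128 values) is reduced to 5 features per channel, which is the kind of shrinkage that makes the random forest tractable.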

Page 35: Mind Experiences Models Experimenter Framework

Heatmap of the sensor correlations; the sensor variables AF3, F3, FC6, and F8 are the most correlated.
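The correlation matrix behind such a heatmap can be computed with np.corrcoef; the channels below are synthetic, sharing a common signal, and stand in for the recorded sensors:

```python
import numpy as np

rng = np.random.default_rng(1)
sensors = ["AF3", "F3", "FC6", "F8"]
base = rng.normal(size=500)
# Correlated channels: each is the shared signal plus its own noise.
data = np.column_stack([base + 0.3 * rng.normal(size=500) for _ in sensors])

corr = np.corrcoef(data, rowvar=False)   # 4 x 4 correlation matrix
print(np.round(corr, 2))                 # off-diagonal values near 0.9
```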

Final result of this classifier:

Page 36: Mind Experiences Models Experimenter Framework

[Experiment]

Page 37: Mind Experiences Models Experimenter Framework

• By allowing the configuration of specific test cases (experiments) based on visual, audio, and other external stimuli, through sequences of images, symbols, sounds, and language, it is possible to search for singular events (marks) by applying machine learning algorithms to build models that search for patterns in the datasets recorded with the EEG devices.

Hypothesis

Page 38: Mind Experiences Models Experimenter Framework

[MEME] loop
Mind Experiences Models Experimenter

• Sensation: Signal acquisition from EEG sensors (live, or recorded in EDF format) with "event marks" (M), either implicit (from the parameters of the configured experience) or explicit (manually sent by the user);

• Perception: Run machine learning models using the inputs to predict the output (M);

• Action: Using an event manager, any time the model predicts input values related to a specific mark associated with the experience, a command is triggered so it can interact with other systems;

• Emotion: Implementation of a simple valence emotion model (inspired by the OCC model).

Page 39: Mind Experiences Models Experimenter Framework

Framework Components

BCI Sensors Layer: Emotiv EPOC SDK/Insight, Neurosky MindWave

Core Layer (Model Experimenter, Mind Experiences): SignalAcquisitionAdapter, SignalProcessingManager

Application Interface Layer: FramesUI, UserProfileManager, ExperiencesManager, EventsManager, ModelManager, TemplatesFactory

Page 40: Mind Experiences Models Experimenter Framework

MEME = State of Mind (Score%)

Page 41: Mind Experiences Models Experimenter Framework

Components description

• User Profile Manager:
– CRUD of the login related to the citizen scientist;

• Signal Processing Manager:
– Allows the dynamic configuration of the input EEG dataset, setting the columns (features) and rows (time) that will be used to train the model;
– Applies FFT to the features expressed in raw data, reducing the dimensionality of the input EEG dataset;

• Experience Manager:
– Frames UI:
• Design and edit the parameters of the experiment:
– Name and description of the mind experience;
– Type of stimuli or task to analyze;
– Main sense stimuli;
– Total of frame tasks;
– Time of each frame task;
– BCI device;
– Sensors output (.csv, .edf, NoSQL cloud DB);
– Edit frame task: a window form with customized image, audio, video, and text, also setting how to catch and record specific mouse and keyboard events sent by the user;
– Template Factory:
• Presets with a library of templates from Frames UI saved experiences;

Page 42: Mind Experiences Models Experimenter Framework

Components description

• Model Manager:
– Library of ML algorithms linked with IronPython and R: nearest neighbor classifiers, linear classifiers, nonlinear Bayesian classifiers, support vector machine classification, hidden Markov models, and neural networks;
– Select and set up the algorithm to validate and compare scores;
– Use the recorded input EEG dataset to train the model selected from the library, automatically scoring all the possible chunks of data according to a specific sampling window related to the objective of the experiment;

• Events Manager:
– Record mode:
• Run the selected Frames UI sequence, recording the EEG stream from the BCI device;
– Play mode:
• Run the Frames UI sequence with the recorded EEG stream content and predict the target of the mind experience, in real time, according to the selected ML model;
– Live mode:
• Run the Frames UI sequence with the live EEG stream from the BCI device and predict the target of the mind experience, in real time, according to the selected ML model;
– Add marks manually while recording experiences, to measure stimuli from other senses (e.g., taste, external events).

Page 43: Mind Experiences Models Experimenter Framework

[Music]

Page 44: Mind Experiences Models Experimenter Framework

[MEME] music

Page 45: Mind Experiences Models Experimenter Framework

Part of the temperament table created for the auto-frequency mode of [Music], which automatically translates the EEG signal into music by synchronizing the natural value (Hz) of the brain waves with the note and the octave, using two different models based on the difference of the distance.

Page 46: Mind Experiences Models Experimenter Framework

Exponential difference between the musical notes based on Hz distance.
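In equal temperament each semitone multiplies the frequency by 2^(1/12), which is why the Hz distance between adjacent notes grows exponentially. A sketch of the kind of Hz-to-note lookup that [Music]'s auto-frequency mode performs; the exact mapping in the framework's temperament table may differ:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def hz_to_note(freq_hz):
    """Map a frequency to the nearest equal-temperament note and octave."""
    semitones_from_a4 = round(12 * math.log2(freq_hz / 440.0))
    midi = 69 + semitones_from_a4            # MIDI note number of A4 is 69
    return NOTE_NAMES[midi % 12] + str(midi // 12 - 1)

print(hz_to_note(440.0))   # A4
print(hz_to_note(261.63))  # C4, middle C
print(hz_to_note(10.0))    # a 10 Hz alpha wave maps far below the keyboard
```

Because raw brain-wave frequencies sit below the audible range, a practical mapping would transpose them up by whole octaves (multiplying by powers of 2), which preserves the note name.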

Page 47: Mind Experiences Models Experimenter Framework

[Music] next steps:

Page 48: Mind Experiences Models Experimenter Framework

Conclusions

The framework uses a simple and effective approach to record and analyze information about brain activity in the context of practically any action/reaction experiment with a well-defined and specific target. Finding patterns in the datasets recorded in the context of the emotional experiment of likes and dislikes was a hard task, as demonstrated by the low score of the machine learning algorithm applied (random forest). The artistic module implemented for the creation of music opens many possibilities for musicians searching for a more natural form of expression in their live performances.

Page 49: Mind Experiences Models Experimenter Framework

Conclusions

The [MEME] framework is in a continuous process of development that could be accelerated with the help of more developers once the source code is published in an open software repository. Future work and improvements for the next versions: finish the development of the [Experiment] module, including an automatic method for selecting the best part of the dataset to train the models; use a cycle that compares scores automatically and avoids overfitting the model; promote a new public dataset-recording session for the [Emotion] experiment with more than 100 subjects; improve the user interface of [Music]; and start to develop the [Dream] experience.

Page 50: Mind Experiences Models Experimenter Framework

Vision

Software technologies that mix virtual and augmented reality with brain-computer interfaces represent the user interfaces of the future. Detecting human emotions will be the best input for complex affective computing systems that can, for example, regulate the speed of an autonomous car according to the stress level of the passenger, or change the environment of an entire home according to the user's state of mind.

Page 51: Mind Experiences Models Experimenter Framework

peoplesingularity.com

https://www.facebook.com/MEMEFramework

Page 52: Mind Experiences Models Experimenter Framework

[bonus]

[trAIck]

Page 53: Mind Experiences Models Experimenter Framework

Types of Artificial Intelligence

Page 54: Mind Experiences Models Experimenter Framework

Types of Artificial Intelligence

Page 55: Mind Experiences Models Experimenter Framework

MEME Dissertation

Review

Types of Artificial Intelligence

Page 56: Mind Experiences Models Experimenter Framework

Review

Types of Artificial Intelligence

Page 57: Mind Experiences Models Experimenter Framework

Strong AI definition