Transcript
Page 1: Emotiv Experimenter: an experimentation and “mind-reading” application for Emotiv EPOC (source: psy-normanweb.princeton.edu/experimenter/ExperimenterPoster.pdf)

The Basic Experiment

[Figure: experiment flow — Training Phase: Instructions Display → Attention Task → Train Classifier; Test Phase: Instructions Display → Subject Chooses Display → Classifier Predicts Choice]

The Emotiv Headset

• Consumer-focused EEG device
• Intended for brain-computer interface applications
• Produces lower-quality data than traditional EEG setups, but is much cheaper and easier to use

Project Goals

• Develop an application for running neuroscience experiments and collecting EEG data using Emotiv
• Incorporate offline and online analysis of collected data
• Validate experiment design, data quality, and application usability through experiments

Experiment                 | Example Stimulus | Choice 1                                  | Choice 2
Eyes Open vs. Eyes Closed  | —                | Keep your eyes open                       | Close your eyes
Artist vs. Function        | “Cable”          | Artist: visualize the noun                | Function: try to think of uses for the noun
Faces vs. Math Expressions | 1 + (2 * 9) - 3  | Focus on the face: is it male or female?  | Evaluate the mathematical expression
Other Experiments          | n/a              |                                           |

The Faces vs. Scenes Experiment

Data Collection

• An experiment consists of trials, during which the subject views either a face or a scene

• The application collects EEG data during each trial

• Some trials are labeled: the subject was told which stimulus to view

• Other trials are unlabeled: the subject chose which stimulus to view
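As a concrete illustration of how one trial's data might be represented in code, here is a minimal Python sketch. The array shape follows the headset's 14 channels at 128 samples/second described elsewhere on the poster; the field names and the use of Python are assumptions for illustration, not the Experimenter implementation.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class Trial:
    """One trial's worth of EEG data (hypothetical layout for illustration)."""
    eeg: np.ndarray        # shape (14 channels, n_samples), sampled at 128 samples/second
    label: Optional[str]   # "face" or "scene" for labeled trials, None for unlabeled


# Example: a 2-second labeled trial and a 2-second unlabeled trial of random placeholder data
labeled = Trial(eeg=np.random.randn(14, 256), label="face")
unlabeled = Trial(eeg=np.random.randn(14, 256), label=None)
```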

Learning

• Data preprocessing improves signal-to-noise ratio of collected data

• Machine-learning algorithm uses data from labeled trials to learn how to differentiate between face and scene trials using EEG data

Prediction

• Trained learning algorithm is used to classify EEG data from unlabeled trials

• Result is a prediction of the subject’s choice, i.e., mind-reading (a minimal train/predict sketch follows)
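A minimal sketch of this train-on-labeled / predict-on-unlabeled flow, using an off-the-shelf L2-penalized logistic regression in the spirit of the PLR algorithm discussed later on the poster. The feature matrices, sizes, and random placeholder data are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature matrices: one row of EEG-derived features per trial.
rng = np.random.default_rng(0)
X_labeled = rng.standard_normal((40, 200))     # 40 labeled trials, 200 features each
y_labeled = rng.integers(0, 2, size=40)        # 0 = "scene", 1 = "face"
X_unlabeled = rng.standard_normal((10, 200))   # 10 trials where the subject chose the stimulus

# Learning: fit a penalized logistic regression on the labeled trials only.
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
model.fit(X_labeled, y_labeled)

# Prediction ("mind reading"): classify the unlabeled trials.
predicted_choices = model.predict(X_unlabeled)
print(predicted_choices)
```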



Emotiv Experimenter: an experimentation and “mind-reading” application for Emotiv EPOC By: Michael Adelson, Computer Science ’11

Advisor: Ken Norman, PSY/NEU

Conclusion

• The Experimenter application makes it easy for anyone to run a variety of interesting experiments with Emotiv
• The collected results suggest that Emotiv’s data is of sufficient quality for this sort of work
• The Experimenter code base provides useful modules for working with Emotiv which could be used in other research applications
• The Experimenter code, report, and documentation are available at http://www.princeton.edu/~madelson/experimenter

EEG Data

• Raw data consists of 1 voltage time series per headset electrode (14 channels × 128 samples/second)
• Extremely noisy due to connectivity issues, concurrent mental processes, motion artifacts, and even AC current in the mains
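As an illustration of this data layout and of suppressing the mains component mentioned above, here is a small Python/SciPy sketch. It is not taken from the Experimenter code; the 60 Hz mains frequency and the filter settings are assumptions.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 128           # Emotiv EPOC sampling rate in samples/second
N_CHANNELS = 14    # one voltage time series per electrode

# Hypothetical raw recording: 10 seconds of data, shape (channels, samples).
raw = np.random.randn(N_CHANNELS, 10 * FS)

# Suppress mains interference with a notch filter (60 Hz in the US; 50 Hz elsewhere).
b, a = iirnotch(w0=60.0, Q=30.0, fs=FS)
filtered = filtfilt(b, a, raw, axis=1)
```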

Artifact Rejection

• Blinks (right) and other facial movements are a major source of noise

• Goal: detect and discard trials with these “motion artifacts”

• Experimenter does this online and informs the user!
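The poster does not describe the detection method Experimenter uses; as a simplified stand-in, a trial can be flagged when its peak-to-peak amplitude on any channel exceeds a threshold. The threshold value and units below are illustrative assumptions.

```python
import numpy as np

def has_motion_artifact(trial_eeg: np.ndarray, threshold_uv: float = 100.0) -> bool:
    """Flag a trial whose peak-to-peak amplitude on any channel exceeds a threshold.

    trial_eeg: array of shape (n_channels, n_samples) in microvolts.
    Simplified stand-in for artifact detection; the threshold is illustrative.
    """
    peak_to_peak = trial_eeg.max(axis=1) - trial_eeg.min(axis=1)
    return bool(np.any(peak_to_peak > threshold_uv))

# Online use: check each trial as it completes and warn/discard if flagged.
trial = np.random.randn(14, 256) * 20.0
if has_motion_artifact(trial):
    print("Trial rejected: possible blink or facial movement")
```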


Data Analysis

When a subject’s eyes close (above), there are changes in the steady-state properties of the EEG signal (e.g., amplitude and frequency)

By time-locking faces vs. scenes signals to the onset of the stimulus display (above), time-dependent responses to the stimuli can be compared across trials. However, these responses are heavily obscured by noise
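The poster does not spell out the exact analysis; one standard way to exploit time-locking is to average the epochs of each condition across trials so that noise not locked to the stimulus cancels out. The array shapes below are illustrative assumptions.

```python
import numpy as np

# Hypothetical epoched data: trials time-locked to stimulus onset,
# shape (n_trials, n_channels, n_samples).
face_trials = np.random.randn(30, 14, 256)
scene_trials = np.random.randn(30, 14, 256)

# Averaging across trials of the same condition suppresses noise that is not
# time-locked to the stimulus, making condition-specific responses easier to compare.
face_avg = face_trials.mean(axis=0)     # shape (14, 256)
scene_avg = scene_trials.mean(axis=0)   # shape (14, 256)

difference = face_avg - scene_avg       # per-channel, per-timepoint contrast
```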

Algorithm Requirements

• There are typically more features per trial than there are trials in an experiment, so a learning algorithm must be robust to over-fitting
• To be suitable for online use, an algorithm must be able to train relatively quickly

Algorithms

• K-Nearest Neighbors classifies an unlabeled example by taking the weighted vote of the K labeled examples which are “closest” to the unlabeled example (a minimal sketch of this weighted vote follows the list)

• Linear Discriminant Analysis finds a projection of multiple EEG channels onto one data channel such that the resulting time series is very different between the two experimental conditions

• Penalized Logistic Regression (PLR) fits the data to a logistic curve while penalizing large regression weights to prevent over-fitting

• AdaBoost intelligently combines the predictions of multiple “weak” learning algorithms for an improved result
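A minimal sketch of the weighted K-Nearest Neighbors vote described in the first bullet. The inverse-distance weighting, the value of K, and the random placeholder data are assumptions for illustration, not the Experimenter implementation.

```python
import numpy as np

def knn_predict(x, X_labeled, y_labeled, k=5):
    """Weighted K-Nearest Neighbors vote (didactic sketch).

    x: feature vector for one unlabeled trial; X_labeled: (n, d); y_labeled: 0/1 labels.
    Each of the k closest labeled examples votes for its class, weighted here by
    the inverse of its distance to x.
    """
    dists = np.linalg.norm(X_labeled - x, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-9)      # closer neighbors get larger votes
    vote_for_1 = np.sum(weights * (y_labeled[nearest] == 1))
    vote_for_0 = np.sum(weights * (y_labeled[nearest] == 0))
    return int(vote_for_1 > vote_for_0)

# Tiny smoke test on random placeholder data
rng = np.random.default_rng(2)
X, y = rng.standard_normal((40, 200)), rng.integers(0, 2, 40)
print(knn_predict(rng.standard_normal(200), X, y, k=5))
```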

[Chart: Offline Performance Comparison — classification accuracy (0 to 1) per data set for AdaBoost VP, AdaBoost DS, PLR, and KNN]

[Diagram: Emotiv Data Stream — data sources (EmoEngine or a mock data source) feed a poll/publish loop, which delivers data to listeners such as the classifier, the raw data logger, and the connection listener]

The data stream module formats the data recorded by Emotiv and delivers it to various components of the application in real time via a publish/subscribe system
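A publish/subscribe delivery loop of the kind described above can be sketched in a few lines of Python. The actual Experimenter module's API is not shown on the poster; the class and method names below are illustrative assumptions.

```python
from typing import Callable, Dict, List

class DataStream:
    """Minimal publish/subscribe sketch: components register, packets fan out."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[Dict], None]] = []

    def subscribe(self, listener: Callable[[Dict], None]) -> None:
        """Register a component (e.g., classifier, logger) to receive data packets."""
        self._listeners.append(listener)

    def publish(self, packet: Dict) -> None:
        """Deliver one formatted data packet to every subscriber."""
        for listener in self._listeners:
            listener(packet)

stream = DataStream()
stream.subscribe(lambda pkt: print("logger got packet at t =", pkt["time"]))
stream.subscribe(lambda pkt: print("classifier got", len(pkt["samples"]), "samples"))

# In the real application a poll loop would read from the headset and publish;
# here we publish one fake packet.
stream.publish({"time": 0.0, "samples": [0.0] * 14})
```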

User Interface

• The Experimenter user interface allows for extensive configuration of experiment parameters, stimuli, and classification routines.

• All settings can be saved and reloaded later
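Saving and reloading settings can be illustrated with a simple serialization sketch; the file format and the setting names below are hypothetical, not Experimenter's actual configuration schema.

```python
import json
from pathlib import Path

# Hypothetical experiment settings (names are illustrative only).
settings = {
    "stimulus_duration_ms": 1000,
    "trials_per_block": 20,
    "classifier": "PLR",
}

path = Path("experiment_settings.json")
path.write_text(json.dumps(settings, indent=2))   # save for later
reloaded = json.loads(path.read_text())           # reload a saved configuration
assert reloaded == settings
```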

Presentation

The presentation module is responsible for displaying the instructions, tasks, and stimuli for an experiment. It uses a slideshow-like system which allows the application to respond to both timer events and user input
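The timer-or-input behavior described above can be sketched as a simple event loop. This Python sketch is illustrative only and does not reflect the actual presentation module's implementation; the slide names and timings are placeholders.

```python
import queue
import threading
import time

# Event queue standing in for the application's user-input events.
events: "queue.Queue[str]" = queue.Queue()

def fake_user_input() -> None:
    time.sleep(0.5)
    events.put("keypress")   # in the real application this would come from the UI

threading.Thread(target=fake_user_input, daemon=True).start()

slides = ["Instructions", "Attention task", "Fixation"]
for slide in slides:
    print("Showing:", slide)
    try:
        event = events.get(timeout=2.0)   # advance early if user input arrives
        print("  advanced by", event)
    except queue.Empty:
        print("  advanced by timer")      # otherwise advance when the timer expires
```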

The Experimenter Application

[Diagram: the four Experimenter modules — Emotiv Data Stream, Presentation, User Interface (Experiment Configuration), and Data Analysis]

A comparison of the performance of 4 learning algorithms (above) on data from various faces vs. scenes experiments (each column represents a separate experiment) shows that PLR is consistently the most successful. However, it is also by far the slowest algorithm to train. These results were obtained offline using cross-validation to control for over-fitting
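The cross-validation procedure mentioned above can be sketched as follows. The estimator, fold count, and data shapes here are placeholders, not the ones used for the poster's results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical features/labels standing in for one data set (one column of the chart).
rng = np.random.default_rng(3)
X = rng.standard_normal((60, 200))
y = rng.integers(0, 2, size=60)

# K-fold cross-validation: train on k-1 folds, test on the held-out fold, and
# average the accuracies, so the reported number controls for over-fitting.
clf = LogisticRegression(penalty="l2", max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print("mean accuracy:", scores.mean())
```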

EEG Pattern Classification

• The CLASSES are the two conditions under which EEG data was collected
• A LEARNING ALGORITHM maps features to classes
• The input FEATURES are derived from the raw EEG data collected during each trial (a minimal feature-extraction sketch follows)
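To make the FEATURES → CLASSES mapping concrete, here is a minimal feature-extraction sketch. Binning and averaging voltages is one common choice of features and is an assumption here, not the poster's stated method.

```python
import numpy as np

def extract_features(trial_eeg: np.ndarray, bin_size: int = 8) -> np.ndarray:
    """Turn one trial's raw EEG (channels x samples) into a flat feature vector.

    Features here are per-channel voltages averaged over small time bins and
    concatenated across channels (an illustrative choice).
    """
    n_channels, n_samples = trial_eeg.shape
    n_bins = n_samples // bin_size
    binned = trial_eeg[:, : n_bins * bin_size].reshape(n_channels, n_bins, bin_size)
    return binned.mean(axis=2).ravel()   # shape: (n_channels * n_bins,)

# Example: a 2-second, 14-channel trial at 128 samples/second -> 14 * 32 = 448 features
features = extract_features(np.random.randn(14, 256))
print(features.shape)
```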

Experimenter comprises 4 modules (see the diagram above). The black arrows show the flow of data between modules