
A Study on Ocular and Facial Muscle Artifacts in EEG Signals for BCI Applications

Carmina E. Reyes, Janine Lizbeth C. Rugayan, Carl Jason G. Rullan, Carlos M. Oppus
ECCE Department, Ateneo de Manila University, Quezon City, Philippines
[email protected]

Gregory L. Tangonan
Ateneo Innovation Center, Ateneo de Manila University, Quezon City, Philippines

Abstract—This work analyzes the EEG signals produced by facial gestures and eye movements, known as artifacts. Although these signals are considered contaminants in EEG recordings used for medical diagnosis, they are observed here to explore the possibility of using them as inputs for certain applications. As such, the project aims to identify distinct signal patterns in the EEG signals acquired for certain facial gestures, as preliminary work toward facial gesture detection. Using the Emotiv Epoc Neuroheadset, the cross correlations between pairs of its 14 channels and the frequency responses are compared for six facial gestures: blink, left wink, right wink, raise brow, smile, and clench. Particular channel pairs are found to be highly correlated for certain facial gestures and can serve as possible means of detecting these gestures. In the frequency domain, only the gestures smile and clench registered a distinctive frequency response among the gestures. Moreover, the Emotiv Epoc neuroheadset paired with the Arduino Duemilanove board was found to be an effective controller for household appliances, and the neuroheadset was useful in developing an extended communication platform. As such, it not only proves to be a viable device for developing systems that aid the physically challenged but also provides a glimpse of the potential advances in the field of Brain-Computer Interfaces.

Keywords-EEG signals, Neuroheadset, facial artifacts, BCI

I. INTRODUCTION This study aims to observe and analyze raw EEG signals acquired from a neuroheadset in order to identify significant relationships between these signals and distinct facial gestures, namely blink, wink right, wink left, smile, clench, and raise brow. Additional conditions were also observed, namely swallowing, sticking one's tongue out, and pinching both of the subject's cheeks. The result of this study is geared toward brain-computer interfacing, using the detection of the aforementioned facial gestures as input for a home control system and a communicator for disabled patients.

Physically challenged individuals need extra effort to perform daily chores or travel from one place to another; using a wheelchair, for example, requires sufficient arm muscle strength. This study has the potential to help them, and even senior citizens, perform everyday tasks without exerting too much effort.

The EEG signals were analyzed through two methods. First, the linear correlation coefficients of each channel with all the other channels were computed for every facial gesture. Second, the signals were observed in the frequency domain using Fourier analysis. As stated above, the facial gestures and additional conditions observed were limited to blink, wink right, wink left, smile, clench, raise brow, swallowing, sticking the tongue out, and pinching the cheeks.

The Arduino Duemilanove board paired with the neuroheadset was used to develop a controller for three household appliances. These appliances were turned on or off by driving solid state relays high or low depending on the input from the neuroheadset. An application was also developed that allows persons with limited communication skills to communicate with the people around them by associating facial gestures with audio files played through a computer.

II. THEORETICAL BACKGROUND EEG measures the current flows during synaptic excitation of the dendrites of many pyramidal neurons in the cerebral cortex; these current flows are produced when neurons are activated. For the electrical activity to be recordable, it must be generated by large populations of active neurons. The currents penetrate the skull and skin and are picked up by the electrodes and amplified.[1]

Signals generated from eye movements and facial muscle movements are considered to be noise when considering pure EEG signals alone. They are also called artifacts, and are regarded as contaminants to EEG data especially when these data are used for clinical diagnosis applications.

There are two kinds of artifacts observed in this study. First, the electrooculogram (EOG) is a signal generated by the electric field around the eye; it has two dimensions, vertical and horizontal. Second, electromyogram (EMG) signals are large-amplitude electrical signals generated by facial muscle movements.[2]

For analysis, this study used the Discrete Fourier Transform to observe the signals in the frequency domain and the Pearson Product-Moment Correlation Coefficient to quantify the correlation between signals.

Discrete Fourier transforms are extremely useful because they reveal periodicity in input data as well as the relative strengths of any periodic components. The discrete Fourier transform of a real sequence of numbers is a sequence of complex numbers of the same length.[3]


Equation 1. Discrete Fourier Transform:
X[k] = sum_{n=0}^{N-1} x[n] e^(-j 2π k n / N),  k = 0, 1, ..., N-1
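As a rough illustration, Equation 1 can be evaluated directly in a few lines of C++. This is a direct O(N²) sketch for short EEG windows, not the FFT a production implementation would use.

```cpp
#include <cmath>
#include <complex>
#include <vector>

// Direct evaluation of the DFT in Equation 1:
// X[k] = sum_{n=0}^{N-1} x[n] * e^(-j*2*pi*k*n/N).
// O(N^2), adequate for short EEG windows; an FFT would be used in practice.
std::vector<std::complex<double>> dft(const std::vector<double>& x) {
    const std::size_t N = x.size();
    const double PI = std::acos(-1.0);
    std::vector<std::complex<double>> X(N);
    for (std::size_t k = 0; k < N; ++k) {
        std::complex<double> sum(0.0, 0.0);
        for (std::size_t n = 0; n < N; ++n) {
            const double angle = -2.0 * PI * double(k) * double(n) / double(N);
            sum += x[n] * std::complex<double>(std::cos(angle), std::sin(angle));
        }
        X[k] = sum;
    }
    return X;
}
```

The magnitude |X[k]| at each bin gives the strength of the corresponding periodic component, which is what the frequency-response plots later in the paper examine.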

The correlation coefficient, also known as the Pearson Product-Moment Correlation Coefficient, is a value that characterizes the degree and direction of the linear relationship between two variables.[4]

Its values range from -1.0 to +1.0, where the sign dictates the interpretation of the resulting value. A positive correlation coefficient means that the two variables vary in the same direction. A negative correlation coefficient means the two variables have an inverse relationship: as one increases the other decreases, or vice versa. A value of zero is interpreted as the absence of a linear relationship between the two variables.[5]

Equation 2. Correlation coefficient:
r = sum((x_i - x̄)(y_i - ȳ)) / sqrt( sum((x_i - x̄)²) · sum((y_i - ȳ)²) )

As seen in the equation above, the linear correlation coefficient is denoted by the variable r, which characterizes the correlation between the variables x and y. The summations and the mean values x̄ and ȳ depend on the sample size chosen for interpretation.
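A minimal C++ sketch of Equation 2 (the same computation that Excel's CORREL function performs):

```cpp
#include <cmath>
#include <vector>

// Pearson product-moment correlation coefficient (Equation 2):
// r = sum((x_i - xbar)(y_i - ybar))
//     / sqrt(sum((x_i - xbar)^2) * sum((y_i - ybar)^2)).
// Assumes x and y are non-empty and of equal length.
double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    const double n = double(x.size());
    double xbar = 0.0, ybar = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) { xbar += x[i]; ybar += y[i]; }
    xbar /= n; ybar /= n;
    double sxy = 0.0, sxx = 0.0, syy = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sxy += (x[i] - xbar) * (y[i] - ybar);
        sxx += (x[i] - xbar) * (x[i] - xbar);
        syy += (y[i] - ybar) * (y[i] - ybar);
    }
    return sxy / std::sqrt(sxx * syy);
}
```

Two perfectly linearly related channels give r = 1, and two channels moving in exact opposition give r = -1.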

III. METHODOLOGY

Hardware: This study used the Emotiv Epoc neuroheadset, a non-invasive, high-resolution, wireless neuro-signal acquisition and processing headset containing 14 saline sensors. The output data have a DC offset of about 4000 µV. The headset connects to a computer via Bluetooth using a USB dongle.

Figure 1. The Emotiv Epoc Neuroheadset [6]

The Emotiv Epoc electrode placement is derived from the 10-20 electrode placement standard. The headset has 16 electrodes in total, of which CMS (Common Mode Sense) and DRL (Driven Right Leg) are reference electrodes.

We also used the Arduino Duemilanove board, which has an ATmega328 microcontroller. The board houses 6 analog inputs and 14 digital pins that can be configured as either input or output pins. It connects easily to a computer over USB, and can be powered either through the USB connection or through the power jack using an external supply; the recommended range for the external supply is 7 to 12 volts.[7]

Figure 2. Electrode Placement System of Emotiv Epoc

A Solid State Relay (SSR) is an ON-OFF control device in which a small control signal switches a larger load current or voltage, making it ideal for demonstrating actual control of appliances. The SSR receives the small control signal from the Epoc headset through the Arduino, which then handles the switching for the devices plugged into it.
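The switching logic can be sketched as follows. The gesture codes and the relay-to-appliance assignment are assumptions for illustration; on the Arduino, each toggle would additionally call digitalWrite on the pin driving that SSR.

```cpp
#include <array>

// Gesture codes as received from the host. The numeric values are an
// assumption for illustration, not the actual protocol used in the study.
enum Gesture { CLENCH = 0, SMILE = 1, RAISE_BROW = 2 };

// One state flag per solid state relay (e.g. lamp, fan, monitor).
std::array<bool, 3> relayState = {false, false, false};

// Toggle the relay assigned to the detected gesture. On the Arduino the
// body would also call digitalWrite(relayPin[g], relayState[g] ? HIGH : LOW).
void onGesture(Gesture g) {
    relayState[g] = !relayState[g];
}
```

Each gesture is bound to exactly one appliance, so a repeated gesture toggles the same relay on and off.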

Software: Since the raw EEG data acquired from the application developed using the API is still unprocessed by the engine, it contains a DC offset of around 4000 points. It was suggested by Emotiv that users can either implement a running high-pass filter or subtract a running average off the data to be able to get the actual measured level.[8]

For this study, a running average of the raw EEG data was subtracted from itself prior to being analyzed.

Microsoft Excel's CORREL function was used to compute the correlation coefficient between two data sets. This was done repeatedly for all six facial gestures, observing each channel's correlation with all of the remaining 13 channels. The syntax of the function is CORREL(array1, array2), where array1 and array2 are the sample data sets of the two EEG channels being compared.

The application program was developed using Microsoft Visual C++ Express and the API included with the neuroheadset. It is used to store the raw EEG data, control appliances through the Arduino, and provide the patient communication interface.

EEG signals of the six facial gestures were recorded from 12 individuals (6 of each gender) using the application program. The subjects were asked to perform blink, right eye wink, left eye wink, smile, clench, and raise brow, one facial gesture at a time, and each recording was stored in a separate .csv file. Sticking the tongue out, pinching the cheeks, and the feeling of pain were also recorded, but this paper focuses on blink, smile, swallow, and sticking the tongue out.

IV. RESULTS AND DISCUSSION

For every facial gesture, a matrix of correlation coefficient values was constructed. This was done repeatedly for all 12 subjects. To analyze the general trend of these values in each case, the average of all the subjects' correlation coefficient matrices was computed. Moreover, the graphs presented in this chapter were subjected to a correlation coefficient threshold of 0.5 to better visualize the significant signatures observed for every facial gesture. In an attempt to identify a signature brain signal pattern for every facial gesture, the three channel pairs with the highest correlation values were picked out.
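The selection step described above can be sketched as follows, assuming the averaged 14×14 matrix and the Emotiv channel names are available; the 0.5 threshold and the pairing logic come from the text, while the function name and interfaces are illustrative.

```cpp
#include <algorithm>
#include <string>
#include <tuple>
#include <vector>

// From an averaged correlation matrix, keep the upper-triangle channel
// pairs whose coefficient reaches the threshold (0.5 in the paper),
// sorted strongest first. The top entries are the candidate signature
// pairs for a gesture.
std::vector<std::tuple<std::string, std::string, double>>
topPairs(const std::vector<std::vector<double>>& r,
         const std::vector<std::string>& names,
         double threshold = 0.5) {
    std::vector<std::tuple<std::string, std::string, double>> pairs;
    for (std::size_t i = 0; i < r.size(); ++i)
        for (std::size_t j = i + 1; j < r[i].size(); ++j)
            if (r[i][j] >= threshold)
                pairs.emplace_back(names[i], names[j], r[i][j]);
    std::sort(pairs.begin(), pairs.end(),
              [](const auto& a, const auto& b) {
                  return std::get<2>(a) > std::get<2>(b);
              });
    return pairs;
}
```

Taking the first three entries of the returned list reproduces the "top 3 channel pairs" reported per gesture.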

Following the computation of the correlations, the EEG signals were also observed in the frequency domain using Fourier analysis.

Blink. The raw EEG signal for the facial gesture blink, as performed by Subject 01, is shown in Figure 3.

In Figure 4, the chart of blink correlation coefficient values, it is evident that specific channels demonstrated relatively higher values than the others.

The pair with the highest correlation was channels F4 and F3, with a value of 0.8579. Next was F8 and AF3, with a correlation coefficient of 0.8242. Lastly, a correlation coefficient of 0.8153 was computed for channels AF4 and AF3. Symmetry can be observed between the computed values, that is, between the F4-F3 pair and the AF4-AF3 pair, whose signals came from electrodes placed on the forehead, nearest to the eyes among all the electrodes.


Figure 3. Blink EEG Signal of Subject 01

Figure 4. Blink Correlation Coefficient Values


Figure 5. Correlated Blink EEG Signals of Subject 01

It can be seen from the graph above that channels AF3-AF4, F3-F4, and AF3-F8 indeed demonstrated the same trend. Using the findings from the average correlation coefficient values for this gesture, the channels showing the same change in amplitude over Subject 01's sample data set were successfully extracted.

The frequency response for blink does not show significantly consistent behavior across all sensors among the 12 subjects, but in more than three subjects the F8 sensor has the highest magnitude, or is among the highest, at frequencies below 5 Hz.

Figure 6. Frequency Response of Blink of Test Subject 08

Figure 7. Frequency Response of Blink of Test Subject 2

In other instances, the sensors AF3 and AF4 have higher magnitudes than the rest at the lower frequencies.

The data gathered for the facial gesture smile resulted in generally low correlation coefficient values (see Figure 8). The highest was from AF4 and AF3, with a value of 0.6587. Similar to the three eye movements previously discussed, O1 and O2 maintained the same behavior, resulting in a value of 0.6398. The next highest value was 0.5786, between channels F3 and AF3.


Figure 8. Smile EEG Signals for Subject 01

Figure 9. Smile Correlation Coefficient Values


Figure 10. Correlated Smile EEG Signals for Subject 01

As shown above, the signals left after removing those with relatively low correlation values did not show an obvious, discernible difference from the rest of the signals (see Figure 8 and Figure 10).

Moreover, this gesture shows significant magnitude in the range of 10 Hz to 50 Hz, compared to the previous gestures, whose magnitudes in this frequency range are negligible. In most of the frequency response graphs, the signal from the sensor at T8 is higher than the rest of the sensors between 10 Hz and 50 Hz. For test subject 14 in particular, T7 is significantly higher in that frequency range, although T8 also remains high.

Figure 11. Frequency Response of Smile of Test Subject 09

Figure 12. Frequency Response of Smile of Test Subject 12

Figure 13. Smile Frequency Response of Channel T8 of Test Subject 09

Figure 14. Smile Frequency Response of Channel T7 of Test Subject 12


Figure 15. Swallow EEG Signals


The subject was asked to swallow once and the EEG signals were recorded. The graph below shows that the highest correlation came from FC6 and FC5, with a value of 0.9543, followed by F8 and F7 with a correlation coefficient of 0.8819. Lastly, P8 and O2 also exhibited a high correlation, with a value of 0.8351.

Symmetry can again be observed between the two highest values: the FC5-FC6 and F7-F8 pairs came from electrodes placed over the frontal lobe of the brain.

Figure 16. Swallow Correlation Coefficient Values


Figure 17. Correlated Swallow EEG Signals

The graph above shows the channel pairs with the highest correlation coefficient values. The peak occurs at F7, with a registered value of 338.925 µV.

A. Sticking Tongue Out

The subject was asked to stick his tongue out three times. Symmetry was again evident in the computed correlation values. The highest value, 0.9675, was from the channel pair T8 and T7, followed by FC6 and FC5 with a value of 0.8603 and F4 and F3 with a value of 0.8105. Close to this pair was AF4 and AF3, with a computed value of 0.8095.

The channels that gave the highest values came from electrodes placed over the temporal lobe. The graph of the raw EEG signals that exhibited high correlation, shown below, demonstrates that these channel pairs did in fact follow similar trends. Furthermore, in this particular sample data, the average peak value for T8 was about 244.86 µV and for T7 about 188.96 µV.


Figure 18. Sticking Tongue Out EEG Signals

Figure 19. Sticking Tongue Out Correlation Coefficient Values

Figure 20. Correlated Sticking Tongue Out EEG Signals

Comparing the graph above with the one at the start of this section, it can be seen that the signals identified as highly correlated were the ones exhibiting a pattern distinguishable from the other signals.

B. Home Control Application

Application-wise, a lamp shade, an electric fan, and an LCD monitor were successfully controlled using the neuroheadset via the gestures clench, smile, and raise brow. The program was able to connect to the Arduino, which in turn output a low or a high to the SSRs depending on the detected facial gesture. Detection of the facial gestures relied on the neuroheadset's API. Transmission of the signal from the headset to the computer had some delay; adding the detection delay and the communication delay to the Arduino, the overall delay of the whole setup can be as long as one second. Still, the resulting duration does not hamper the intended application, where a one-second delay is not critical.

C. Patient Communicator

On the other hand, the Patient Communicator application sends a message using the same gestures used for appliance control. When the program detects the facial gesture clench, it plays the help.wav file, which contains a male voice shouting "help" twice. The message for smile is a wav file in which the same male voice says "thank you," while a raised brow plays a wav file saying that the user is hungry.
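The gesture-to-audio mapping can be sketched as a simple lookup. Only help.wav is named in the text; the other two file names are assumptions for illustration, and the playback call itself (e.g. Win32 PlaySound) is platform specific and omitted here.

```cpp
#include <map>
#include <string>

// Patient Communicator lookup: each detected gesture triggers one audio
// file. Only "help.wav" is named in the paper; the other file names are
// assumed for illustration.
std::string audioFor(const std::string& gesture) {
    static const std::map<std::string, std::string> table = {
        {"clench",     "help.wav"},      // male voice shouting "help" twice
        {"smile",      "thankyou.wav"},  // assumed name: "thank you" message
        {"raise_brow", "hungry.wav"},    // assumed name: "I am hungry" message
    };
    auto it = table.find(gesture);
    return it == table.end() ? std::string() : it->second;
}
```

An unrecognized gesture returns an empty string, so the application can simply skip playback instead of failing.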

Both applications are intended for people with limited communication and motor skills and can be a helpful aid for patients with such conditions. As such, they can be deployed in hospitals or ordinary households to improve care for these patients.

V. CONCLUSION AND RECOMMENDATIONS From the data acquired from the Emotiv Epoc neuroheadset, specific channels were found to be highly correlated for the gestures and conditions chosen for observation. Although this study has been able to identify specific channels that can serve as identifiers for the execution of the discussed gestures and conditions, it does not provide a solid basis or standard for fully classifying these EEG signals, since only a limited number of conditions were observed. Furthermore, it should also be noted that certain channels, such as O1 and O2, were found not entirely suitable for classifying EEG signals, since they demonstrated a high average correlation for most of the gestures observed. For the Fourier analysis, only the gestures smile and clench showed significant values, between the frequencies of 10 Hz and 50 Hz.

On the other hand, the neuroheadset was found to be an effective tool for developing systems geared toward assisting persons with disabilities, through household appliance control and an extended communication platform.

Correlation and frequency-response analysis of the 14 sensors cannot conclusively identify facial gesture signatures. It is recommended to gather more data and to go further by computing a three-segment cross correlation. Future work should also examine pain, happiness, and other feelings.

For the program, added delays reduced sudden changes in the detection of facial gestures (rapid on and off switching), which is especially important when the detection is used for control.

ACKNOWLEDGMENT This research was supported by the Engineering Research and Development for Technology, Department of Science and Technology - Science Education Institute of the Philippines (ERDT-DOST-SEI). The researchers are grateful to Maria Mercedes T. Rodrigo for introducing them to the neuroheadset technology.

REFERENCES
[1] M. Teplan, "Fundamentals of EEG Measurement," Measurement Science Review, vol. 2, sec. 2, 2002.
[2] "Real time ocular and facial muscle artifacts removal from EEG signals using LMS adaptive algorithm," IEEE Xplore Digital Library. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4658583&tag=1.
[3] E. W. Weisstein, "Discrete Fourier Transform," MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/DiscreteFourierTransform.html (accessed February 7, 2011).
[4] Correlation. http://luna.cas.usf.edu/~mbrannic/files/regression/corr1.html (accessed February 9, 2011).
[5] D. Stockburger, Correlation. http://www.psychstat.missouristate.edu/introbook/sbk17.htm (accessed February 9, 2011).
[6] Emotiv. http://www.emotiv.com/index.php (accessed February 8, 2011).
[7] Arduino Duemilanove. http://www.arduino.cc/en/Main/ArduinoBoardDuemilanove (accessed February 11, 2011).
[8] Research SDK EEG Output. http://www.emotivsystems.com/forum/forum15/topic839/ (accessed February 11, 2011).