
Electroencephalogram Profiles for Emotion Identification over the Brain Regions Using Spectral, Entropy and Temporal Biomarkers

by Noor Kamal Al-Qazzaz 1,2,*, Mohannad K. Sabir 1, Sawal Hamid Bin Mohd Ali 2, Siti Anom Ahmad 3,4 and Karl Grammer 5

1 Department of Biomedical Engineering, Al-Khwarizmi College of Engineering, University of Baghdad, Baghdad 47146, Iraq
2 Department of Electrical, Electronic & Systems Engineering, Faculty of Engineering & Built Environment, Universiti Kebangsaan Malaysia, UKM Bangi, Selangor 43600, Malaysia
3 Department of Electrical and Electronic Engineering, Faculty of Engineering, Universiti Putra Malaysia, UPM Serdang, Selangor 43400, Malaysia
4 Malaysian Research Institute of Ageing (MyAgeing), Universiti Putra Malaysia, Serdang, Selangor 43400, Malaysia
5 Department of Evolutionary Anthropology, University of Vienna, Althanstrasse 14, A-1090 Vienna, Austria
* Author to whom correspondence should be addressed.
Sensors 2020, 20(1), 59; https://doi.org/10.3390/s20010059
Submission received: 24 October 2019 / Revised: 28 November 2019 / Accepted: 3 December 2019 / Published: 20 December 2019
(This article belongs to the Special Issue Biomedical Signal Processing)

Abstract:
Identifying emotions has become essential for comprehending varied human behavior in daily life. The electroencephalogram (EEG) has been adopted for eliciting information in terms of waveform distribution over the scalp. The rationale behind this work is twofold. First, it proposes spectral, entropy and temporal biomarkers for emotion identification. Second, it integrates these biomarkers to develop spectro-spatial (SS), entropy-spatial (ES) and temporo-spatial (TS) emotional profiles over the brain regions. The EEGs of 40 healthy volunteer students from the University of Vienna were recorded while they viewed seven brief emotional video clips. Features were computed using spectral analysis, an entropy method and a temporal feature. Three stages of two-way analysis of variance (ANOVA) were undertaken to identify the emotional biomarkers, and Pearson's correlations were employed to determine the optimal explanatory profiles for emotion detection. The results show that the combination of the applied spectral, entropy and temporal feature sets may provide reliable biomarkers for identifying SS, ES and TS profiles relating to different emotional states over the brain areas. EEG biomarkers and profiles enable more comprehensive insights into various human behavior effects as an intervention on the brain.

Graphical Abstract

1. Introduction

Within the brain, impetus inclinations, behavioral reactions, physiological stimulation, states of mind and cognitive procedures are all directly conveyed through emotion. Brain activity and neural pathways are interrelated in a manner that influences mathematical, verbal, perceptive and other forms of intelligence, which further shape emotions [1,2]. From a particular response of the body to an instinctive reaction, individual emotional reactions can vary [3]. Accordingly, the possible extent of congruence between socio-affective circumstances and particular brain areas has been investigated through applying an array of simulation methods in a substantial number of studies [4,5].
The dimensional and discrete models are the two affective models that may be adopted to determine and categorize affective states from a psychological perspective. The discrete model identifies key emotions to which specific, distinct affective states are connected; fundamental emotions such as happiness, sadness, surprise, disgust, fear and anger, individually or in some mixture, are deemed responsible for all further emotions [6,7]. Adopting a circumplex emotion model, the dimensional model has been pervasively used for affective identification application mapping as a two-dimensional (2D) cognitive-emotional state theory [8,9]. A two-scale valence-arousal graph is used for conveying emotions, with emotional strength from calm to excited presented on the vertical axis as 'arousal', while the unpleasant-to-pleasant degree of emotion is conveyed on the horizontal axis as 'valence' [8,9,10]. Furthermore, quartiles Q1 to Q4 are the four principal sites of emotional states, with low arousal–high valence (LAHV), low arousal–low valence (LALV), high arousal–low valence (HALV) and high arousal–high valence (HAHV) presented in Q4, Q3, Q2 and Q1, respectively [11]. Beyond the 2D model, focus-disinterest characteristics have been incorporated into a three-dimensional (3D) cognitive-emotional state model in certain studies [12].
Various emotional states have been produced through varied techniques across studies; audio-visual, auditory or visual stimuli have all been adopted in different instances. In brain-computer interfacing (BCI) research, auditory and visual stimuli are used to diminish or expand the sensorimotor rhythm amplitude [13]. Additionally, brain signals convey picture-based visual stimuli more straightforwardly than music-based audio stimuli [14]. The combined impact of auditory and visual stimuli has been acknowledged in studies as the most effective means of provoking a particular affective state, thus establishing the optimal context for affective identification. Audio-visual stimuli have also been applied for automatic emotion identification [4]. Compared with alternative stimulation methods, audio-visual elicitation of emotional states has been found to be superior and is more pervasively adopted [13,15,16]. Therefore, brief audio-visual film excerpts were adopted to elicit emotion in this research.
Emotional changes can be elicited using different physiological signals such as galvanic skin response (GSR) [15], electrodermal activity (EDA) [17], blood volume pressure (BVP) [18], skin temperature (ST) [19], evoked potentials (EP) [20], electrocardiogram (ECG) [21], electromyogram (EMG) [22], and electroencephalogram (EEG) [23,24,25,26,27,28,29,30]. Clinically, EEG signals have been widely used as useful indicators of different mental states and conditions such as epilepsy, Alzheimer's disease (AD) and vascular dementia (VaD) [31,32,33,34,35].
As the brain is a complex structure with dynamic behavior, its electrical activities, including emotional states, can be reflected using EEG. EEG is a neurophysiological tool used to monitor and identify brain changes [36]. It is a widely available, cost-effective, and non-invasive tool that tracks information processing with millisecond precision and high temporal resolution [36,37]. A typical clinical EEG frequency ranges from 0.01 Hz to approximately 70 Hz [38]; the corresponding waveforms have an amplitude of a few µV to approximately 100 µV [39]. EEG background waveforms also convey valuable information and can be classified into five specific frequency power bands: the delta band (δ), the theta band (θ), the alpha band (α), the beta band (β), and the gamma band (γ) [40,41]. Studies on EEG signal processing have been conducted to identify the brain activity patterns involved in cognitive science, neuropsychological research, clinical assessments, and consciousness research [42,43,44,45,46,47]. Recently, EEG has been widely used to assess and evaluate human emotional states with excellent time resolution [3,15,28,29,30,48]. EEG can provide useful information on emotional states and has been described as a potential biomarker to evaluate different emotional responses from multi-channel EEG datasets over the brain regions [38]. A key advantage of multi-channel EEG signal processing is the ability to interpret EEG changes during different emotional states over the brain regions. Thus, numerous studies, including this one, have been performed to deal with this issue [49,50,51]. For instance, Nattapong et al. proposed a continuous music-emotion recognition approach based on brain wave signals [52]. Olga et al. applied a combined music therapy process with a real-time EEG-based human emotion recognition algorithm to identify the current emotional state based on neurofeedback and adjust the music therapy to the patient's needs [53].
As brain neurons are governed by both linear and non-linear phenomena, several linear techniques, including traditional spectral powers, have been used to analyze the smoothness of the EEG as a time series and to elicit emotional information from EEG signals. Higher-order statistics (HOS) features, namely skewness and kurtosis, have also been applied to measure the presence of transients in the signal [52,54,55,56]. Owing to the brain's capacity for sophisticated emotional tasks and the complex dynamic information reflected from the brain cortex, several researchers have used non-linear methods for automatic detection of emotions through EEG signals [57]. Previous emotion studies have used a small number of features, mostly relative powers [3], Hurst exponent [15], Hjorth parameters [58], fractal dimension (FD) [59], and statistical features [60,61]. Moreover, entropy is among the most prevalent methods to evaluate the presence or absence of long-range dependence in physiological signal analysis, including approximate entropy (ApEn), sample entropy (SampEn) and permutation entropy (PerEn), which are relatively robust to noise and powerful enough to quantify the complexity of a time series [62]. Amplitude-aware permutation entropy (AAPE) has demonstrated efficiency in discriminating between calmness and distress [63,64,65]. Fuzzy entropy (FuzEn) was proposed for EEG analysis in [66,67]. SampEn is slightly faster than FuzEn; however, the latter is more consistent and less dependent on the data length [68]. Azami et al. have considered the advantages of FuzEn over SampEn and recently introduced refined composite multiscale FuzEn (RCMFE) [68,69,70]. In RCMFE, the entropy stability is improved, sensitivity to signal length is reduced, and the coarse-graining process smoothens the signals. Hence, RCMFE has been used in this study.
The EEG signal contains useful information on the physiological states of the brain and has proven to be a potential biomarker for characterizing both its linear and non-linear behavior [31,32,71,72,73]. Therefore, the motivation of this work is twofold. First, to investigate complementary information from multi-channel emotional EEG datasets, conventional linear spectral analysis, a non-linear entropy method and a temporal feature were applied to obtain potential EEG emotional biomarkers. Second, the obtained biomarkers may be further considered to provide additional information illustrating the EEG spectro-spatial (SS), entropy-spatial (ES) and temporo-spatial (TS) profiles for the seven emotional states over the brain regions.
The preprocessing stage was used to limit unwanted frequencies in the EEG dataset. Spectral biomarkers were computed by employing the absolute powers (AbsP) of the δ, θ, α, β, and γ bands. Moreover, to quantify the complexity of brain functions, entropy biomarkers across the multi-channel EEG signals were measured using RCMFE. Furthermore, the temporal biomarker was derived from the amplitude envelope, which was extracted using the Hilbert transform; the amplitude values were then characterized by skewness (Skw) to obtain H_Skw. Three stages of two-way analysis of variance (ANOVA) were conducted to obtain the spectral, entropy and temporal biomarkers, followed by Pearson's correlations to obtain the spatial profiles related to the anger, anxiety, disgust, happiness, sadness, surprise and neutral emotional states over the brain regions. The valence-arousal circumplex model was employed in this study to represent and recognize human emotions owing to its effectiveness in describing emotions elicited by audio-visual video clips [74]. To the authors' best knowledge, this study makes two contributions: firstly, it is the first to use a combination of these features to develop spectral, entropy and temporal biomarkers towards SS, ES and TS profile identification for the seven emotional states over the brain regions; secondly, the EEG elicitation protocol and EEG measurement procedure have never been used before for emotion data acquisition.

2. Materials and Methods

This study focuses on the potential EEG emotional biomarkers and profiles obtained from the EEG datasets. Figure 1 illustrates the block diagram of the proposed study.

2.1. EEG Acquisition and Recording

A transportable Emotiv EPOC 14-channel EEG headset (Emotiv Systems, Inc., San Francisco, CA, USA) was adopted to record from 14 EEG electrodes (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4), together with 2 reference electrodes: the driven right leg (DRL) electrode on the right mastoid and the common mode sense (CMS) electrode on the left mastoid. The Emotiv EPOC uses sponge-based electrodes positioned according to the 10–20 system. The electrode signals were filtered through a 0.5–70 Hz band-pass filter. A 128 Hz sampling frequency was used with a resolution of 0.51 µV.
Forty university students agreed to participate in this research (Table 1). A subject appraisal was carried out for each individual to guarantee that no previous psychiatric or neurological problems had been suffered; each participant then agreed to take part by signing an informed consent document before the study proceeded. Subjects were presented with different brief film excerpts with audio that aimed to be emotionally engaging, after which they filled in a self-assessment questionnaire (SAQ) to assess and score their emotional reactions to the excerpts. The subsequent video excerpt was presented after a 45 s pause. This process is presented in Figure 2 [75].
Respondents rated their responses in the SAQ according to the level of emotion felt on a five-point scale: 5 = very high, 4 = high, 3 = medium, 2 = low, and 1 = very low [48]. This enabled scoring of the neutral condition and six affective states (anger, anxiety, sadness, disgust, surprise and happiness). Rottenberg's suggestions were followed in order to identify appropriate affective film excerpts [75]; the longest film excerpt had a duration of four minutes, with the others differing in length.
The University of Vienna's virtual emotion presenter (VEP) program was applied to assist with presenting the affective video excerpts to the subjects. The program permits documentation of further information sources and randomized presentation. The research took place in the anthropology research laboratory: the film sound was played to the subjects at a consistent and reasonable volume through a stereo system, the film excerpts were viewed on an LCD display, the laboratory had consistent natural light sources, and the VEP was adopted as explained above. The EEG electrode signals were monitored as the 40 subjects viewed the affective video excerpts.

2.2. Preprocessing Stage

Brain responses and artefacts may intersect because the latter reside within the frequency bands of the EEG waves, so noise eradication is a significant aspect of EEG signal preprocessing. Standard filtering may form part of the preprocessing phase, with further software filters (for example, band-pass and notch filters) applied to the EEG signals to carry out this process. To restrict the EEG signal frequencies in accordance with [76], the band-pass filter adopted a higher cutoff frequency of around 64 Hz and a lower cutoff of 0.5 Hz (3 dB). A 50 Hz cutoff frequency was adopted for the notch filter, the typical purpose being to eliminate A/C power line interference [38]. For further processing of the filtered EEG data, three 10 s trials were taken per video excerpt, with every 10 s trial comprising 1280 data points.
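The preprocessing chain described above (0.5–64 Hz band-pass, 50 Hz notch, 10 s trials of 1280 points at 128 Hz) can be sketched in Python with SciPy; the filter order and notch quality factor here are illustrative assumptions, not values reported in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 128          # Emotiv EPOC sampling rate (Hz)
TRIAL_LEN = 1280  # one 10 s trial at 128 Hz

def preprocess(eeg, fs=FS, low=0.5, high=63.0, mains=50.0):
    """Band-pass 0.5 to ~64 Hz, then notch out 50 Hz mains interference.

    `high` is set just below the Nyquist frequency (fs/2 = 64 Hz); the
    order-4 Butterworth filter and Q=30 notch are assumed values.
    """
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    x = filtfilt(b, a, eeg)                 # zero-phase band-pass
    bn, an = iirnotch(mains, Q=30.0, fs=fs)
    return filtfilt(bn, an, x)              # zero-phase 50 Hz notch

def split_trials(signal, trial_len=TRIAL_LEN):
    """Cut the last 30 s of a filtered clip into three 10 s trials."""
    tail = np.asarray(signal)[-3 * trial_len:]
    return tail.reshape(3, trial_len)
```

A 40 s test signal passed through `preprocess` keeps its length, and `split_trials` yields the three 1280-point segments used for feature extraction.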

2.3. Features Extraction

EEG signals are a significant source of brain-function data that assists in comprehending the associations of various affective states, and the EEG signal offers various quantifiable measures for identifying particular emotional actions. Accordingly, affective EEG biomarkers were derived from a number of features, primarily divided into spectral power, entropy and temporal features, as a means of determining the principal features that enable the EEG data to be matched with affective states, while also allowing improved explication of the altering affective states of the brain areas. Additionally, the spectro-spatial (SS), entropy-spatial (ES) and temporo-spatial (TS) profiles of the brain areas and the seven affective states were determined by combining the quantified biomarkers via Pearson's correlations.

2.3.1. Spectral Biomarker

Spectral assessments, as linear features for appraising affective alterations, have been pervasively adopted by investigating the impact of the various frequency bands of the multi-channel EEG signals on the different brain areas. Multi-channel EEG alterations were quantified via the AbsP feature as aspects of brain rhythms. Welch's method was applied to calculate the power spectral density (PSD) of the EEG data, with the frequency bands gamma (γ: 32 ≤ f ≤ 60 Hz), beta (β: 16 ≤ f ≤ 32 Hz), alpha (α: 8 ≤ f ≤ 16 Hz), theta (θ: 4 ≤ f ≤ 8 Hz), and delta (δ: 0 ≤ f ≤ 4 Hz) distinguished as the particular frequency bands of the EEG signals' PSD [77]. The AbsP indicates a single band's level of EEG activity independent of the activity in the other bands and is determined using Equation (1) [78].
$$AbsP(\%) = 10 \times \log_{10}\left(\frac{P_{\text{selected frequency band}}}{P_{\text{total range }(0.5\text{--}64\ \text{Hz})}}\right) \qquad (1)$$
The last 30 s of every film excerpt were divided into three 10 s parts comprising 1280 data points each, providing 3840 data points overall, from which the AbsP features of the EEG signals were ascertained.
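As a sketch of Equation (1), the band powers can be read off a Welch PSD and expressed in decibels relative to the total 0.5–64 Hz power; the `nperseg` value is an illustrative assumption.

```python
import numpy as np
from scipy.signal import welch

# Band edges as defined in the text (Hz); delta starts at 0.5 Hz after filtering
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 16),
         "beta": (16, 32), "gamma": (32, 60)}

def absolute_powers(trial, fs=128):
    """AbsP(%) = 10*log10(band power / total 0.5-64 Hz power), per Equation (1)."""
    f, psd = welch(trial, fs=fs, nperseg=256)   # 0.5 Hz frequency resolution
    df = f[1] - f[0]
    total = psd[(f >= 0.5) & (f <= 64)].sum() * df
    absp = {}
    for name, (lo, hi) in BANDS.items():
        band = psd[(f >= lo) & (f < hi)].sum() * df
        absp[name] = 10 * np.log10(band / total)
    return absp
```

Applied to a 10 Hz test tone, the alpha band (8–16 Hz) dominates, as expected.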

2.3.2. Entropy Biomarker

Given that the brain may undertake complex mental procedures, the obtained EEG signals were assessed by applying a non-linear entropy method, the RCMFE.
To calculate the RCMFE based on the mean ($RCMFE_\mu$), for $1 \le u \le \tau$ the coarse-grained series $z_u^{(\tau)} = \{ y_{u,1}^{(\tau)}, y_{u,2}^{(\tau)}, \dots \}$ is formed, where

$$y_{u,j}^{(\tau)} = \frac{1}{\tau} \sum_{b = u + \tau(j-1)}^{u + \tau j - 1} x_b$$

For a defined scale factor $\tau$ and embedding dimension $m$, the similarity degrees $\phi_{\tau,k}^{m}$ and $\phi_{\tau,k}^{m+1}$ are calculated separately for each $z_k^{(\tau)}$ $(k = 1, \dots, \tau)$. Next, the averages $\bar{\phi}_\tau^{m}$ and $\bar{\phi}_\tau^{m+1}$ of these values over $1 \le k \le \tau$ are computed. Finally, the RCMFE is computed as in Equation (2):

$$RCMFE(X, \tau, m, n, r) = -\ln\left(\bar{\phi}_\tau^{m+1} \big/ \bar{\phi}_\tau^{m}\right) \qquad (2)$$

The scale factor $\tau$, embedding dimension $m$, RCMFE power $n$, and tolerance $r$ were respectively chosen as $\tau = 1$, $m = 3$, $n = 2$, and $r = 0.1$–$0.2\,SD$, where $SD$ is the standard deviation of the original time series [68].
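A compact implementation along the lines of Equation (2) might look as follows; the vectorized pairwise-distance step and the choice r = 0.15·SD are illustrative assumptions within the 0.1–0.2·SD range given above.

```python
import numpy as np

def _phi(x, m, n, r):
    """Mean fuzzy similarity of all pairs of m-length template vectors."""
    N = len(x)
    # Build templates and remove each template's own mean (fuzzy-entropy baseline)
    X = np.array([x[i:i + m] for i in range(N - m)])
    X = X - X.mean(axis=1, keepdims=True)
    # Chebyshev distance between every pair of templates
    d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)
    sim = np.exp(-(d ** n) / r)            # fuzzy membership degree
    iu = np.triu_indices(len(X), k=1)      # distinct pairs only
    return sim[iu].mean()

def rcmfe(x, tau, m=3, n=2, r=None):
    """Refined composite multiscale fuzzy entropy at scale tau (Equation (2))."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()                 # tolerance from the ORIGINAL series SD
    phis_m, phis_m1 = [], []
    for u in range(tau):                   # tau offset coarse-grained series
        L = (len(x) - u) // tau
        z = x[u:u + L * tau].reshape(L, tau).mean(axis=1)
        phis_m.append(_phi(z, m, n, r))
        phis_m1.append(_phi(z, m + 1, n, r))
    return -np.log(np.mean(phis_m1) / np.mean(phis_m))
```

As a sanity check, white noise yields a larger RCMFE than a smooth periodic tone, reflecting its higher irregularity.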

2.3.3. Temporal Biomarker

In contrast with alternative brain imaging approaches, EEG signals provide greater temporal resolution and capture temporal variability over a particular time period, which constitutes their clinical advantage. Accordingly, precise millisecond readings of electro-physiological alterations may be derived from EEG. Consequently, temporal data analysis enables the formulation of EEG biomarkers.
The Hilbert transform enables the amplitude envelope to be used to define the temporal biomarkers. Once the amplitude envelope had been established, the skewness (Skw) of its distribution was calculated for every EEG channel to obtain H_Skw.
Therefore, to compute the temporal biomarker set, for the EEG signal X_k(n), where k is the channel and n is the time-domain index, the temporal envelope is first extracted using the Hilbert transform H{·} as in Equation (3) [79].
$$e_k(n) = \sqrt{X_k(n)^2 + H\{X_k(n)\}^2} \qquad (3)$$
Let $m_n = E\{(x - E\{x\})^n\}$ be the $n$th central moment of the envelope amplitude distribution. The Skw is defined as in Equation (4):

$$Skw = \frac{m_3}{(m_2)^{3/2}} \qquad (4)$$
Skw is the normalized third-order moment of the amplitude distribution. If the distribution is symmetrical, then Skw is zero; by contrast, large Skw magnitudes are associated with a high degree of asymmetry of the amplitude distribution [80,81,82].
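Equations (3) and (4) amount to taking the magnitude of the analytic signal and then its skewness, both of which SciPy provides directly; this is a minimal sketch of the H_Skw biomarker.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.stats import skew

def h_skw(x):
    """Skewness of the Hilbert amplitude envelope of one EEG channel.

    np.abs(hilbert(x)) equals sqrt(x^2 + H{x}^2), i.e. e_k(n) in Equation (3);
    skew() is the normalized third central moment of Equation (4).
    """
    envelope = np.abs(hilbert(np.asarray(x, dtype=float)))
    return skew(envelope)
```

A signal with occasional high-amplitude bursts produces a right-skewed envelope and hence a larger H_Skw than a smoothly amplitude-modulated tone.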

2.4. Statistical Analysis

The approach taken to mapping the EEG dataset is intended to enhance comprehension of brain states. Statistical analysis was undertaken in SPSS version 25 (IBM Corp., USA). The EEG dataset of the 40 healthy participants was initially categorized by four recording areas relating to the scalp regions over the cerebral cortex. The dimension of the feature matrix was (40 subjects × 14 EEG channels × 7 emotional states) = 3920 attributes for each of the spectral, entropy and temporal biomarkers. Mean features per area were used to characterize the differences between brain regions, which could directly convey the affective biomarkers and the profiles of the different affective states across those regions: occipital (O1 and O2 channels), parietal (P7 and P8 channels), temporal (T7 and T8 channels) and frontal (AF3, F7, F3, FC5, F4, FC6, F8, and AF4 channels). Subsequently, Levene's test for homoscedasticity was applied, and the Kolmogorov–Smirnov test was performed to check the normality assumption required by the ANOVA statistical test. Two methods of statistical analysis were applied: the analysis of variance (ANOVA) test, which established the extent to which brain areas had varying affective states in relation to the spectral, entropy and temporal features, and Pearson's correlation, which established the connectivity characteristics of the brain areas.
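The Levene and Kolmogorov–Smirnov checks described above can be sketched with scipy.stats; z-scoring each group before the KS test against the standard normal is a common recipe and an assumption here, not a detail reported in the paper.

```python
import numpy as np
from scipy import stats

def check_anova_assumptions(groups, alpha=0.05):
    """Levene's test for homoscedasticity plus a per-group KS normality check."""
    _, p_levene = stats.levene(*groups)
    normal_flags = []
    for g in groups:
        g = np.asarray(g, dtype=float)
        z = (g - g.mean()) / g.std(ddof=1)   # standardize before KS vs N(0, 1)
        _, p_ks = stats.kstest(z, "norm")
        normal_flags.append(p_ks > alpha)
    return {"homoscedastic": bool(p_levene > alpha),
            "all_normal": all(bool(f) for f in normal_flags)}
```

Groups with wildly different variances fail the Levene check, and uniformly distributed groups fail the normality check, which is what would flag a feature as unsuitable for ANOVA.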

2.4.1. ANOVA

There were three aspects to the ANOVA test. Firstly, the distinctive characteristics were subject to a two-way ANOVA test; the dependent variable related to the spectral biomarker and was the AbsP characteristic, while the independent variables were the four brain areas (occipital, parietal, temporal and frontal) as well as the seven emotional states (anger, anxiety, disgust, happiness, sadness, surprise and neutral).
Secondly, the R C M F E characteristic was subjected to the two-way ANOVA test, with the independent variables being the brain areas and seven affective states, while the entropy biomarker was the dependent variable.
Thirdly, the dependent variable of the seven affective states’ amplitude envelope distributions’ H S k w was subject to the two-way ANOVA test, with the temporal biomarker’s independent variable being the seven affective states and four brain areas.
Duncan's test was applied to provide the post hoc contrast, with p < 0.05 established as the level of significance for each statistical assessment. Accordingly, the potential spectral, entropy and temporal biomarkers of the seven affective states over the brain areas are conveyed in this part. The Bonferroni post hoc test was also conducted to examine multiple comparisons for each group of tests, including the seven emotional states and the four brain regions.
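The study ran its two-way ANOVAs in SPSS; for a balanced design the same decomposition can be sketched directly in Python (factor A = brain region, factor B = emotional state, with a hypothetical balanced replicate count per cell).

```python
import numpy as np
from scipy.stats import f as f_dist

def two_way_anova(data):
    """Balanced two-way ANOVA. `data` has shape (levels_A, levels_B, n_reps)."""
    a, b, n = data.shape
    grand = data.mean()
    mean_a = data.mean(axis=(1, 2))        # factor A level means
    mean_b = data.mean(axis=(0, 2))        # factor B level means
    mean_ab = data.mean(axis=2)            # cell means
    # Sums of squares for main effects, interaction and error
    ss_a = b * n * np.sum((mean_a - grand) ** 2)
    ss_b = a * n * np.sum((mean_b - grand) ** 2)
    ss_ab = n * np.sum((mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2)
    ss_e = np.sum((data - mean_ab[:, :, None]) ** 2)
    df_a, df_b = a - 1, b - 1
    df_ab, df_e = df_a * df_b, a * b * (n - 1)
    ms_e = ss_e / df_e
    out = {}
    for name, ss, df in [("A", ss_a, df_a), ("B", ss_b, df_b), ("AxB", ss_ab, df_ab)]:
        F = (ss / df) / ms_e
        out[name] = {"F": F, "p": f_dist.sf(F, df, df_e)}
    return out
```

Injecting a strong region effect into synthetic (4 regions × 7 emotions × 40 subjects) data produces a tiny p-value for factor A, mirroring how a significant biomarker would surface in the tables.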

2.4.2. Pearson’s Correlations

To analyze and reveal the spatial variability and distribution changes in three different ways along the EEG signals, the spectral, entropy and temporal biomarkers obtained in the previous section were integrated with the brain spatial information, thus enabling an appropriate understanding of emotional significance. Consequently, three stages of Pearson's correlation were implemented to develop the spectro-spatial (SS), entropy-spatial (ES) and temporo-spatial (TS) profiles. These patterns offer a concise, consolidated method of representing EEG profiles over the brain regions relating to the anger, anxiety, disgust, happiness, sadness, surprise and neutral emotions.
During each of the three sessions, Pearson's correlation coefficient (r) was calculated to establish the correlations of the biomarkers between the neutral emotional state and the six emotion states. Each correlation under the Pearson's correlation method was calculated at p < 0.05, reflecting statistical significance. All correlation sessions were implemented for every participant.
In the first application of Pearson's correlation, the SS profile of every specific affective state was determined: SS_anger for anger, SS_anxiety for anxiety, SS_disgust for disgust, SS_happiness for happiness, SS_sadness for sadness, SS_surprise for surprise, and SS_neutral for the neutral affective state.
In a second session of Pearson's correlation, the ES profile was computed: ES_anger for anger, ES_anxiety for anxiety, ES_disgust for disgust, ES_happiness for happiness, ES_sadness for sadness, ES_surprise for surprise, and ES_neutral for the neutral affective state.
In a third session of Pearson's correlation, the TS profile was obtained: TS_anger for anger, TS_anxiety for anxiety, TS_disgust for disgust, TS_happiness for happiness, TS_sadness for sadness, TS_surprise for surprise, and TS_neutral for the neutral affective state.
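The per-emotion correlation step can be sketched as follows; the dictionary layout (emotion mapped to a vector of per-subject region-mean biomarkers) is a hypothetical data arrangement for illustration, not the paper's actual file format.

```python
import numpy as np
from scipy.stats import pearsonr

def spatial_profile(features, baseline="neutral"):
    """Pearson r of each emotion's biomarker vector against the neutral baseline.

    `features` maps an emotion label to a 1-D array of biomarker values
    (e.g. one region-mean value per subject); one such profile would be
    computed per biomarker type (SS, ES, TS) and per brain region.
    """
    ref = np.asarray(features[baseline], dtype=float)
    profile = {}
    for emotion, vec in features.items():
        if emotion == baseline:
            continue
        r, p = pearsonr(ref, np.asarray(vec, dtype=float))
        profile[emotion] = {"r": r, "significant": p < 0.05}
    return profile
```

An emotion whose biomarkers track the neutral baseline closely yields r near 1 with a significant p-value, while an unrelated emotion yields a much weaker correlation.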

3. Results

For each of the 14 EEG channels, 30 s intervals comprising 3840 data points were characterized. For the seven specific affective activities, the EEG recordings were divided into 10 s parts of 1280 data points each. The statistical analysis methods of ANOVA and Pearson's correlation were applied to the feature extraction findings.

3.1. ANOVA Results

The subsequent parts explore the statistical features of the spectral, entropy and temporal biomarkers across the four brain areas (occipital, parietal, temporal and frontal) for the seven affective states (anger, anxiety, disgust, sadness, surprise, happiness and neutral).
Figure 3 presents the spectral biomarker performance corresponding to each individual emotional state across the brain regions. It is apparent that the frontal lobes presented the most significant activity compared with the other brain lobes (Spectral_Frontal > Spectral_Temporal > Spectral_Occipital > Spectral_Parietal). The highest means were attained for Spectral_neutral, which differed significantly from all other emotional states across each brain region apart from Spectral_surprise. Moreover, the Spectral_anger response was significantly different from Spectral_surprise and Spectral_neutral, given that the anger and surprise emotions are situated in the upper-right and upper-left quadrants of the valence-arousal circumplex model, respectively. Spectral_anxiety was significantly different from Spectral_sadness, Spectral_surprise and Spectral_neutral, with anxiety and sadness located in the upper- and lower-right quadrants of the valence-arousal circumplex model, respectively, while surprise is located in the upper-left quadrant. The significant differences were established at the p < 0.05 level of significance.
The Bonferroni post hoc test was conducted to examine multiple comparisons. Table 2 shows the post hoc emotion multiple comparisons using Bonferroni adjustments for the spectral biomarker. The post hoc tests using the Bonferroni correction revealed that neutral differed significantly from sadness and happiness (p = 0.004 and 0.05, respectively), anger differed significantly from sadness and happiness (p = 0.05, 0.05), anxiety differed significantly from sadness and happiness (p = 0.019 and 0.002, respectively), sadness differed significantly from surprise (p = 0.001), and surprise differed significantly from happiness (p = 0.05). Moreover, Table 3 illustrates the brain region multiple comparisons using Bonferroni adjustments for the spectral biomarker; the frontal region differed significantly from the temporal, parietal and occipital brain regions (p = 0.05).
Secondly, ANOVA was conducted as a comparative study to check the performance of the RCMFE entropy biomarker. The significant differences in the entropy biomarker were evaluated over the four brain regions, with significance set at p < 0.05. The temporal lobes had the highest mean and were significantly different from the other brain lobes for all emotional states (Entropy_Temporal > Entropy_Parietal > Entropy_Occipital > Entropy_Frontal), as can be observed from visual inspection of Figure 4. Entropy_anxiety had the highest mean, which differed significantly from all emotional states except Entropy_surprise. The response of Entropy_neutral was significantly different from Entropy_anxiety and Entropy_surprise, as anxiety and surprise are located in the upper-right and upper-left quadrants of the valence-arousal circumplex model, respectively.
For the entropy biomarker, Table 4 shows the post hoc emotion multiple comparisons using Bonferroni adjustments. The post hoc tests using the Bonferroni correction revealed that anger differed significantly from disgust and happiness (p = 0.001 and 0.05, respectively) and that sadness differed significantly from happiness (p = 0.0035). Moreover, Table 5 illustrates the brain region multiple comparisons using Bonferroni adjustments for the entropy biomarker; the frontal region differed significantly from the temporal, parietal and occipital brain regions (p = 0.05).
Thirdly, ANOVA was conducted as a comparative study to check the performance of the temporal biomarkers, which were characterized by the amplitude envelope parameter H_Skw. The significant differences in the temporal biomarkers were evaluated over the four brain regions, with significance set at p < 0.05. Figure 5 shows the temporal biomarkers of the emotional responses among the brain regions. The frontal lobes had the most significant activity in comparison to the other brain lobes (Temporal_Frontal > Temporal_Occipital > Temporal_Parietal > Temporal_Temporal). The response to Temporal_sadness had the highest mean. Temporal_anger and Temporal_surprise had almost the same mean, Temporal_neutral and Temporal_anxiety had almost the same performance, and Temporal_disgust and Temporal_happiness had the same effect, which relates to their distribution within the valence-arousal circumplex model.
For the temporal biomarkers, Table 6 shows the post hoc emotion multiple comparisons using Bonferroni adjustments. The post hoc tests using the Bonferroni correction revealed no statistically significant differences (p > 0.05) among the seven emotional states. Moreover, Table 7 illustrates the brain-region multiple comparisons using Bonferroni adjustments for the temporal biomarker; the frontal region was significantly different from the temporal, parietal and occipital brain regions (p = 0.05, 0.05 and 0.001, respectively).

3.2. Results of Pearson’s Correlations

During the second statistical analysis stage, Pearson’s correlation coefficients were calculated for the spectral, entropy and temporal biomarkers of the neutral state as well as the six emotional states (anger, anxiety, disgust, happiness, sadness and surprise), for each EEG channel over the frontal, temporal, parietal and occipital brain regions. Significant differences between the various emotions were assessed in terms of EEG-based correlation alterations.
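The region-wise correlation step can be sketched as follows. This is an illustrative Python example with hypothetical stand-in data (not the study's recordings): the per-channel biomarker vector of the neutral state is correlated against that of an emotional state.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-channel biomarker values for one brain region
# (e.g., one value per channel-epoch).
rng = np.random.default_rng(7)
neutral = rng.normal(0.5, 0.1, 100)
# An "emotion" vector constructed to correlate with neutral plus noise.
anxiety = 0.9 * neutral + rng.normal(0.0, 0.03, 100)

r, p = pearsonr(neutral, anxiety)
# r close to +1 indicates a very strong positive correlation; p < 0.01
# would be reported as significant at the 0.01 level (2-tailed).
```

Repeating this for every region-emotion pair yields correlation maps such as those in Figures 6-8.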
The correlations of SS_neutral with SS_anger, SS_anxiety, SS_disgust, SS_happiness, SS_sadness and SS_surprise were significantly positive in all cases, as Figure 6 presents. For example, in the frontal region SS_neutral showed a very strong positive correlation especially with SS_anxiety (r = 0.880, p < 0.01), SS_sadness (r = 0.866, p < 0.01) and SS_anger (r = 0.857, p < 0.01). Furthermore, in the temporal area SS_neutral had a very strong positive correlation particularly with SS_anxiety (r = 0.894, p < 0.01), SS_anger (r = 0.866, p < 0.01) and SS_happiness (r = 0.805, p < 0.01). In the parietal region, SS_neutral expressed a very strong positive correlation with SS_sadness, SS_surprise and SS_anger (r = 0.881, 0.861 and 0.842, respectively, p < 0.01). Moreover, SS_neutral had a very strong positive correlation with SS_anxiety, SS_sadness and SS_anger (r = 0.861, 0.827 and 0.861, p < 0.01) in the occipital region. Overall, SS_neutral and SS_anger had their highest correlation in the temporal region (r = 0.866, p < 0.01); SS_neutral and SS_anxiety in the temporal and frontal regions (r = 0.894 and 0.880, p < 0.01); SS_neutral and SS_happiness in the temporal region (r = 0.805, p < 0.01); SS_neutral and SS_disgust in the temporal region (r = 0.616, p < 0.01); SS_neutral and SS_sadness in the parietal region (r = 0.881, p < 0.01); and SS_neutral and SS_surprise in the parietal region (r = 0.861, r = 0.970, p < 0.01).
Regarding the SS profile, the lowest positive correlations were observed between SS_neutral and SS_disgust in the occipital and parietal regions (r = 0.461 and 0.489, respectively, p < 0.01). Accordingly, for the SS emotional profile, the frontal, temporal and parietal lobes participated to the greatest extent in emotional elicitation.
The correlations of ES_neutral with ES_anger, ES_anxiety, ES_disgust, ES_happiness, ES_sadness and ES_surprise were significantly positive in all cases, as shown in Figure 7. For instance, in the frontal region ES_neutral had a very strong positive correlation particularly with ES_anxiety (r = 0.684, p < 0.01) and ES_sadness (r = 0.683, p < 0.01). In the temporal region, ES_neutral had a very strong positive correlation particularly with ES_anxiety (r = 0.707, p < 0.01) and ES_sadness (r = 0.633, p < 0.01). In the parietal region, ES_neutral had a strong positive correlation particularly with ES_anxiety (r = 0.608, p < 0.01). Moreover, ES_neutral had a very strong positive correlation with ES_anxiety (r = 0.693, p < 0.01) in the occipital region. In other words, ES_neutral and ES_anger had their highest correlation in the frontal region (r = 0.621, p < 0.01); ES_neutral and ES_anxiety in the temporal region (r = 0.707, p < 0.01); ES_neutral and ES_disgust in the temporal region (r = 0.606, p < 0.01); ES_neutral and ES_happiness in the frontal region (r = 0.533, p < 0.01); ES_neutral and ES_sadness in the frontal region (r = 0.688, p < 0.01); and ES_neutral and ES_surprise in the parietal region (r = 0.592, p < 0.01). For the ES profile, the lowest positive correlation was seen between ES_neutral and ES_happiness in the temporal region (r = 0.456, p < 0.01). Therefore, for the ES emotional profile, the frontal lobes participated mostly in the anger, happiness and sadness emotional states, whereas the temporal lobes were responsible for anxiety and disgust emotional elicitation.
The correlations of TS_neutral with TS_anger, TS_anxiety, TS_disgust, TS_happiness, TS_sadness and TS_surprise are shown in Figure 8. For instance, in the frontal region TS_neutral had a moderate positive correlation particularly with TS_sadness (r = 0.509, p < 0.01). In the temporal area, TS_neutral had a moderate correlation particularly with TS_sadness (r = 0.506, p < 0.01). TS_neutral had a moderate positive correlation with TS_anger (r = 0.402, p < 0.01) in the parietal region. Moreover, TS_neutral had a moderate positive correlation with TS_disgust (r = 0.598, p < 0.01) in the occipital region. In other words, TS_neutral and TS_anger had their highest correlation in the parietal region (r = 0.402, p < 0.01); TS_neutral and TS_anxiety in the frontal region (r = 0.300, p < 0.01); TS_neutral and TS_disgust in the occipital region (r = 0.598, p < 0.01); TS_neutral and TS_happiness in the occipital region (r = 0.377, p < 0.01); TS_neutral and TS_sadness in the temporal region (r = 0.560, p < 0.01); and TS_neutral and TS_surprise in the occipital region (r = 0.417, p < 0.01). For the TS profile, the lowest positive correlation was observed between TS_neutral and TS_happiness in the temporal region (r = 0.03, p < 0.01). Therefore, for the TS emotional profile, the frontal, temporal and occipital lobes participated mostly in emotional elicitation.

4. Discussion

EEG’s utility as a clinical tool for analyzing functional changes associated with different emotional states (anger, anxiety, disgust, happiness, sadness, surprise and neutral) across different brain areas (frontal, temporal, parietal and occipital scalp) is of considerable interest. In this regard, the research has established a novel conceptual connection between the SS, ES and TS profiles and the aforementioned emotional states across the brain regions. Conventional filters were employed as a preprocessing stage. A total of 14 channels across the various scalp regions were recorded while participants viewed seven brief emotional audio-visual video clips. Features from several domains, including spectral, entropy and temporal features, were computed so as to identify key EEG biomarkers relating to the emotional states. To provide a more in-depth investigation, SS, ES and TS EEG emotional profiles were developed through the multivariate addition of spectral, entropy and temporal characteristics to spatial information. Overall, from the visual inspection of the spectral and temporal biomarkers, it was found that the frontal regions are particularly responsible for emotion detection while experiencing anger and anxiety, in the upper-right quadrant of the valence-arousal circumplex model, whereas sadness and disgust appear in the lower-right quadrant and surprise and happiness are situated in the upper-left quadrant. The entropy biomarkers evidenced that the temporal regions were especially activated in the detection of emotion while experiencing anxiety and surprise, in the circumplex model's upper-right and upper-left quadrants, respectively. Table 8 presents the emotions most highly correlated with neutral for the SS, ES and TS profiles across the four brain regions.
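The spectral biomarkers referred to above are band-power features. As a minimal sketch of how such a feature might be computed (illustrative only; the band edges and Welch parameters are assumptions, not the paper's exact settings), relative band power from a Welch PSD estimate could look like:

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG band edges in Hz (assumed, not the paper's values).
BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
         "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def relative_band_powers(x, fs):
    """Relative power of each EEG band from a Welch PSD estimate."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
    broad = (freqs >= 0.5) & (freqs <= 30.0)
    total = psd[broad].sum()
    return {name: psd[(freqs >= lo) & (freqs <= hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Hypothetical example: a dominant 10 Hz rhythm should concentrate
# relative power in the alpha band.
fs = 128
t = np.arange(0, 4, 1 / fs)
sig = (np.sin(2 * np.pi * 10 * t)
       + 0.1 * np.random.default_rng(1).standard_normal(len(t)))
rp = relative_band_powers(sig, fs)
```

Averaging such per-channel relative powers within the frontal, temporal, parietal and occipital channel groups is one plausible way a spectro-spatial (SS) feature could be assembled.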
It was evidenced that the frontal, temporal, parietal and occipital lobes are primarily responsible for anger, anxiety and sadness elicitation. Emotions such as surprise are detectable in the frontal and parietal lobes, whereas happiness and disgust may be elicited from the temporal and occipital lobes, respectively. Accordingly, such findings imply that the combination of spectral, entropy and temporal feature sets could provide more reliable biomarkers for identifying SS, ES and TS profiles for the anger, anxiety, disgust, happiness, sadness, surprise and neutral emotional states across the frontal, temporal, parietal and occipital brain areas. The SS profile is significant for representing how anger corresponds to all brain lobes, while the ES profile is significant for representing anxiety. Additionally, the TS profile is important for representing sadness.
From a neuroscientific perspective, all of the obtained results are consistent with the principal functions of the frontal, temporal, parietal and occipital brain lobes. The frontal lobe is deemed to be the emotional control center [83,84], while the temporal lobes are linked to emotional perception [85]. As a result, to obtain greater insight into EEG emotional states, we incorporated several features from spectral, entropy and temporal aspects, enabling the identification of the most reliable EEG emotional biomarkers, as well as the development of the SS, ES and TS profiles as benchmarks for deeper inspection.
To sum up, emotions play a critical role in our day-to-day lives, and investigating them can yield a deeper understanding of complex human behavior. Emotions such as happiness are considered positive and have been linked to a variety of outcomes, including increased longevity and increased marital satisfaction [82]. Conversely, anger, anxiety and sadness are often thought of as negative emotions that have been linked to decreased life expectancy and may even affect physical health [83,84]. Therefore, to capture and characterize people's everyday emotional experiences, many recent scientific works validate the use of EEG as a widely applicable diagnostic tool. The spectral, entropy and temporal biomarkers and the SS, ES and TS EEG emotional profiles presented here may provide valuable physiological information that helps to improve the emotional investigation procedure.

5. Conclusions

In this study, EEG has been adopted for eliciting information in terms of waveform distribution over the scalp. Spectral, entropy and temporal biomarkers for emotion identification have been proposed, and these biomarkers were integrated to develop SS, ES and TS emotional profiles over the brain regions. The EEGs of 40 healthy volunteer students from the University of Vienna were recorded while they viewed seven brief emotional video clips. ANOVA was conducted to identify the emotional biomarkers and Pearson's correlations were employed to determine the EEG emotion profiles. The results evidence that the combination of the applied spectral, entropy and temporal feature sets may provide reliable biomarkers for identifying SS, ES and TS profiles relating to different emotional states over the brain areas. EEG biomarkers and profiles enable more comprehensive insights into various human behavior effects as an intervention on the brain.

Author Contributions

N.K.A.-Q.: Analysis and interpretation of the EEG data; drafting the manuscript. M.K.S.: Revising the work critically for important intellectual content. S.H.B.M.A.: Revising the work; supporting the article with funding. S.A.A.: Revising the work critically for important intellectual content. K.G.: Acquisition of the EEG signals. All authors have read and agreed to the published version of the manuscript.

Acknowledgments

This research was partially supported by Universiti Kebangsaan Malaysia and Ministry of Education, Malaysia, Grant Code FRGS/1/2018/TK04/UKM/02/2.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jerritta, S.; Murugappan, M.; Khairunizam, W.; Yaacob, S. Electrocardiogram-based emotion recognition system using empirical mode decomposition and discrete Fourier transform. Expert Syst. 2014, 31, 110–120. [Google Scholar]
  2. Xu, T.; Zhou, Y.; Wang, Z.; Peng, Y. Learning emotions EEG-based recognition and brain activity: A survey study on BCI for intelligent tutoring system. Procedia Comput. Sci. 2018, 130, 376–382. [Google Scholar] [CrossRef]
  3. Murugappan, M.; Ramachandran, N.; Sazali, Y. Classification of human emotion from EEG using discrete wavelet transform. J. Biomed. Sci. Eng. 2010, 3, 390. [Google Scholar] [CrossRef] [Green Version]
  4. Bos, D.O. EEG-based emotion recognition. Influ. Vis. Audit. Stimuli 2006, 56, 1–17. [Google Scholar]
  5. Zaki, J.; Davis, J.I.; Ochsner, K.N. Overlapping activity in anterior insula during interoception and emotional experience. Neuroimage 2012, 62, 493–499. [Google Scholar] [CrossRef]
  6. Barrett, L.F. Discrete emotions or dimensions? The role of valence focus and arousal focus. Cogn. Emot. 1998, 12, 579–599. [Google Scholar] [CrossRef]
  7. Ekman, P. An argument for basic emotions. Cogn. Emot. 1992, 6, 169–200. [Google Scholar] [CrossRef]
  8. Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 1996, 14, 261–292. [Google Scholar] [CrossRef]
  9. Wang, X.-W.; Nie, D.; Lu, B.-L. Emotional state classification from EEG data using machine learning approach. Neurocomputing 2014, 129, 94–106. [Google Scholar] [CrossRef]
  10. Mauss, I.B.; Robinson, M.D. Measures of emotion: A review. Cogn. Emot. 2009, 23, 209–237. [Google Scholar] [CrossRef]
  11. Soroush, M.Z.; Maghooli, K.; Setarehdan, S.K.; Nasrabadi, A.M. Emotion classification through nonlinear EEG analysis using machine learning methods. Int. Clin. Neurosci. J. 2018, 5, 135–149. [Google Scholar] [CrossRef]
  12. Kensinger, E.A. Remembering emotional experiences: The contribution of valence and arousal. Rev. Neurosci. 2004, 15, 241–252. [Google Scholar] [CrossRef] [PubMed]
  13. Maaoui, C.; Pruski, A. Emotion recognition through physiological signals for human-machine communication. In Cutting Edge Robotics 2010; Vedran, K., Alex, L., Munir, M., Eds.; Intech Open: London, UK, 2005; Available online: https://www.intechopen.com/books/cutting-edge-robotics-2010/emotion-recognition-through-physiological-signals-for-human-machine-communication (accessed on 1 October 2010).
  14. Shaheen, R.; Coifman, K.; Flynn, J.; Matt, L.; Halachoff, D. A film set for the elicitation of emotion in research: A comprehensive catalog derived from four decades of investigation. Behav. Res. Methods 2017, 6, 2061–2082. [Google Scholar]
  15. Selvaraj, J.; Murugappan, M.; Wan, K.; Yaacob, S. Classification of emotional states from electrocardiogram signals: A non-linear approach based on hurst. Biomed. Eng. Online 2013, 12, 44. [Google Scholar] [CrossRef] [Green Version]
  16. Schaefer, A.; Nils, F.; Sanchez, X.; Philippot, P. Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cogn. Emot. 2010, 24, 1153–1172. [Google Scholar] [CrossRef]
  17. Ping, H.Y.; Abdullah, L.N.; Halin, A.A.; Sulaiman, P.S. A study of physiological signals-based emotion recognition systems. Int. J. Comput. Technol. 2013, 11, 2189–2196. [Google Scholar] [CrossRef]
  18. Kushki, A.; Fairley, J.; Merja, S.; King, G.; Chau, T. Comparison of blood volume pulse and skin conductance responses to mental and affective stimuli at different anatomical sites. Physiol. Meas. 2011, 32, 1529. [Google Scholar] [CrossRef] [Green Version]
  19. Kuraoka, K.; Nakamura, K. The use of nasal skin temperature measurements in studying emotion in macaque monkeys. Physiol. Behav. 2011, 102, 347–355. [Google Scholar] [CrossRef]
  20. Al-Zidi, M.G.; Santhosh, J.; Ng, S.C.; Bakar, A.R.A.; Ibrahim, I.A. Cortical auditory evoked potentials as indicators of hearing aids performance in speech perception. J. Eng. Res. 2017, 5, 76–94. [Google Scholar]
  21. Agrafioti, F.; Hatzinakos, D.; Anderson, A.K. ECG pattern analysis for emotion detection. IEEE Trans. Affect. Comput. 2011, 3, 102–115. [Google Scholar] [CrossRef]
  22. Künecke, J.; Hildebrandt, A.; Recio, G.; Sommer, W.; Wilhelm, O. Facial EMG responses to emotional expressions are related to emotion perception ability. PLoS ONE 2014, 9, e84053. [Google Scholar] [CrossRef] [PubMed]
  23. Vecchiato, G.; Toppi, J.; Astolfi, L.; Fallani, F.D.V.; Cincotti, F.; Mattia, D.; Bez, F.; Babiloni, F. Spectral EEG frontal asymmetries correlate with the experienced pleasantness of TV commercial advertisements. Med. Biol. Eng. Comput. 2011, 49, 579–583. [Google Scholar] [CrossRef] [PubMed]
  24. Coan, J.A.; Allen, J.J. Frontal EEG asymmetry as a moderator and mediator of emotion. Biol. Psychol. 2004, 67, 7–50. [Google Scholar] [CrossRef] [PubMed]
  25. Di Flumeri, G.; Aricò, P.; Borghini, G.; Sciaraffa, N.; Maglione, A.G.; Rossi, D.; Modica, E.; Trettel, A.; Babiloni, F.; Colosimo, A. EEG-based approach-withdrawal index for the pleasantness evaluation during taste experience in realistic settings. In Proceedings of the 2017 39th annual international conference of the IEEE engineering in medicine and biology society (EMBC), Seogwipo, Korea, 11–15 July 2017; pp. 3228–3231. [Google Scholar]
  26. Zheng, W.-L.; Lu, B.-L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans. Auton. Ment. Dev. 2015, 7, 162–175. [Google Scholar] [CrossRef]
  27. Alarcao, S.M.; Fonseca, M.J. Emotions recognition using EEG signals: A survey. IEEE Trans. Affect. Comput. 2017, 10, 374–393. [Google Scholar] [CrossRef]
  28. Al-Qazzaz, N.K.; Sabir, M.K.; Grammer, K. Gender differences identification from brain regions using spectral relative powers of emotional EEG. In Proceedings of the 2019 7th International Work-Conference on Bioinformatics and Biomedical Engineering, Granada, Spain, 8–10 May 2019; pp. 38–42. [Google Scholar]
  29. Al-Qazzaz, N.K.; Sabir, M.K.; Grammer, K. Correlation indices of electroencephalogram-based relative powers during human emotion processing. In Proceedings of the 2019 9th International Conference on Biomedical Engineering and Technology, Tokyo, Japan, 28–30 March 2019; pp. 64–70. [Google Scholar]
  30. Al-Qazzaz, N.K.; Sabir, M.K.; Ali, S.; Ahmad, S.A.; Grammer, K. Effective EEG Channels for emotion identification over the brain regions using differential evolution algorithm. In Proceedings of the 2019 41th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019. [Google Scholar]
  31. Xie, S.; Krishnan, S. Wavelet-based sparse functional linear model with applications to EEGs seizure detection and epilepsy diagnosis. Med. Biol. Eng. Comput. 2013, 51, 49–60. [Google Scholar] [CrossRef]
  32. Abásolo, D.; Hornero, R.; Espino, P.; Alvarez, D.; Poza, J. Entropy analysis of the EEG background activity in Alzheimer’s disease patients. Physiol. Meas. 2006, 27, 241. [Google Scholar] [CrossRef] [Green Version]
  33. Al-Qazzaz, N.; Hamid Bin Mohd Ali, S.; Ahmad, S.; Islam, M.; Escudero, J. Automatic artifact removal in EEG of normal and demented individuals using ICA–WT during working memory tasks. Sensors 2017, 17, 1326. [Google Scholar] [CrossRef]
  34. Al-Qazzaz, N.K.; Ali, S.H.B.M.; Ahmad, S.A.; Islam, M.S.; Escudero, J. Discrimination of stroke-related mild cognitive impairment and vascular dementia using EEG signal analysis. Med. Biol. Eng. Comput. 2017, 56, 1–21. [Google Scholar] [CrossRef]
  35. Davidson, P.R.; Jones, R.D.; Peiris, M.T. EEG-based lapse detection with high temporal resolution. IEEE Trans. Biomed. Eng. 2007, 54, 832–839. [Google Scholar] [CrossRef]
  36. Vecchio, F.; Babiloni, C.; Lizio, R.; Fallani, F.V.; Blinowska, K.; Verrienti, G.; Frisoni, G.; Rossini, P. Resting state cortical EEG rhythms in Alzheimer’s disease: Toward EEG markers for clinical applications: A review. Suppl. Clin. Neurophysiol. 2012, 62, 223–236. [Google Scholar]
  37. Al-Qazzaz, N.K.; Ali, S.H.B.; Ahmad, S.A.; Chellappan, K.; Islam, M.S.; Escudero, J. Role of EEG as biomarker in the early detection and classification of dementia. Sci. World J. 2014, 2014. [Google Scholar] [CrossRef] [PubMed]
  38. Urigüen, J.A.; Garcia-Zapirain, B. EEG artifact removal-state-of-the-art and guidelines. J. Neural Eng. 2015, 12, 031001. [Google Scholar] [CrossRef] [PubMed]
  39. Al-Kadi, M.I.; Reaz, M.B.I.; Ali, M.A.M.; Liu, C.Y. Reduction of the dimensionality of the EEG channels during scoliosis correction surgeries using a wavelet decomposition technique. Sensors 2014, 14, 13046–13069. [Google Scholar] [CrossRef] [Green Version]
  40. Pizzagalli, D.A. Electroencephalography and high-density electrophysiological source localization. Handb. Psychophysiol. 2007, 3, 56–84. [Google Scholar]
  41. Jeong, J. EEG dynamics in patients with Alzheimer’s disease. Clin. Neurophysiol. 2004, 115, 1490–1505. [Google Scholar] [CrossRef]
  42. John, E.; Prichep, L.; Fridman, J.; Easton, P. Neurometrics: Computer-assisted differential diagnosis of brain dysfunctions. Science 1988, 239, 162–169. [Google Scholar] [CrossRef]
  43. Leuchter, A.F.; Cook, I.A.; Newton, T.F.; Dunkin, J.; Walter, D.O.; Rosenberg-Thompson, S.; Lachenbruch, P.A.; Weiner, H. Regional differences in brain electrical activity in dementia: Use of spectral power and spectral ratio measures. Electroencephalogr. Clin. Neurophysiol. 1993, 87, 385–393. [Google Scholar] [CrossRef]
  44. Lizio, R.; Vecchio, F.; Frisoni, G.B.; Ferri, R.; Rodriguez, G.; Babiloni, C. Electroencephalographic rhythms in Alzheimer’s disease. Int. J. Alzheimer’s Dis. 2011, 2011. [Google Scholar] [CrossRef] [Green Version]
  45. Yuvaraj, R.; Murugappan, M.; Ibrahim, N.M.; Omar, M.I.; Sundaraj, K.; Mohamad, K.; Palaniappan, R.; Mesquita, E.; Satiyan, M. On the analysis of EEG power, frequency and asymmetry in Parkinson’s disease during emotion processing. Behav. Brain Funct. 2014, 10, 12. [Google Scholar] [CrossRef] [Green Version]
  46. Yang, Y.; Wu, Q.; Qiu, M.; Wang, Y.; Chen, X. Emotion recognition from multi-channel EEG through parallel convolutional recurrent neural network. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7. [Google Scholar]
  47. Li, M.; Xu, H.; Liu, X.; Lu, S. Emotion recognition from multichannel EEG signals using K-nearest neighbor classification. Technol. Health Care 2018, 26, 1–11. [Google Scholar] [CrossRef] [PubMed]
  48. Chao, H.; Zhi, H.; Dong, L.; Liu, Y. Recognition of emotions using multichannel EEG data and DBN-GC-based ensemble deep learning framework. Comput. Intell. Neurosci. 2018, 2018. [Google Scholar] [CrossRef] [PubMed]
  49. Thammasan, N.; Moriyama, K.; Fukui, K.-i.; Numao, M. Continuous music-emotion recognition based on electroencephalogram. IEICE Trans. Inf. Syst. 2016, 99, 1234–1241. [Google Scholar] [CrossRef] [Green Version]
  50. Sourina, O.; Liu, Y.; Nguyen, M.K. Real-time EEG-based emotion recognition for music therapy. J. Multimodal User Interfaces 2012, 5, 27–35. [Google Scholar] [CrossRef]
  51. Chandran, V.; Acharya, R.; Lim, C. Higher order spectral (HOS) analysis of epileptic EEG signals. In Proceedings of the 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 6495–6498. [Google Scholar]
  52. Lan, Z.; Sourina, O.; Wang, L.; Liu, Y. Real-time EEG-based emotion monitoring using stable features. Vis. Comput. 2016, 32, 347–358. [Google Scholar] [CrossRef]
  53. Jin, M.J.; Kim, J.S.; Kim, S.; Hyun, M.H.; Lee, S.-H. An integrated model of emotional problems, beta power of electroencephalography, and low frequency of heart rate variability after childhood trauma in a non-clinical sample: A path analysis study. Front. Psychiatry 2018, 8, 314. [Google Scholar] [CrossRef] [Green Version]
  54. García-Martínez, B.; Martínez-Rodrigo, A.; Zangróniz Cantabrana, R.; Pastor García, J.; Alcaraz, R. Application of entropy-based metrics to identify emotional distress from electroencephalographic recordings. Entropy 2016, 18, 221. [Google Scholar] [CrossRef]
  55. Mehmood, R.M.; Lee, H.J. Towards emotion recognition of EEG brain signals using Hjorth parameters and SVM. Adv. Sci. Technol. Lett. Biosci. Med. Res. 2015, 91, 24–27. [Google Scholar]
  56. Ruiz-Padial, E.; Ibanez-Molina, A.J. Fractal dimension of EEG signals and heart dynamics in discrete emotional states. Biol. Psychol. 2018, 137, 42–48. [Google Scholar] [CrossRef]
  57. Yuen, C.T.; San San, W.; Seong, T.C.; Rizon, M. Classification of human emotions from EEG signals using statistical features and neural network. Int. J. Integr. Eng. 2009, 1. [Google Scholar] [CrossRef]
  58. Yuen, C.T.; San San, W.; Ho, J.-H.; Rizon, M. Effectiveness of statistical features for human emotions classification using EEG biosensors. Res. J. Appl. Sci. Eng. Technol. 2013, 5, 5083–5089. [Google Scholar] [CrossRef]
  59. Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102. [Google Scholar] [CrossRef] [PubMed]
  60. Azami, H.; Escudero, J. Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation. Comput. Methods Programs Biomed. 2016, 128, 40–51. [Google Scholar] [CrossRef]
  61. Martínez-Rodrigo, A.; García-Martínez, B.; Zunino, L.; Alcaraz, R.; Fernández-Caballero, A. Multi-lag analysis of symbolic entropies on EEG recordings for distress recognition. Front. Neuroinform. 2019, 13, 40. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  62. Li, L.; Cao, R.; Xiang, J. Comparative study of approximate entropy and sample entropy based on characterization of EEG. Comput. Eng. Des. 2014, 35, 1021–1026. [Google Scholar]
  63. Tian, J.; Luo, Z. Motor imagery EEG feature extraction based on fuzzy entropy. J. Huazhong Univ. Sci. Technol. 2013, 41, 92–95. [Google Scholar]
  64. Cao, Y.; Cai, L.; Wang, J.; Wang, R.; Yu, H.; Cao, Y.; Liu, J. Characterization of complexity in the electroencephalograph activity of Alzheimer’s disease based on fuzzy entropy. Chaos Interdiscip. J. Nonlinear Sci. 2015, 25, 083116. [Google Scholar] [CrossRef]
  65. Azami, H.; Fernández, A.; Escudero, J. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis. Med. Biol. Eng. Comput. 2017, 55, 2037–2052. [Google Scholar] [CrossRef]
  66. Zheng, J.; Tu, D.; Pan, H.; Hu, X.; Liu, T.; Liu, Q. A refined composite multivariate multiscale fuzzy entropy and laplacian score-based fault diagnosis method for rolling bearings. Entropy 2017, 19, 585. [Google Scholar] [CrossRef] [Green Version]
  67. Azami, H.; Escudero, J. Refined composite multivariate generalized multiscale fuzzy entropy: A tool for complexity analysis of multichannel signals. Phys. A Stat. Mech. Appl. 2017, 465, 261–276. [Google Scholar] [CrossRef] [Green Version]
  68. Al-Qazzaz, N.K.; Ali, S.; Islam, M.S.; Ahmad, S.A.; Escudero, J. EEG markers for early detection and characterization of vascular dementia during working memory tasks. In Proceedings of the 2016 IEEE EMBS Conference on Biomedical Engineering and Sciences (IECBES), Kuala Lumpur, Malaysia, 4–8 December 2016; pp. 347–351. [Google Scholar]
  69. Al-Qazzaz, N.K.; Ali, S.; Islam, S.; Ahmad, S.; Escudero, J. EEG wavelet spectral analysis during a working memory tasks in stroke-related mild cognitive impairment patients. In Proceedings of the International Conference for Innovation in Biomedical Engineering and Life Sciences, Putrajaya, Malaysia, 6–8 December 2015; pp. 82–85. [Google Scholar]
  70. Al-Qazzaz, N.K.; Ali, S.; Ahmad, S.A.; Islam, M.S.; Escudero, J. Entropy-based markers of EEG background activity of stroke-related mild cognitive impairment and vascular dementia patients. In Proceedings of the 2nd International Conference on Sensors Engineering and Electronics Instrumental Advances (SEIA 2016), Barcelona, Spain, 22–23 September 2016. [Google Scholar]
  71. Xing, B.; Zhang, H.; Zhang, K.; Zhang, L.; Wu, X.; Shi, X.; Yu, S.; Zhang, S. Exploiting EEG signals and audiovisual feature fusion for video emotion recognition. IEEE Access 2019, 7, 59844–59861. [Google Scholar] [CrossRef]
  72. Rottenberg, J.; Gross, J.J.; Wilhelm, F.H.; Najmi, S.; Gotlib, I.H. Crying threshold and intensity in major depressive disorder. J. Abnorm. Psychol. 2002, 111, 302. [Google Scholar] [CrossRef] [PubMed]
  73. Abásolo, D.; Escudero, J.; Hornero, R.; Gómez, C.; Espino, P. Approximate entropy and auto mutual information analysis of the electroencephalogram in Alzheimer’s disease patients. Med. Biol. Eng. Comput. 2008, 46, 1019–1028. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. The block diagram of the proposed study.
Figure 2. The experimental protocol of emotion.
Figure 3. The comparative plot of the spectral biomarker for anger, anxiety, disgust, happiness, sadness, surprise and neutral emotional states over the frontal, temporal, parietal and occipital brain regions.
Figure 4. The comparative plot of the entropy biomarker for anger, anxiety, disgust, happiness, sadness, surprise and neutral emotional states over the brain regions.
Figure 5. The comparative plot of the temporal biomarkers for anger, anxiety, disgust, happiness, sadness, surprise and neutral emotional states over the brain regions.
Figure 6. SS(neutral)–SS(anger, anxiety, disgust, happiness, sadness, surprise) correlations occurring in the frontal, temporal, parietal and occipital brain regions. Correlations significant at the 0.05 level (2-tailed).
Figure 7. ES(neutral)–ES(anger, anxiety, disgust, happiness, sadness, surprise) correlations occurring in the frontal, temporal, parietal and occipital brain regions. Correlations significant at the 0.05 level (2-tailed).
Figure 8. TS(neutral)–TS(anger, anxiety, disgust, happiness, sadness, surprise) correlations occurring in the frontal, temporal, parietal and occipital brain regions. Correlations significant at the 0.05 level (2-tailed).
Table 1. Sociodemographic data of the subjects with self-assessment questionnaire (SAQ) scores (age in years; SAQ scores as mean ± standard deviation, SD).

Demographic and Clinical Features | Subjects
Number | 40
Age | 22.475 ± 2.522
Female/Male | 17F/23M
SAQ: Anger | 4.052 ± 2.001
SAQ: Anxiety | 1.844 ± 2.591
SAQ: Disgust | 3.859 ± 2.843
SAQ: Happiness | 2.204 ± 2.947
SAQ: Sadness | 1.804 ± 2.365
SAQ: Surprise | 2.093 ± 2.438
Table 2. Emotions multiple comparison test using Bonferroni for the spectral biomarker.

(I) Emotion Class | (J) Emotion Class | Mean Difference (I−J) | p-Value (a)
Neutral | Anger | 0.659 | 0.123
Neutral | Anxiety | −0.096 | 1
Neutral | Disgust | −0.349 | 1
Neutral | Sadness | −0.891 | 0.004 *
Neutral | Surprise | 0.114 | 1
Neutral | Happiness | −1.032 | 0.05 *
Anger | Anxiety | −0.755 | 0.033
Anger | Disgust | −1.008 | 0.001
Anger | Sadness | −1.55 | 0.05 *
Anger | Surprise | −0.545 | 0.477
Anger | Happiness | −1.691 | 0.05 *
Anxiety | Disgust | −0.253 | 1
Anxiety | Sadness | −0.795 | 0.019 *
Anxiety | Surprise | 0.21 | 1
Anxiety | Happiness | −0.936 | 0.002 *
Disgust | Sadness | −0.542 | 0.492
Disgust | Surprise | 0.463 | 1
Disgust | Happiness | −0.683 | 0.09
Sadness | Surprise | 1.005 | 0.001 *
Sadness | Happiness | −0.141 | 1
Surprise | Happiness | −1.146 | 0.05 *
* The mean difference is significant at the 0.05 level. (a) Adjustment for multiple comparisons: Bonferroni.
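The Bonferroni-adjusted pairwise comparisons reported above can be sketched as follows. This is a simplified illustration only: the per-subject feature values and effect sizes are hypothetical, and plain pairwise t-tests with a Bonferroni correction stand in for the full two-way ANOVA post-hoc procedure used in the study.

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-subject spectral feature values for three emotion classes
data = {
    "Neutral": rng.normal(0.0, 1.0, 40),
    "Sadness": rng.normal(0.9, 1.0, 40),
    "Happiness": rng.normal(1.0, 1.0, 40),
}

pairs = list(combinations(data, 2))
m = len(pairs)  # number of comparisons used for the Bonferroni correction

for a, b in pairs:
    diff = data[a].mean() - data[b].mean()   # mean difference (I-J)
    t, p = stats.ttest_ind(data[a], data[b])
    p_adj = min(p * m, 1.0)                  # Bonferroni-adjusted p, capped at 1
    flag = "*" if p_adj < 0.05 else ""
    print(f"{a} vs {b}: mean diff = {diff:.3f}, adjusted p = {p_adj:.3f} {flag}")
```

Capping the adjusted p-value at 1 reproduces the many "1" entries seen in the tables: once p × m exceeds 1, the comparison is reported as entirely non-significant.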
Table 3. Brain regions multiple comparison test using Bonferroni for the spectral biomarker.

(I) Brain Region | (J) Brain Region | Mean Difference (I−J) | p-Value (a)
Frontal | Temporal | 1.657 | 0.05 *
Frontal | Parietal | 2.812 | 0.05 *
Frontal | Occipital | 2.31 | 0.05 *
Temporal | Parietal | 1.155 | 0.05 *
Temporal | Occipital | 0.653 | 0.038
Parietal | Occipital | −0.502 | 0.215
* The mean difference is significant at the 0.05 level. (a) Adjustment for multiple comparisons: Bonferroni.
Table 4. Emotions multiple comparison test using Bonferroni for the entropy biomarker.

(I) Emotion Class | (J) Emotion Class | Mean Difference (I−J) | p-Value (a)
Neutral | Anger | −0.03 | 0.095
Neutral | Anxiety | −0.004 | 1
Neutral | Disgust | 0.014 | 1
Neutral | Sadness | −0.016 | 1
Neutral | Surprise | −0.004 | 1
Neutral | Happiness | 0.025 | 0.435
Anger | Anxiety | 0.026 | 0.323
Anger | Disgust | 0.044 | 0.001 *
Anger | Sadness | 0.014 | 1
Anger | Surprise | 0.027 | 0.26
Anger | Happiness | 0.055 | 0.05 *
Anxiety | Disgust | 0.018 | 1
Anxiety | Sadness | −0.012 | 1
Anxiety | Surprise | 0.001 | 1
Anxiety | Happiness | 0.029 | 0.134
Disgust | Sadness | −0.03 | 0.109
Disgust | Surprise | −0.017 | 1
Disgust | Happiness | 0.011 | 1
Sadness | Surprise | 0.012 | 1
Sadness | Happiness | 0.041 | 0.003 *
Surprise | Happiness | 0.028 | 0.169
* The mean difference is significant at the 0.05 level. (a) Adjustment for multiple comparisons: Bonferroni.
Table 5. Brain regions multiple comparison test using Bonferroni for the entropy biomarker.

(I) Brain Region | (J) Brain Region | Mean Difference (I−J) | p-Value (a)
Frontal | Temporal | −0.072 | 0.05 *
Frontal | Parietal | −0.062 | 0.05 *
Frontal | Occipital | −0.058 | 0.05 *
Temporal | Parietal | 0.01 | 1
Temporal | Occipital | 0.014 | 1
Parietal | Occipital | 0.004 | 1
* The mean difference is significant at the 0.05 level. (a) Adjustment for multiple comparisons: Bonferroni.
Table 6. Emotions multiple comparison test using Bonferroni for the temporal biomarker.

(I) Emotion Class | (J) Emotion Class | Mean Difference (I−J) | p-Value (a)
Neutral | Anger | 0.031 | 1
Neutral | Anxiety | 0.051 | 1
Neutral | Disgust | −0.034 | 1
Neutral | Sadness | 0.009 | 1
Neutral | Surprise | 0.062 | 1
Neutral | Happiness | 0.024 | 1
Anger | Anxiety | 0.02 | 1
Anger | Disgust | −0.065 | 1
Anger | Sadness | −0.022 | 1
Anger | Surprise | 0.031 | 1
Anger | Happiness | −0.007 | 1
Anxiety | Disgust | −0.085 | 1
Anxiety | Sadness | −0.042 | 1
Anxiety | Surprise | 0.011 | 1
Anxiety | Happiness | −0.027 | 1
Disgust | Sadness | 0.043 | 1
Disgust | Surprise | 0.096 | 0.837
Disgust | Happiness | 0.058 | 1
Sadness | Surprise | 0.053 | 1
Sadness | Happiness | 0.015 | 1
Surprise | Happiness | −0.038 | 1
(a) Adjustment for multiple comparisons: Bonferroni.
Table 7. Brain regions multiple comparison test using Bonferroni for the temporal biomarker.

(I) Brain Region | (J) Brain Region | Mean Difference (I−J) | p-Value (a)
Frontal | Temporal | 0.183 | 0.05 *
Frontal | Parietal | 0.169 | 0.05 *
Frontal | Occipital | 0.136 | 0.001 *
Temporal | Parietal | −0.014 | 1
Temporal | Occipital | −0.047 | 1
Parietal | Occipital | −0.034 | 1
* The mean difference is significant at the 0.05 level. (a) Adjustment for multiple comparisons: Bonferroni.
Table 8. The emotions most correlated with neutral for the spectro-spatial (SS), entropy-spatial (ES) and temporo-spatial (TS) profiles over the frontal, temporal, parietal and occipital brain regions.

Profiles | Frontal | Temporal | Parietal | Occipital
SS | anger, anxiety, sadness, surprise | anger, anxiety, happiness | anger, sadness, surprise | anger, sadness, surprise
ES | anxiety, sadness | anxiety, sadness | anxiety | anxiety
TS | sadness | sadness | anger, sadness, surprise | disgust
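The profile correlations summarised in Table 8 and Figures 6–8 rest on Pearson's r between the neutral profile and each emotional profile within a region. A minimal sketch, assuming hypothetical 16-channel SS profile vectors (the channel count and the simulated relationship between the two profiles are illustrative, not taken from the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical channel-wise SS profile for the neutral clip in one region
ss_neutral = rng.normal(size=16)
# Hypothetical SS profile for an emotional clip, correlated with neutral
ss_sadness = 0.8 * ss_neutral + rng.normal(scale=0.5, size=16)

# Pearson correlation with a two-tailed p-value, as in Figures 6-8
r, p = stats.pearsonr(ss_neutral, ss_sadness)
print(f"r = {r:.3f}, p = {p:.4f}, significant at 0.05 (2-tailed): {p < 0.05}")
```

An emotion would then be reported as "most correlated with neutral" in a region when its two-tailed p-value falls below 0.05, which is the criterion stated in the figure captions.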
Al-Qazzaz, N.K.; Sabir, M.K.; Ali, S.H.B.M.; Ahmad, S.A.; Grammer, K. Electroencephalogram Profiles for Emotion Identification over the Brain Regions Using Spectral, Entropy and Temporal Biomarkers. Sensors 2020, 20, 59. https://doi.org/10.3390/s20010059