Open Access
ARTICLE
EEG Emotion Recognition Using an Attention Mechanism Based on an Optimized Hybrid Model
1 Brain Cognitive Computing Lab, School of Information Engineering, Minzu University of China, Beijing, 100081, China
2 Case Western Reserve University, USA
* Corresponding Author: Huiping Jiang. Email:
Computers, Materials & Continua 2022, 73(2), 2697-2712. https://doi.org/10.32604/cmc.2022.027856
Received 28 January 2022; Accepted 19 April 2022; Issue published 16 June 2022
Abstract
Emotions serve various functions. Traditional emotion recognition methods rely primarily on readily accessible facial expressions, gestures, and voice signals. However, it is often challenging to ensure that these non-physiological signals are valid and reliable in practical applications. Electroencephalogram (EEG) signals are better suited to real-time recognition than these other signals because they are difficult to camouflage. Although EEG signals are commonly used in current emotion recognition research, traditional methods achieve only low accuracy. Therefore, this study presents an optimized hybrid model with an attention mechanism (FFT_CLA) for EEG emotion recognition. First, the EEG signal is preprocessed via the fast Fourier transform (FFT); the convolutional neural network (CNN), long short-term memory (LSTM), and CNN-LSTM-attention (CLA) methods are then used to extract and classify the EEG features. Finally, the experiments compare and analyze the recognition results of three models on the DEAP dataset, namely FFT_CNN, FFT_LSTM, and FFT_CLA. The recognition rates of the FFT_CNN, FFT_LSTM, and FFT_CLA models on the DEAP dataset were 87.39%, 88.30%, and 92.38%, respectively. The FFT_CLA model improves the accuracy of EEG emotion recognition and uses the attention mechanism to address the often-ignored differences in importance among channels and samples when extracting EEG features.
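The two ingredients named in the abstract, FFT preprocessing and attention over EEG channels, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the band edges, weight vector, and function names are illustrative assumptions, and the softmax scoring stands in for the learned attention layer of the CLA model.

```python
import numpy as np

def fft_band_power(eeg, fs=128, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Per-channel band power from an FFT of a raw EEG segment.

    eeg: array of shape (channels, samples).
    Returns an array of shape (channels, n_bands).
    Band edges (theta/alpha/beta/gamma here) are illustrative.
    """
    n = eeg.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg, axis=1)) ** 2
    return np.stack(
        [power[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands],
        axis=1,
    )

def channel_attention(features, w):
    """Softmax attention over channels: score each channel's feature
    vector with weight vector w, then pool a weighted summary.
    In the CLA model these weights would be learned, not fixed."""
    scores = features @ w                 # one score per channel
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                  # attention weights, sum to 1
    pooled = alpha @ features             # attention-weighted feature summary
    return alpha, pooled

# Toy example: 32 channels, 2 s at 128 Hz (DEAP uses 32 channels at 128 Hz)
rng = np.random.default_rng(0)
segment = rng.standard_normal((32, 256))
feats = fft_band_power(segment)                          # shape (32, 4)
alpha, pooled = channel_attention(feats, rng.standard_normal(4))
```

Channels whose band-power profile scores highly receive larger attention weights, so they contribute more to the pooled feature vector passed on to the classifier.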
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.