Open Access
ARTICLE
A Novel SE-CNN Attention Architecture for sEMG-Based Hand Gesture Recognition
1 Key Laboratory of Clinical and Medical Engineering, School of Biomedical Engineering and Informatics, Nanjing Medical University, Nanjing, 211166, China
2 Department of Medical Engineering, Wannan Medical College, Wuhu, 241002, China
3 Postdoctoral Innovation Practice, Shenzhen Polytechnic, Shenzhen, 518055, China
* Corresponding Authors: Jianqing Li; Bin Liu
# Zhengyuan Xu and Junxiao Yu contributed equally to this work
Computer Modeling in Engineering & Sciences 2023, 134(1), 157-177. https://doi.org/10.32604/cmes.2022.020035
Received 30 October 2021; Accepted 03 March 2022; Issue published 24 August 2022
Abstract
In this article, to reduce the complexity and improve the generalization ability of current gesture recognition systems, we propose a novel SE-CNN attention architecture for sEMG-based hand gesture recognition. The proposed algorithm introduces a temporal squeeze-and-excite block into a simple CNN architecture and uses it to recalibrate the weights of the feature maps produced by the convolutional layer. By enhancing important features while suppressing useless ones, the model performs gesture recognition efficiently. Finally, the algorithm applies a simple attention mechanism to enhance the learned representations of the sEMG signals for multi-channel sEMG-based gesture recognition. To evaluate the effectiveness and accuracy of the proposed algorithm, we conduct experiments on the multi-gesture datasets Ninapro DB4 and Ninapro DB5, using both inter-session validation and subject-wise cross-validation. Compared with previous models, the proposed algorithm improves robustness, gesture recognition performance, and generalization ability.
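To make the pipeline described above concrete, the following is a minimal, illustrative PyTorch sketch of a temporal squeeze-and-excite block followed by a simple attention pooling layer on top of a one-dimensional convolution. The layer sizes, reduction ratio, kernel size, and the example channel/window/class dimensions are assumptions chosen for illustration; they are not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalSEBlock(nn.Module):
    """Squeeze-and-excite over the channel dimension of a 1-D feature map.

    `channels` is the number of convolutional feature maps and `reduction`
    controls the bottleneck of the excitation MLP (both assumed values).
    """

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                        # x: (batch, channels, time)
        s = x.mean(dim=2)                        # squeeze: global average over time
        e = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))  # excitation weights in (0, 1)
        return x * e.unsqueeze(2)                # recalibrate channel responses


class SECNN(nn.Module):
    """Minimal CNN + temporal SE + attention classifier (illustrative only)."""

    def __init__(self, in_channels: int, num_classes: int, hidden: int = 64):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, hidden, kernel_size=3, padding=1)
        self.se = TemporalSEBlock(hidden)
        self.attn = nn.Linear(hidden, 1)         # simple attention score per time step
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, x):                        # x: (batch, sEMG channels, time)
        h = F.relu(self.conv(x))                 # convolutional features
        h = self.se(h)                           # SE recalibration of feature maps
        h = h.transpose(1, 2)                    # (batch, time, hidden)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over time steps
        z = (w * h).sum(dim=1)                   # attention-weighted temporal pooling
        return self.fc(z)                        # gesture logits


# Example usage: a 16-channel sEMG window of 200 samples, 52 gesture classes
# (dimensions assumed for illustration).
model = SECNN(in_channels=16, num_classes=52)
logits = model(torch.randn(8, 16, 200))
print(logits.shape)  # torch.Size([8, 52])
```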
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.