Open Access

ARTICLE

A Novel Locomotion Rule Embedding Long Short-Term Memory Network with Attention for Human Locomotor Intent Classification Using Multi-Sensor Signals

Jiajie Shen1, Yan Wang1,*, Dongxu Zhang2

1 Key Laboratory of Symbolic Computation and Knowledge Engineering, Ministry of Education, College of Computer Science and Technology, Jilin University, Changchun, 130012, China
2 College of Software and Key Laboratory of Symbolic Computation and Knowledge Engineering, Ministry of Education, Jilin University, Changchun, 130012, China

* Corresponding Author: Yan Wang.

Computers, Materials & Continua 2024, 79(3), 4349-4370. https://doi.org/10.32604/cmc.2024.047903

Abstract

Locomotor intent classification has become a research hotspot due to its importance to the development of assistive robotics and wearable devices. Previous work has achieved impressive performance in classifying steady locomotion states. However, it remains challenging for these methods to attain high accuracy when facing transitions between steady locomotion states, because the information of a transition closely resembles that of its adjacent steady states. Furthermore, most of these methods rely solely on data and overlook the objective laws governing physical activities, resulting in lower accuracy, particularly when encountering complex locomotion modes such as transitions. To address these deficiencies, we propose the locomotion rule embedding long short-term memory (LSTM) network with attention (LREAL) for human locomotor intent classification, with a particular focus on transitions, using data from fewer sensors (two inertial measurement units and four goniometers). The LREAL network consists of two levels: one responsible for distinguishing between steady states and transitions, and the other for the accurate identification of locomotor intent. Each classifier in these levels is composed of multiple LSTM layers and an attention mechanism. To introduce real-world motion rules and apply constraints to the network, prior knowledge was added to the network via a rule-modulating block. The method was tested on the ENABL3S dataset, which contains continuous locomotion data for seven steady states and twelve transitions. Experimental results showed that the LREAL network could recognize locomotor intents with an average accuracy of 99.03% for the steady states and 96.52% for the transition states. Notably, the LREAL network's accuracy for transition-state recognition improved by 0.18% compared to other state-of-the-art networks, while using data from fewer sensors.
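For readers who want a concrete picture of the architecture described above, the following is a minimal Python (PyTorch) sketch of a two-level LSTM-plus-attention classifier, with an optional rule mask standing in for the rule-modulating block. All layer sizes, names, the 16-channel input, and the masking step are illustrative assumptions based only on this abstract, not the authors' implementation.

# Minimal sketch of a two-level LSTM + attention intent classifier.
# Layer sizes, names, and the rule-masking step are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMAttentionClassifier(nn.Module):
    """Multi-layer LSTM followed by additive attention over time steps."""
    def __init__(self, in_dim: int, hidden: int, n_classes: int, n_layers: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=n_layers, batch_first=True)
        self.attn = nn.Linear(hidden, 1)          # scores each time step
        self.head = nn.Linear(hidden, n_classes)  # class logits

    def forward(self, x):                         # x: (batch, time, features)
        h, _ = self.lstm(x)                       # (batch, time, hidden)
        w = torch.softmax(self.attn(h), dim=1)    # attention weights over time
        ctx = (w * h).sum(dim=1)                  # weighted context vector
        return self.head(ctx)                     # (batch, n_classes)

class HierarchicalIntentClassifier(nn.Module):
    """Level 1 separates steady states from transitions; level 2 predicts the
    specific intent. A boolean rule mask (prior knowledge about which
    transitions can follow the previous state) suppresses impossible classes."""
    def __init__(self, in_dim: int, hidden: int, n_steady: int, n_transition: int):
        super().__init__()
        self.level1 = LSTMAttentionClassifier(in_dim, hidden, 2)  # steady vs. transition
        self.steady_head = LSTMAttentionClassifier(in_dim, hidden, n_steady)
        self.transition_head = LSTMAttentionClassifier(in_dim, hidden, n_transition)

    def forward(self, x, rule_mask=None):
        gate = torch.softmax(self.level1(x), dim=-1)   # (batch, 2)
        steady_logits = self.steady_head(x)
        trans_logits = self.transition_head(x)
        if rule_mask is not None:                      # forbid impossible transitions
            trans_logits = trans_logits.masked_fill(~rule_mask, float("-inf"))
        return gate, steady_logits, trans_logits

# Example: a window of 30 time steps from 2 IMUs + 4 goniometers (assumed 16 channels)
model = HierarchicalIntentClassifier(in_dim=16, hidden=64, n_steady=7, n_transition=12)
gate, steady, trans = model(torch.randn(8, 30, 16))

In this sketch, the class counts (7 steady states, 12 transitions) follow the ENABL3S description in the abstract; the rule mask is only a simple stand-in for the paper's rule-modulating block.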

Keywords


Cite This Article

APA Style
Shen, J., Wang, Y., & Zhang, D. (2024). A novel locomotion rule embedding long short-term memory network with attention for human locomotor intent classification using multi-sensor signals. Computers, Materials & Continua, 79(3), 4349-4370. https://doi.org/10.32604/cmc.2024.047903
Vancouver Style
Shen J, Wang Y, Zhang D. A novel locomotion rule embedding long short-term memory network with attention for human locomotor intent classification using multi-sensor signals. Comput Mater Contin. 2024;79(3):4349-4370. https://doi.org/10.32604/cmc.2024.047903
IEEE Style
J. Shen, Y. Wang, and D. Zhang, "A Novel Locomotion Rule Embedding Long Short-Term Memory Network with Attention for Human Locomotor Intent Classification Using Multi-Sensor Signals," Comput. Mater. Contin., vol. 79, no. 3, pp. 4349-4370, 2024. https://doi.org/10.32604/cmc.2024.047903



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.