Vol.60, No.3, 2019, pp.1041-1054, doi:10.32604/cmc.2019.05605
OPEN ACCESS
ARTICLE
Attention-Aware Network with Latent Semantic Analysis for Clothing Invariant Gait Recognition
  • Hefei Ling1, Jia Wu1, Ping Li1,*, Jialie Shen2
Huazhong University of Science and Technology, Wuhan, 430074, China.
Queen’s University, Belfast, UK.
* Corresponding Author: Ping Li. Email: .
Abstract
Gait recognition is a complicated task due to the existence of co-factors such as carrying conditions, clothing, viewpoints, and surfaces, which alter the appearance of gait to varying degrees. Among these co-factors, clothing variation is the most challenging one in the area. Conventional methods proposed for clothing-invariant gait recognition show that body parts and the underlying relationships among them are important for gait recognition. Fortunately, the attention mechanism has proven effective at highlighting discriminative regions. Meanwhile, latent semantic analysis is known for its ability to capture latent semantic variables that represent underlying attributes and relationships in the raw input. Thus, we propose a new CNN-based method that leverages the advantages of both latent semantic analysis and the attention mechanism. Based on the discriminative features extracted by the attention and latent semantic analysis modules respectively, a multi-modal fusion method is proposed to combine those features at the decision level for its high fault tolerance. Experiments on the most challenging clothing-variation dataset, the OU-ISIR Treadmill Dataset B, show that our method outperforms other state-of-the-art gait recognition approaches.
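The decision-level fusion mentioned in the abstract can be illustrated with a minimal sketch: each branch (attention and latent semantic analysis) produces its own match scores against a gallery, and the scores are combined before the final decision. The function names, cosine similarity measure, and weighted-sum fusion rule below are illustrative assumptions for exposition, not the paper's exact method.

```python
import numpy as np

def cosine_scores(probe, gallery):
    """Cosine similarity between one probe vector and each gallery row."""
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    return g @ p

def fuse_and_classify(probe_att, probe_lsa, gal_att, gal_lsa, w=0.5):
    """Decision-level fusion: score each branch separately, then
    combine the per-branch match scores and decide on the fused score.
    `w` weights the attention branch against the LSA branch (assumed)."""
    s_att = cosine_scores(probe_att, gal_att)   # attention-branch scores
    s_lsa = cosine_scores(probe_lsa, gal_lsa)   # LSA-branch scores
    fused = w * s_att + (1.0 - w) * s_lsa       # weighted-sum fusion
    return int(np.argmax(fused)), fused         # predicted identity, scores

# Toy usage: two gallery subjects with 2-D features per branch.
gal_att = np.array([[1.0, 0.0], [0.0, 1.0]])
gal_lsa = np.array([[1.0, 0.0], [0.0, 1.0]])
idx, fused = fuse_and_classify(np.array([0.1, 0.9]),
                               np.array([0.2, 0.8]),
                               gal_att, gal_lsa)
print(idx)  # → 1 (both branches favor the second gallery subject)
```

Because the fusion happens on scores rather than features, a failure in one branch (e.g. clothing occluding an attended region) can be compensated by the other, which is the fault-tolerance property the abstract refers to.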
Keywords
Gait recognition, latent semantic analysis, attention mechanism, attention-aware neural network, clothing-invariant, feature fusion.
Cite This Article
H. Ling, J. Wu, P. Li and J. Shen, "Attention-aware network with latent semantic analysis for clothing invariant gait recognition," Computers, Materials & Continua, vol. 60, no. 3, pp. 1041–1054, 2019.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.