Open Access Article
Deep Transfer Learning Approach for Robust Hand Detection
1 Faculty of Electronic Engineering, University of Nis, Nis, 18000, Serbia
2 Faculty of Mechanical Engineering, University of Nis, Nis, 18000, Serbia
* Corresponding Author: Stevica Cvetkovic. Email:
Intelligent Automation & Soft Computing 2023, 36(1), 967-979. https://doi.org/10.32604/iasc.2023.032526
Received 20 May 2022; Accepted 06 July 2022; Issue published 29 September 2022
Abstract
Human hand detection in uncontrolled environments is a challenging visual recognition task due to the numerous variations of hand poses and background image clutter. To achieve highly accurate results while providing real-time execution, we propose a deep transfer learning approach built on a state-of-the-art deep learning object detector. Our method, denoted YOLOHANDS, is built on top of the You Only Look Once (YOLO) deep learning architecture, modified to adapt to the single-class hand detection task. The model transfer is performed by modifying the higher convolutional layers, including the last fully connected layer, while initializing the lower, unmodified layers with generic pretrained weights. To address robustness, we introduce a comprehensive augmentation procedure over the training image dataset, specifically adapted to the hand detection problem. Experimental evaluation of the proposed method on a challenging public dataset demonstrates highly accurate results, comparable to state-of-the-art methods.
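As a concrete illustration of the transfer setup described in the abstract, the following is a minimal PyTorch sketch, not the authors' released code: the `backbone` module, its `head` attribute, the checkpoint path, and the `freeze_until` cutoff are hypothetical stand-ins for the concrete YOLO variant used in the paper.

```python
# Minimal sketch (assumption: a PyTorch detector whose final,
# class-dependent layer is exposed as `backbone.head`, a 1x1 Conv2d).
import torch
import torch.nn as nn

NUM_CLASSES = 1                                  # single class: "hand"
NUM_ANCHORS = 3                                  # YOLO-style anchors per cell
HEAD_CHANNELS = NUM_ANCHORS * (5 + NUM_CLASSES)  # (x, y, w, h, obj) + classes

def adapt_for_hand_detection(backbone: nn.Module,
                             pretrained_path: str,
                             freeze_until: int) -> nn.Module:
    """Adapt a generically pretrained detector to single-class hand detection."""
    # 1. Initialize the lower, unmodified layers with generic pretrained
    #    weights; strict=False skips the head, whose shape will change.
    state = torch.load(pretrained_path, map_location="cpu")
    backbone.load_state_dict(state, strict=False)

    # 2. Freeze the lower layers so only the higher ones are fine-tuned.
    for i, param in enumerate(backbone.parameters()):
        if i < freeze_until:
            param.requires_grad = False

    # 3. Replace the last, class-dependent layer for the one-class task.
    in_ch = backbone.head.in_channels            # `head` is hypothetical
    backbone.head = nn.Conv2d(in_ch, HEAD_CHANNELS, kernel_size=1)
    return backbone
```

The augmentation procedure can be sketched in the same hedged spirit; the pipeline below uses the Albumentations library, and the particular transforms and parameters are illustrative assumptions rather than the paper's exact recipe. The key point is `bbox_params`, which keeps the hand bounding boxes consistent with the geometric transforms.

```python
import albumentations as A

# Box-aware augmentation sketch for hand detection (transform choices
# and probabilities are assumptions, not the paper's exact procedure).
augment = A.Compose(
    [
        A.HorizontalFlip(p=0.5),              # mirrored hand poses
        A.ShiftScaleRotate(shift_limit=0.1, scale_limit=0.2,
                           rotate_limit=15, p=0.5),
        A.RandomBrightnessContrast(p=0.3),    # lighting variation
        A.HueSaturationValue(p=0.3),          # skin-tone / color variation
        A.MotionBlur(blur_limit=5, p=0.2),    # fast-moving hands
    ],
    # 'yolo' box format: normalized (x_center, y_center, width, height)
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

# Usage: out = augment(image=img, bboxes=boxes, class_labels=labels)
```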
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.