Algorithm of Helmet Wearing Detection Based on AT-YOLO Deep Model
1 College of Computer Science and Information Technology, Central South University of Forestry & Technology, Changsha, 410004, China
2 Department of Mathematics and Computer Science, Northeastern State University, Tahlequah, 74464, OK, USA
* Corresponding Author: Jiaohua Qin.
Computers, Materials & Continua 2021, 69(1), 159-174. https://doi.org/10.32604/cmc.2021.017480
Received 31 January 2021; Accepted 16 March 2021; Issue published 04 June 2021
Abstract
Existing safety helmet detection methods are mainly based on one-stage object detection algorithms, whose high detection speed meets real-time requirements, but they cannot accurately detect small objects or objects under occlusion. Therefore, we propose a helmet detection algorithm based on the attention mechanism (AT-YOLO). First, a channel attention module is added to the YOLOv3 backbone network to adaptively recalibrate channel-wise features and improve feature utilization, and a spatial attention module is added to the neck of the YOLOv3 network to capture the correlation between any two positions in the feature map, thereby enlarging the receptive field of the network. Second, we adopt the DIoU (Distance Intersection over Union) loss function for bounding box regression; it not only improves the measurement of bounding box regression loss but also adds a normalized distance penalty between the predicted and target boxes, which makes the network more accurate at detecting small objects and faster to converge. Finally, we explore training strategies for the network model that improve performance without increasing inference cost. Experiments show that the mAP of the proposed method reaches 96.5% and the detection speed reaches 27 fps. Compared with other existing methods, it achieves better detection accuracy and speed.
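For concreteness, the adaptive channel recalibration the abstract describes is commonly realized as a squeeze-and-excitation style block. The sketch below (PyTorch) is an illustrative assumption about the module's structure, not the paper's actual implementation; the `ChannelAttention` name and the `reduction` ratio are hypothetical.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (a plausible reading of
    the abstract's channel attention module; the exact architecture is an
    assumption, not taken from the paper)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                       # squeeze: global spatial average -> (N, C, 1, 1)
            nn.Conv2d(channels, channels // reduction, 1), # bottleneck projection
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1), # restore channel dimension
            nn.Sigmoid(),                                  # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Recalibrate: scale each channel of x by its learned weight
        return x * self.fc(x)
```

Inserted after a backbone stage, such a block lets the network emphasize informative channels and suppress less useful ones at negligible inference cost.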
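The DIoU regression loss the abstract references follows the standard formulation $L_{DIoU} = 1 - IoU + \rho^2(b, b^{gt})/c^2$, where $\rho$ is the Euclidean distance between the centers of the predicted and target boxes and $c$ is the diagonal length of the smallest box enclosing both. A minimal PyTorch sketch, assuming boxes in `(x1, y1, x2, y2)` format with shape `(N, 4)`; the function name and tensor layout are illustrative.

```python
import torch

def diou_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """DIoU loss for boxes given as (x1, y1, x2, y2), shape (N, 4)."""
    # Intersection area
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)

    # Union area and IoU
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # Squared distance between box centers (the normalized distance penalty DIoU adds to IoU)
    cx_p = (pred[:, 0] + pred[:, 2]) / 2
    cy_p = (pred[:, 1] + pred[:, 3]) / 2
    cx_t = (target[:, 0] + target[:, 2]) / 2
    cy_t = (target[:, 1] + target[:, 3]) / 2
    center_dist = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2

    # Squared diagonal of the smallest box enclosing both boxes
    ex1 = torch.min(pred[:, 0], target[:, 0])
    ey1 = torch.min(pred[:, 1], target[:, 1])
    ex2 = torch.max(pred[:, 2], target[:, 2])
    ey2 = torch.max(pred[:, 3], target[:, 3])
    diag = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2 + eps

    # L_DIoU = 1 - IoU + rho^2 / c^2
    return (1 - iou + center_dist / diag).mean()
```

Because the distance term stays informative even when boxes do not overlap, DIoU provides a useful gradient for small or poorly localized objects, which is why it converges faster than plain IoU loss.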
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.