Open Access
ARTICLE
Fire Detection Algorithm Based on an Improved Strategy of YOLOv5 and Flame Threshold Segmentation
School of Information Science and Technology, Hainan Normal University, Haikou, 571158, China
* Corresponding Authors: Shulei Wu. Email: ; Wang Yaoru. Email:
Computers, Materials & Continua 2023, 75(3), 5639-5657. https://doi.org/10.32604/cmc.2023.037829
Received 17 November 2022; Accepted 10 March 2023; Issue published 29 April 2023
Abstract
Because fire grows and spreads rapidly, it poses a major threat to human life and property, and timely fire detection technology can reduce disaster losses. Traditional threshold segmentation methods are unstable, while deep-learning flame recognition methods require large amounts of labeled data for training. To address these problems, this paper proposes a new method that combines the You Only Look Once version 5 (YOLOv5) network model with an improved flame segmentation algorithm. Building on the traditional color-space threshold segmentation method, the fixed segmentation threshold is replaced by a proportion threshold so that the characteristic information of the flame is retained as fully as possible. In the YOLOv5 network, the training module is configured by combining the ideas of bootstrapping and cross-validation, the data distribution used for training is adjusted accordingly, and the segmented feature information is added to the dataset. Unlike approaches that train on large-scale datasets, the proposed method trains the model on a small dataset yet achieves better detection results, with a detection accuracy of 0.96 on the validation set. Experimental results show that the proposed method detects flame features faster and more accurately than the original method.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.