Open Access
ARTICLE
Quick and Accurate Counting of Rapeseed Seedling with Improved YOLOv5s and Deep-Sort Method
College of Engineering, Huazhong Agricultural University, Wuhan, 430070, China
* Corresponding Author: Yang Yang
(This article belongs to the Special Issue: Development of New Sensing Technology in Sustainable Farming and Smart Environmental Monitoring)
Phyton-International Journal of Experimental Botany 2023, 92(9), 2611-2632. https://doi.org/10.32604/phyton.2023.029457
Received 20 February 2023; Accepted 12 May 2023; Issue published 28 July 2023
Abstract
Statistics on the number of rapeseed seedlings are very important for breeders and growers in seed quality testing, field crop management, and yield estimation, but counting seedlings with traditional methods is inefficient and cumbersome. In this study, a method was proposed for efficiently detecting and counting rapeseed seedlings in video, using an improved You Only Look Once version 5 small (YOLOv5s) model to detect the seedlings and Deep-SORT to track them. A coordinate attention (CA) mechanism was added to the backbone of the improved YOLOv5s, which made the model more effective at identifying occluded, densely planted, and small rapeseed seedlings. In addition, GSConv modules replaced the standard convolutions in the neck, which reduced the number of model parameters and made the model better suited to deployment on mobile devices. On the test set, the accuracy and recall of the improved YOLOv5s increased by 1.9% and 3.7% over the 96.2% and 93.7% of the original YOLOv5s, respectively. The experimental results showed that the average error in counting seedlings from unmanned aerial vehicle (UAV) video of rapeseed seedlings with the improved YOLOv5s combined with Deep-SORT was 4.3%. The presented approach enables rapid counting of rapeseed seedlings in the field based on UAV remote sensing and provides a reference for variety selection and precise management of rapeseed.
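As a rough illustration (not the authors' released code), the coordinate attention block added to the YOLOv5s backbone might be implemented in PyTorch as follows; the reduction ratio and Hardswish activation are assumptions taken from the original CA design (Hou et al., 2021), since the abstract does not give these details.

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Coordinate attention: pools the feature map along the height and
    width axes separately, then uses the two resulting 1-D descriptors
    to reweight the input with position-aware attention maps."""
    def __init__(self, channels, reduction=32):  # reduction=32 is an assumption
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))   # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))   # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                         # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)     # (B, C, W, 1)
        # Shared 1x1 conv over the concatenated height/width descriptors
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                       # height attention
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))   # width attention
        return x * a_h * a_w
```

Counting with Deep-SORT then reduces to tallying the distinct track IDs assigned over the whole UAV video, so each seedling is counted once even though it appears in many frames. A minimal sketch, in which `track_stream` and `track_id` are hypothetical names for the per-frame output of a generic Deep-SORT implementation:

```python
def count_seedlings(track_stream):
    """Count seedlings as the number of distinct confirmed track IDs
    produced by the tracker across all frames of the video."""
    seen_ids = set()
    for frame_tracks in track_stream:   # one list of confirmed tracks per frame
        seen_ids.update(t.track_id for t in frame_tracks)
    return len(seen_ids)
```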
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.