Open Access
ARTICLE
An Efficient Crossing-Line Crowd Counting Algorithm with Two-Stage Detection
School of Computer, Jiaying University, Meizhou, 514015, China.
School of Design, Jiangnan University, Wuxi, 214122, China.
School of Computer Science, Assumption University, Bangkok, 351078, Thailand.
* Corresponding Author: Zhenqiu Xiao. Email: .
Computers, Materials & Continua 2019, 60(3), 1141-1154. https://doi.org/10.32604/cmc.2019.05638
Abstract
Crowd counting is a challenging task in crowded scenes due to heavy occlusions, appearance variations and perspective distortions. Current crowd counting methods typically operate on overlapping image patches and sum over the patches to obtain the final count. In this paper we describe a real-time pedestrian counting framework based on a two-stage human detection algorithm. Existing work with overhead cameras is mainly based on visual tracking, and its robustness is rather limited; other work that focuses on improving performance is too complicated to be practical. By adopting a line sampling process, a temporal slice image can be obtained for pedestrian counting without the need for visual tracking. Only ten low-level features are extracted from the input image to form a feature vector, so our algorithm is more efficient and accurate than existing methods. Pedestrians in the temporal slice image are then located by the two-stage detection algorithm, which is largely based on a support vector machine and affinity propagation clustering. Moreover, a novel algorithm is proposed to determine the moving directions of pedestrians by comparing their centers in two temporal slice images. Extensive experiments show that our system achieves satisfactory performance in terms of both robustness and efficiency.
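The abstract describes sampling a fixed counting line from every video frame to form a temporal slice image, then inferring crossing direction by comparing pedestrian centers in two slices. The sketch below is a minimal illustration of that idea, not the authors' implementation: the function names, the horizontal counting line at row `line_y`, and the toy direction rule are all assumptions introduced here for clarity.

```python
# Minimal sketch (assumptions, not the paper's code): build a temporal slice
# image by stacking the pixels on a counting line from each frame, and infer
# a crossing direction from the time coordinates of a pedestrian's centers
# in two slices sampled from two nearby lines.
import numpy as np

def build_temporal_slice(frames, line_y):
    """Stack the counting-line pixels of every frame.

    frames : iterable of HxWx3 uint8 video frames
    line_y : row index of the virtual counting line (assumed horizontal)
    Returns a TxWx3 temporal slice image, one row per frame.
    """
    rows = [frame[line_y, :, :] for frame in frames]
    return np.stack(rows, axis=0)

def crossing_direction(center_a, center_b):
    """Toy direction rule: each center is (t, x) in a temporal slice.
    If a pedestrian appears earlier (smaller t) in the slice from line A
    than in the slice from line B, they moved from line A towards line B."""
    return "A_to_B" if center_a[0] < center_b[0] else "B_to_A"

if __name__ == "__main__":
    # Synthetic 100-frame clip of 320x240 noise, counting line at row 120.
    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 255, (240, 320, 3), dtype=np.uint8)
              for _ in range(100)]
    slice_img = build_temporal_slice(frames, line_y=120)
    print(slice_img.shape)                       # (100, 320, 3)
    print(crossing_direction((12, 80), (18, 82)))  # A_to_B
```

In the paper's pipeline the detection stage (SVM plus affinity propagation clustering) would operate on such slice images; the sketch only shows how the slices themselves can be formed without any frame-to-frame tracking.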
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.