Open Access
ARTICLE
Robust Counting in Overcrowded Scenes Using Batch-Free Normalized Deep ConvNet
1 Institute of Computer Sciences and Information Technology (ICS/IT), The University of Agriculture, Peshawar, 25130, Khyber Pakhtunkhwa, Pakistan
2 Department of Convergence Engineering for Intelligent Drone, Sejong University, Seoul, 05006, Korea
3 Collaborative Robotics and Intelligent Systems (CoRIS) Institute, Oregon State University, OR, USA
4 Department of Software, Sejong University, Seoul, 05006, Korea
* Corresponding Authors: Amin Ullah. Email: ; Mi Young Lee. Email:
Computer Systems Science and Engineering 2023, 46(3), 2741-2754. https://doi.org/10.32604/csse.2023.037706
Received 14 November 2022; Accepted 09 February 2023; Issue published 03 April 2023
Abstract
The analysis of overcrowded areas is essential for flow monitoring, assembly control, and security. The primary goal of crowd counting is to estimate the number of people in a given region, which requires real-time analysis of congested scenes so that a prompt response is possible. Crowd behavior is unpredictable, and the available benchmark datasets exhibit large variation, which limits the performance of trained models on unseen test data. In this paper, we propose an end-to-end deep neural network that takes an input image and generates a density map of the crowd scene. The proposed model consists of encoder and decoder networks built from batch-free normalization layers known as evolving normalization (EvoNorm). Because EvoNorm does not rely on batch statistics computed from the training samples, the network generalizes better to unseen data. The decoder network uses dilated 2D convolutional layers, which provide a large receptive field with fewer parameters; this enables real-time processing and mitigates the density drift problem. Five benchmark datasets are used in this study to evaluate the proposed model, and the results show that it outperforms conventional models.
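To make the architecture described above concrete, the following is a minimal, hypothetical PyTorch sketch of an encoder-decoder counting network with batch-free EvoNorm-S0 layers and dilated convolutions in the decoder. The layer widths, depths, group count, and dilation rates are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class EvoNormS0(nn.Module):
    """EvoNorm-S0 (sample-based): normalizes with per-sample group statistics,
    so no running batch statistics are needed at train or test time."""

    def __init__(self, channels, groups=8, eps=1e-5):
        super().__init__()
        self.groups = groups
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def forward(self, x):
        n, c, h, w = x.shape
        # Group standard deviation computed per sample (batch-free).
        var = x.view(n, self.groups, -1).var(dim=-1, unbiased=False)
        std = (var + self.eps).sqrt().view(n, self.groups, 1, 1, 1)
        std = std.expand(n, self.groups, c // self.groups, h, w).reshape(n, c, h, w)
        return x * torch.sigmoid(self.v * x) / std * self.gamma + self.beta


def conv_block(in_ch, out_ch, dilation=1):
    """3x3 convolution followed by a batch-free EvoNorm-S0 layer."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=dilation, dilation=dilation),
        EvoNormS0(out_ch),
    )


class CrowdCounter(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: standard convolutions with stride-2 downsampling.
        self.encoder = nn.Sequential(
            conv_block(3, 64), nn.MaxPool2d(2),
            conv_block(64, 128), nn.MaxPool2d(2),
            conv_block(128, 256), nn.MaxPool2d(2),
        )
        # Decoder: dilated convolutions enlarge the receptive field
        # without extra parameters or further downsampling.
        self.decoder = nn.Sequential(
            conv_block(256, 256, dilation=2),
            conv_block(256, 128, dilation=2),
            conv_block(128, 64, dilation=2),
            nn.Conv2d(64, 1, 1),  # 1-channel density map
        )

    def forward(self, x):
        density = self.decoder(self.encoder(x))
        # The predicted count is the integral (sum) of the density map.
        return density, density.sum(dim=(1, 2, 3))


if __name__ == "__main__":
    model = CrowdCounter()
    img = torch.randn(2, 3, 384, 384)          # dummy image batch
    density, count = model(img)
    print(density.shape, count.shape)          # (2, 1, 48, 48), (2,)
```

Because EvoNorm-S0 draws its statistics from each sample's own channel groups rather than from the mini-batch, the same forward pass applies at training and inference time, which is the property the abstract attributes to improved generalization on unseen data.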
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.