Open Access
ARTICLE
Non-contact Real-time Monitoring of Driver’s Physiological Parameters under Ambient Light Condition
1 Beijing Key Laboratory of Urban Road Traffic Intelligent Control Technology, North China University of Technology, Beijing, 100144, China
2 College of Sciences, North China University of Technology, Beijing, 100144, China
3 Department of Mathematical Sciences, Middle Tennessee State University, Murfreesboro, TN 37132, USA
* Corresponding Author: Jiancheng Zou. Email:
Intelligent Automation & Soft Computing 2021, 28(3), 811-822. https://doi.org/10.32604/iasc.2021.016516
Received 04 January 2021; Accepted 06 March 2021; Issue published 20 April 2021
Abstract
Real-time and effective monitoring of a driver’s physiological parameters and psychological state can provide early warnings and help avoid traffic accidents. In this paper, we propose a non-contact real-time monitoring algorithm for drivers’ physiological parameters under ambient light conditions. First, video sequences of the driver’s head are captured by an ordinary USB camera, and the AdaBoost algorithm is used to locate the driver’s facial region. Second, a facial expression recognition algorithm based on an improved convolutional neural network (CNN) is proposed to recognize the driver’s facial expression. The forehead region is then selected as the region of interest (ROI) and split into its three RGB channels, and independent component analysis (ICA) is used to separate these channels into three independent components. The most significant component is selected to calculate the driver’s heart rate and respiratory rate. Comparison of the experimental results with readings from a finger-clip device shows that the proposed algorithm can monitor a driver’s physiological parameters in real time in a non-contact way that does not interfere with normal driving. The facial expression recognition results help verify the physiological monitoring results and thus allow a more accurate evaluation of the driver’s physical condition.
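For illustration, the following is a minimal sketch of the non-contact vital-sign pipeline outlined in the abstract, assuming OpenCV, NumPy, and scikit-learn. The use of OpenCV's Haar cascade as the AdaBoost-based face detector, the forehead ROI proportions, the 30 fps frame rate, the frequency bands, and the helper function names are illustrative assumptions, not the authors' exact implementation; the CNN-based expression recognition step is omitted here.

```python
# Sketch: forehead ROI -> RGB channel means -> ICA -> spectral peak -> HR/RR.
# All thresholds and ROI proportions below are assumptions for illustration.
import cv2
import numpy as np
from sklearn.decomposition import FastICA

# Haar cascade face detector (AdaBoost-based), shipped with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def forehead_rgb_means(frame):
    """Detect the largest face, take an upper-central forehead strip as the
    ROI, and return the mean R, G, B values of that ROI (or None)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    roi = frame[y:y + int(0.25 * h), x + int(0.25 * w):x + int(0.75 * w)]
    b, g, r = cv2.mean(roi)[:3]            # OpenCV frames are BGR
    return r, g, b

def estimate_rates(rgb_trace, fps=30.0):
    """Separate the per-frame R/G/B means with ICA, pick the component with
    the strongest peak in the cardiac band, and read heart rate (bpm) and
    respiratory rate (breaths/min) from its spectrum."""
    signals = np.asarray(rgb_trace, dtype=float)
    signals -= signals.mean(axis=0)                    # remove per-channel mean
    sources = FastICA(n_components=3, random_state=0).fit_transform(signals)
    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    spectra = np.abs(np.fft.rfft(sources, axis=0))

    hr_band = (freqs >= 0.7) & (freqs <= 3.0)          # ~42-180 bpm
    rr_band = (freqs >= 0.15) & (freqs <= 0.5)         # ~9-30 breaths/min
    best = np.argmax(spectra[hr_band].max(axis=0))     # most significant component
    heart_rate = 60.0 * freqs[hr_band][np.argmax(spectra[hr_band, best])]
    resp_rate = 60.0 * freqs[rr_band][np.argmax(spectra[rr_band, best])]
    return heart_rate, resp_rate
```

In use, forehead_rgb_means would be called once per frame over a window of a few hundred frames, and the accumulated trace passed to estimate_rates; the window then slides forward for real-time updates.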
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.