
JACIII Vol.29 No.2, pp. 306-315, 2025
doi: 10.20965/jaciii.2025.p0306

Research Paper:

Driver Drowsiness Detection Based on Facial Video Non-Contact Heart Rate Measurement

Fusheng Ding, Yanbin Qin, Lanxiang Zhang, and Hongming Lyu

School of Automotive Engineering, Yancheng Institute of Technology
No.1 Hope Avenue Middle Road, Tinghu District, Yancheng, Jiangsu 224051, China

Corresponding author

Received: July 5, 2024
Accepted: December 23, 2024
Published: March 20, 2025
Keywords: driver drowsiness, contactless heart rate estimation, color space, heart rate signals, drowsiness assessment model
Abstract

Drowsy driving is a major contributor to traffic accidents, making real-time monitoring of driver drowsiness essential for effective prevention. This paper presents a novel method for detecting driver drowsiness through facial video analysis and non-contact heart rate measurement. To address the challenges posed by varying lighting conditions, the algorithm integrates RGB (red, green, and blue) and multi-scale reinforced image color space techniques. This combination enhances the robustness of heart rate signal extraction by generating spatio-temporal maps that minimize the impact of low light. A convolutional neural network then maps these spatio-temporal features to their corresponding heart rate values. To provide a comprehensive assessment of drowsiness, a differential-thresholding method extracts heart rate variability information, and a dynamic drowsiness assessment model is built on this data using long short-term memory networks. Evaluation on the corresponding dataset demonstrates an accuracy of 95.1%, underscoring the method's robustness. The approach can therefore improve the reliability of drowsiness detection systems and help reduce traffic accidents caused by driver fatigue.
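The spatio-temporal maps mentioned in the abstract can be illustrated with a minimal sketch. Assuming the facial region of interest is divided into a grid of blocks whose per-frame, per-channel mean intensities are stacked over time and then normalized, one such map might be built as follows (the function name, grid size, and z-score normalization are illustrative assumptions, not the authors' exact procedure):

```python
import numpy as np

def spatio_temporal_map(frames, grid=(5, 5)):
    """Build a spatio-temporal map from a stack of facial ROI frames.

    frames: array of shape (T, H, W, 3) holding RGB facial crops.
    Each frame is split into a grid of blocks; the mean color of each
    block over time forms one row of the map (blocks x T x channels).
    """
    T, H, W, _ = frames.shape
    gh, gw = grid
    bh, bw = H // gh, W // gw
    st_map = np.empty((gh * gw, T, 3), dtype=np.float32)
    for i in range(gh):
        for j in range(gw):
            block = frames[:, i * bh:(i + 1) * bh, j * bw:(j + 1) * bw, :]
            st_map[i * gw + j] = block.mean(axis=(1, 2))
    # normalize each block's temporal trace to zero mean, unit variance,
    # which suppresses constant illumination offsets between blocks
    mu = st_map.mean(axis=1, keepdims=True)
    sd = st_map.std(axis=1, keepdims=True) + 1e-8
    return (st_map - mu) / sd
```

A map of this shape (blocks x frames x channels) is the kind of image-like input a convolutional network can then regress to a heart rate value.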

Cite this article as:
F. Ding, Y. Qin, L. Zhang, and H. Lyu, “Driver Drowsiness Detection Based on Facial Video Non-Contact Heart Rate Measurement,” J. Adv. Comput. Intell. Intell. Inform., Vol.29 No.2, pp. 306-315, 2025.
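The differential-thresholding step for extracting heart-rate-variability information can likewise be sketched. This is an illustrative interpretation, not the paper's exact method: a pulse peak is taken where the first difference of the signal crosses from positive to negative while the amplitude exceeds a threshold, and standard HRV statistics (SDNN, RMSSD) are computed from the resulting inter-beat intervals; the threshold rule and the feature choice are assumptions.

```python
import numpy as np

def peaks_by_differential_threshold(signal, k=0.6):
    """Locate pulse peaks with a simple differential-threshold rule:
    a peak is a positive-to-negative sign change of the first
    difference whose amplitude exceeds k times the signal's std."""
    d = np.diff(signal)
    thresh = k * signal.std()
    peaks = []
    for i in range(1, len(d)):
        if d[i - 1] > 0 and d[i] <= 0 and signal[i] > thresh:
            peaks.append(i)
    return np.array(peaks)

def hrv_features(peaks, fs):
    """Inter-beat intervals (seconds) and two common HRV statistics."""
    ibi = np.diff(peaks) / fs          # inter-beat intervals
    sdnn = ibi.std()                   # overall variability
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # beat-to-beat variability
    return ibi, sdnn, rmssd
```

A sliding window of such HRV features over time is the kind of sequence a long short-term memory network can consume to produce a dynamic drowsiness assessment.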
References
  [1] R. O. Phillips, “A review of definitions of fatigue–And a step towards a whole definition,” Transportation Research Part F: Traffic Psychology and Behavior, Vol.29, pp. 48-56, 2015. https://doi.org/10.1016/j.trf.2015.01.003
  [2] “Frequency of fatigue-related crashes.” https://road-safety.transport.ec.europa.eu/european-road-safety-observatory/statistics-and-analysis-archive/fatigue/frequency-fatigue-related-crashes_en [Accessed April 1, 2024]
  [3] G. Zhao, Y. He, H. Yang, and Y. Tao, “Research on fatigue detection based on visual features,” IET Image Processing, Vol.16, Issue 4, pp. 1044-1053, 2022. https://doi.org/10.1049/ipr2.12207
  [4] Z. Sun, Y. Miao, J. Y. Jeon, Y. Kong, and G. Park, “Facial feature fusion convolutional neural network for driver fatigue detection,” Engineering Applications of Artificial Intelligence, Vol.126, Part C, Article No.106981, 2023. https://doi.org/10.1016/j.engappai.2023.106981
  [5] Z. Cui, H. Sun, R. Yin, L. Gao, H. Sun, and R. Jia, “Real-time detection method of driver fatigue state based on deep learning of face video,” Multimedia Tools and Applications, Vol.80, pp. 25495-25515, 2021. https://doi.org/10.1007/s11042-021-10930-z
  [6] R. Zheng, Z. Wang, Y. He, and J. Zhang, “EEG-based brain functional connectivity representation using amplitude locking value for fatigue-driving recognition,” Cognitive Neurodynamics, Vol.16, No.2, pp. 325-336, 2022. https://doi.org/10.1007/s11571-021-09714-w
  [7] V. Balasubramanian and R. Bhardwaj, “Grip and electrophysiological sensor-based estimation of muscle fatigue while holding steering wheel in different positions,” IEEE Sensors J., Vol.19, Issue 5, pp. 1951-1960, 2018. https://doi.org/10.1109/JSEN.2018.2863023
  [8] Z. Peng, J. Rong, Y. Wu, C. Zhou, Y. Yuan, and X. Shao, “Exploring the different patterns for generation process of driving fatigue based on individual driving behavior parameters,” Transportation Research Record, Vol.2675, No.8, pp. 408-421, 2021. https://doi.org/10.1177/0361198121998351
  [9] P. H. Charlton, P. A. Kyriacou, J. Mant, V. Marozas, P. Chowienczyk, and J. Alastruey, “Wearable photoplethysmography for cardiovascular monitoring,” Proc. of the IEEE, Vol.110, Issue 3, pp. 355-381, 2022. https://doi.org/10.1109/JPROC.2022.3149785
  [10] L. C. Lampier, C. T. Valadão, L. A. Silva, D. Delisle-Rodríguez, E. M. de Oliveira Caldeira, and T. F. Bastos-Filho, “A deep learning approach to estimate pulse rate by remote photoplethysmography,” Physiological Measurement, Vol.43, No.7, Article No.075012, 2022. https://doi.org/10.1088/1361-6579/ac7b0b
  [11] Z. Yang, H. Wang, and F. Lu, “Assessment of deep learning-based heart rate estimation using remote photoplethysmography under different illuminations,” IEEE Trans. on Human-Machine Systems, Vol.52, Issue 6, pp. 1236-1246, 2022. https://doi.org/10.1109/THMS.2022.3207755
  [12] K. Gupta, R. Sinhal, and S. S. Badhiye, “Remote photoplethysmography-based human vital sign prediction using cyclical algorithm,” J. of Biophotonics, Vol.17, Issue 1, Article No.e202300286, 2023. https://doi.org/10.1002/jbio.202300286
  [13] S.-Q. Liu and P. C. Yuen, “Robust remote photoplethysmography estimation with environmental noise disentanglement,” IEEE Trans. on Image Processing, Vol.33, pp. 27-41, 2023. https://doi.org/10.1109/TIP.2023.3330108
  [14] W. Verkruysse, L. O. Svaasand, and J. S. Nelson, “Remote plethysmographic imaging using ambient light,” Optics Express, Vol.16, Issue 26, pp. 21434-21445, 2008. https://doi.org/10.1364/OE.16.021434
  [15] R. Song, S. Zhang, J. Cheng, C. Li, and X. Chen, “New insights on super-high resolution for video-based heart rate estimation with a semi-blind source separation method,” Computers in Biology and Medicine, Vol.116, Article No.103535, 2020. https://doi.org/10.1016/j.compbiomed.2019.103535
  [16] M. Z. Poh, D. J. McDuff, and R. W. Picard, “Advancements in noncontact, multiparameter physiological measurements using a webcam,” IEEE Trans. on Biomedical Engineering, Vol.58, Issue 1, pp. 7-11, 2010. https://doi.org/10.1109/TBME.2010.2086456
  [17] G. de Haan and V. Jeanne, “Robust pulse rate from chrominance-based rPPG,” IEEE Trans. on Biomedical Engineering, Vol.60, Issue 10, pp. 2878-2886, 2013. https://doi.org/10.1109/TBME.2013.2266196
  [18] W. Wang, A. C. den Brinker, S. Stuijk, and G. de Haan, “Algorithmic principles of remote PPG,” IEEE Trans. on Biomedical Engineering, Vol.64, Issue 7, pp. 1479-1491, 2016. https://doi.org/10.1109/TBME.2016.2609282
  [19] S. Jin, B. Yu, M. Jing, Y. Zhou, J. Liang, and R. Ji, “DarkVisionNet: Low-light imaging via RGB-NIR fusion with deep inconsistency prior,” Proc. of the AAAI Conf. on Artificial Intelligence, Vol.36, No.1, pp. 1104-1112, 2022. https://doi.org/10.1609/aaai.v36i1.19995
  [20] S. B. Park, G. Kim, H. J. Baek, J. H. Han, and J. H. Kim, “Remote pulse rate measurement from near-infrared videos,” IEEE Signal Processing Letters, Vol.25, Issue 8, pp. 1271-1275, 2018. https://doi.org/10.1109/LSP.2018.2842639
  [21] X. Niu, H. Han, S. Shan, and X. Chen, “SynRhythm: Learning a deep heart rate estimator from general to specific,” 2018 24th Int. Conf. on Pattern Recognition (ICPR), pp. 3580-3585, 2018. https://doi.org/10.1109/ICPR.2018.8546321
  [22] Y. Qiu, Y. Liu, J. Arteaga-Falconi, H. Dong, and A. El Saddik, “EVM-CNN: Real-time contactless heart rate estimation from facial video,” IEEE Trans. on Multimedia, Vol.21, Issue 7, pp. 1778-1787, 2019. https://doi.org/10.1109/TMM.2018.2883866
  [23] Z. Yu, X. Li, and G. Zhao, “Remote photoplethysmograph signal measurement from facial videos using spatio-temporal networks,” arXiv preprint, arXiv:1905.02419, 2019. https://doi.org/10.48550/arXiv.1905.02419
  [24] W. Chen and D. McDuff, “DeepPhys: Video-based physiological measurement using convolutional attention networks,” Proc. of the European Conf. on Computer Vision (ECCV 2018), Vol.11206, pp. 356-373, 2018. https://doi.org/10.1007/978-3-030-01216-8_22
  [25] J. Allen, H. Liu, S. Iqbal, D. Zheng, and G. Stansby, “Deep learning-based photoplethysmography classification for peripheral arterial disease detection: A proof-of-concept study,” Physiological Measurement, Vol.42, No.5, Article No.054002, 2021. https://doi.org/10.1088/1361-6579/abf9f3
  [26] M. Jabberi, A. Wali, B. B. Chaudhuri, and A. M. Alimi, “68 landmarks are efficient for 3D face alignment: What about more? 3D face alignment method applied to face recognition,” Multimed. Tools Appl., Vol.82, pp. 41435-41469, 2023. https://doi.org/10.1007/s11042-023-14770-x
  [27] B. Li, P. Zhang, J. Peng, and H. Fu, “Non-contact PPG signal and heart rate estimation with multi-hierarchical convolutional network,” Pattern Recognition, Vol.139, Article No.109421, 2023. https://doi.org/10.1016/j.patcog.2023.109421
  [28] M. Yuan, X. Shi, N. Wang, Y. Wang, and X. Wei, “Improving RGB-infrared object detection with cascade alignment-guided transformer,” Information Fusion, Vol.105, Article No.102246, 2024. https://doi.org/10.1016/j.inffus.2024.102246
  [29] M. Venkateswarlu and V. R. R. Ch, “DrowsyDetectNet: Driver drowsiness detection using lightweight CNN with limited training data,” IEEE Access, Vol.12, pp. 110476-110491, 2024. https://doi.org/10.1109/ACCESS.2024.3440585
  [30] W. Mellouk and W. Handouzi, “CNN-LSTM for automatic emotion recognition using contactless photoplethysmographic signals,” Biomedical Signal Processing and Control, Vol.85, Article No.104907, 2023. https://doi.org/10.1016/j.bspc.2023.104907
  [31] P. Li, Y. Huang, and K. Yao, “Multi-algorithm fusion of RGB and HSV color spaces for image enhancement,” 37th Chinese Control Conf. (CCC), pp. 9584-9589, 2018. https://doi.org/10.23919/ChiCC.2018.8483674
  [32] W. Halim and A. E. Haryono, “Analysis of drowsiness with Karolinska Sleepiness Scale and heart rate while driving with three stage road difficulty using driving simulator,” OPSI, Vol.15, No.1, pp. 77-84, 2022. https://doi.org/10.31315/opsi.v15i1.6757
  [33] M. Patel, S. K. L. Lal, D. Kavanagh, and P. Rossiter, “Applying neural network analysis on heart rate variability data to assess driver fatigue,” Expert Systems with Applications, Vol.38, Issue 6, pp. 7235-7242, 2011. https://doi.org/10.1016/j.eswa.2010.12.028
  [34] S. Chen, K. Xu, X. Zheng, J. Li, B. Fan, X. Yao, and Z. Li, “Linear and nonlinear analyses of normal and fatigue heart rate variability signals for miners in high-altitude and cold areas,” Computer Methods and Programs in Biomedicine, Vol.196, Article No.105667, 2020. https://doi.org/10.1016/j.cmpb.2020.105667
  [35] C. Ahlström and A. Anund, “Development of sleepiness in professional truck drivers: Real-road testing for driver drowsiness and attention warning (DDAW) system evaluation,” J. of Sleep Research, Article No.e14259, 2024. https://doi.org/10.1111/jsr.14259
  [36] X. Li, I. Alikhani, J. Shi, T. Seppanen, J. Junttila, K. Majamaa-Voltti, M. Tulppo, and G. Zhao, “The OBF database: A large face video database for remote physiological signal measurement and atrial fibrillation detection,” 2018 13th IEEE Int. Conf. on Automatic Face & Gesture Recognition (FG 2018), pp. 242-249, 2018. https://doi.org/10.1109/FG.2018.00043
  [37] E. Yang and O. Yi, “Enhancing road safety: Deep learning-based intelligent driver drowsiness detection for advanced driver-assistance systems,” Electronics, Vol.13, Issue 4, Article No.708, 2024. https://doi.org/10.3390/electronics13040708
