Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor
- PMID: 29401681
- PMCID: PMC5855991
- DOI: 10.3390/s18020456
Abstract
A paradigm shift is required to prevent the increasing number of automobile accident deaths, most of which are caused by driver inattention. Knowledge of the gaze region provides valuable information about a driver's point of attention, so an accurate and inexpensive in-car gaze classification system can improve driving safety. However, monitoring driving behavior and conditions in real time presents several challenges: drowsiness on long drives, extreme lighting variations, reflections from glasses, and occlusions. Past studies on in-car gaze detection have relied chiefly on head movements, so their error increases when drivers shift their gaze by moving only their eyes. To address this, pupil center corneal reflection (PCCR)-based methods have been considered. In a car environment, however, accurately detecting the pupil center and the corneal reflection center is difficult because of varying ambient light, reflections on glasses surfaces, and motion and optical blur in the captured eye images. In addition, existing PCCR-based methods require initial user calibration, which is impractical to perform in a car. We therefore propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor that accounts for both head and eye movement and requires no initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia gaze dataset (CAVE-DB), and it demonstrates greater accuracy than previous gaze classification methods.
Keywords: NIR camera sensor; deep learning; driver attention; eye gaze tracking; user calibration.
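The PCCR technique discussed in the abstract infers gaze from the vector between the pupil center and the corneal reflection (glint) produced by an NIR illuminator, typically mapped to a gaze point via a per-user polynomial calibration. The following is a minimal illustrative sketch of that idea, not the paper's implementation; the function names and the second-order polynomial mapping are assumptions chosen for clarity.

```python
import numpy as np

def pccr_gaze_vector(pupil_center, glint_center):
    # PCCR: gaze direction is inferred from the 2-D vector between the
    # detected pupil center and the corneal reflection (glint) center.
    return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

def map_to_screen(v, coeffs_x, coeffs_y):
    # Second-order polynomial mapping from the PCCR vector to a gaze point.
    # The coefficients normally come from an initial per-user calibration
    # session - exactly the step the paper's deep-learning method removes.
    x, y = v
    feats = np.array([1.0, x, y, x * y, x * x, y * y])
    return feats @ coeffs_x, feats @ coeffs_y
```

A calibration-free deep-learning approach, as proposed in the paper, replaces this hand-crafted vector-plus-polynomial pipeline with a network that classifies the gaze region directly from NIR eye and face images.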
Conflict of interest statement
The authors declare no conflict of interest.