Sensors (Basel). 2018 Feb 3;18(2):456. doi: 10.3390/s18020456.

Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor

Rizwan Ali Naqvi et al. Sensors (Basel). 2018.

Abstract

A paradigm shift is required to prevent the increasing number of automobile accident deaths, which are mostly due to the inattentive behavior of drivers. Knowledge of the gaze region provides valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents several challenges: dizziness due to long drives, extreme lighting variations, reflections on glasses, and occlusions. Past studies on gaze detection in cars have been based chiefly on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error in accurately detecting the pupil center and the corneal reflection center increases in a car environment owing to environmental light changes, reflections on glasses surfaces, and motion and optical blurring of the captured eye images. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address these issues, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor that considers driver head and eye movement and does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than previous gaze classification methods.
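To make the classification step more concrete, the minimal PyTorch sketch below illustrates how a small CNN could combine a NIR eye crop with a PCCR vector and predict one of the 17 gaze zones (cf. Figures 3 and 8). The layer sizes, the 4-D PCCR encoding, and all names are illustrative assumptions for this sketch only; they do not reproduce the network or parameters reported in the paper.

```python
import torch
import torch.nn as nn

NUM_GAZE_ZONES = 17  # the 17 gaze zones of Figure 3


class GazeZoneCNN(nn.Module):
    """Toy CNN mapping a single-channel NIR eye crop, plus an optional
    PCCR vector, to logits over 17 gaze zones.  Layer sizes and the 4-D
    PCCR encoding are illustrative only, not the paper's network."""

    def __init__(self, pccr_dim: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Image features are concatenated with the PCCR vector (scheme 2);
        # pccr_dim=0 with an empty (B, 0) vector mimics the image-only scheme 1.
        self.classifier = nn.Linear(32 * 4 * 4 + pccr_dim, NUM_GAZE_ZONES)

    def forward(self, eye_img: torch.Tensor, pccr_vec: torch.Tensor) -> torch.Tensor:
        feats = self.features(eye_img).flatten(1)
        return self.classifier(torch.cat([feats, pccr_vec], dim=1))


# Example forward pass on a synthetic 64x64 NIR crop and a 4-D PCCR vector
# (e.g., pupil-center-minus-CR-center offsets for both eyes); values are dummies.
model = GazeZoneCNN(pccr_dim=4)
dummy_eye = torch.rand(1, 1, 64, 64)
dummy_pccr = torch.rand(1, 4)
logits = model(dummy_eye, dummy_pccr)
predicted_zone = logits.argmax(dim=1).item() + 1  # zones are numbered 1-17
```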

Keywords: NIR camera sensor; deep learning; driver attention; eye gaze tracking; user calibration.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1. Flowchart of the proposed system.
Figure 2. Experimental environment and the proposed gaze detection system in a vehicle environment.
Figure 3. The 17 gaze zones in our experiments. (a) The 17 gaze zones; (b) captured images when the driver looked at each gaze zone.
Figure 4. Example of detected facial landmarks.
Figure 5. Example of detected pupil and CR regions.
Figure 6. The PCCR vectors generated from (a) left eye and (b) right eye images.
Figure 7. CNN-based driver’s gaze classification procedure without the PCCR vector (scheme 1).
Figure 8. CNN-based driver’s gaze classification procedure with the PCCR vector (scheme 2).
Figure 9. CNN architecture used for finding the required gaze features.
Figure 10. Example images of the face (left), left eye (middle), and right eye (right) while looking at different regions of Figure 3. Cases of looking at (a) region 1; (b) region 2; (c) region 5; (d) region 6; (e) region 7; and (f) region 8.
Figure 11. Augmented images obtained from the original ROI image.
Figure 12. Curves of training loss and training accuracy according to the number of epochs for the sub-databases of (a) face; (b) left eye; and (c) right eye. In (a–c), the left and right figures show the graphs from the training of the first- and second-fold cross-validations, respectively.
Figure 13. Correctly detected gaze zones with our system. The left, middle, and right figures show the cases where the driver looks at gaze zones (a) 1, 6, and 14; (b) 15, 2, and 7; and (c) 3, 8, and 16, respectively.
Figure 14. Incorrectly detected gaze zones with our system. The left, middle, and right figures show the cases where the driver looks at gaze zones (a) 1, 7, and 14; (b) 15, 2, and 8; and (c) 9, 3, and 16, respectively.
Figure 15. Eye gaze and head pose images selected from CAVE-DB.
Figure 16. Correctly detected gaze zones with our system on CAVE-DB. The left, middle, and right figures show the cases where the user looks at gaze zones (a) 1, 4, and 7; (b) 2, 5, and 8; and (c) 3, 6, and 9, respectively, of Figure 15.
Figure 17. Incorrectly detected gaze zones with our system on CAVE-DB. The left, middle, and right figures show the cases where the user looks at gaze zones (a) 1, 4, and 7; (b) 2, 5, and 8; and (c) 3, 6, and 9, respectively, of Figure 15.
