Evaluation of Fear Using Nonintrusive Measurement of Multimodal Sensors

Jong-Suk Choi et al. Sensors (Basel). 2015 Jul 20;15(7):17507-33. doi: 10.3390/s150717507.

Abstract

Most previous research on emotion recognition has used either a single modality or multiple modalities of physiological signals. However, the former allows only limited improvement in accuracy, while the latter has the disadvantage that its performance can be degraded by head or body movements; in addition, the sensors attached to the body are inconvenient for the user. Among the various emotions, the accurate evaluation of fear is crucial in many applications, such as criminal psychology, intelligent surveillance systems, and the objective evaluation of horror movies. Therefore, we propose a new method for evaluating fear based on nonintrusive measurements obtained using multiple sensors. Experimental results based on the t-test, the effect size, and the sum of each modality's correlations with the other modalities showed that facial temperature and subjective evaluation are more reliable than the electroencephalogram (EEG) and the eye blinking rate for evaluating fear.
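The three reliability criteria named above (paired t-test, effect size, and the sum of correlations with the other modalities) can be sketched roughly as follows. This is an illustrative outline only: the function names are ours, the arrays are placeholders rather than the paper's measurements, and the paper's exact statistical procedure may differ in detail.

```python
import numpy as np

def paired_t_and_effect_size(before, after):
    """Paired t statistic and Cohen's d for before/after measurements
    on the same subjects (repeated-measures design)."""
    d = np.asarray(after, float) - np.asarray(before, float)
    n = d.size
    sd = d.std(ddof=1)                    # sample std of the differences
    t = d.mean() / (sd / np.sqrt(n))      # paired t statistic
    cohens_d = d.mean() / sd              # effect size for paired data
    return t, cohens_d

def correlation_sum(signals):
    """Sum of each modality's correlations with every other modality.
    signals: array of shape (n_modalities, n_samples)."""
    r = np.corrcoef(signals)
    return r.sum(axis=1) - 1.0            # drop the self-correlation (r_ii = 1)
```

A modality whose before/after difference has a large |t| and effect size, and whose correlation sum agrees with the other modalities, would be judged more reliable under these criteria.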

Keywords: facial temperature; fear; nonintrusive multimodal measurement; subjective evaluation.


Figures

Figure 1
Flowchart of the experimental procedure of our research (BR is blinking rate and FT is facial temperature).
Figure 2
Proposed system for evaluating fear.
Figure 3
Dual (visible-light and thermal) cameras used in our method and their images.
Figure 4
Commercial EEG device and locations of 16 electrodes based on the international 10–20 system. (a) Emotiv EPOC headset; (b) positions of 16 electrodes.
Figure 5
Four corresponding (calibration) pairs of points produced by four NIR illuminators to obtain the geometric transform matrix and an example for measuring calibration accuracy. (a) Four pairs of corresponding (calibration) points; (b) pair of points for measuring calibration accuracy.
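Four corresponding point pairs, as in Figure 5, are exactly enough to determine a 3×3 projective transform between the visible-light and thermal image planes. A minimal sketch of that estimation (plain DLT with h33 fixed to 1) is shown below; the coordinates are made-up examples, and the paper's actual transform model and calibration procedure may differ.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 projective transform mapping src -> dst from
    exactly four corresponding (x, y) point pairs, with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Map one (x, y) point through the homography (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

With the transform fitted once from the calibration points, every face region detected in the visible-light image can be mapped into thermal-image coordinates via warp_point.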
Figure 6
(a) Detected face and facial feature regions in the visible-light image; (b) mapped face and facial feature regions in the thermal image after geometric transformation.
Figure 7
Example of defined ROIs used to measure the change of facial temperature.
Figure 8
Comparison of delta and beta waves before and after watching a horror movie. (a) Change in delta waves; (b) change in beta waves; (c) change in the ratio of delta to beta waves.
Figure 9
Example of detecting pupil regions using sub-block-based template matching.
Figure 10
Example of determining whether eyes are open and closed. (a) Open eyes; (b) closed eyes.
Figure 11
Experiment for measuring the accuracy of the geometric transform. The top and bottom figures of (a–c) are images from the visible-light and thermal cameras, respectively. The NIR illuminator is placed at example positions: (a) Position 1; (b) Position 5; (c) Position 9.
Figure 12
Experimental procedure for measuring fear (BR is blinking rate, FT is facial temperature and SE means subjective evaluation).
Figure 13
Comparison of subjective evaluation scores before and after watching the horror movie.
Figure 14
Comparisons of FTs of facial feature regions before and after watching the horror movie (FT is facial temperature).
Figure 15
Comparisons of eye blinking rate before watching the horror movie and in the last 1 min of watching the movie (BR is blinking rate).
Figure 16
Ratios of delta band to beta band of EEG data before and after watching the horror movie.
Figure 17
Comparison of subjective evaluation scores before and after the subjects watched the emotionally-neutral video clip.
Figure 18
Comparisons of the facial temperature of facial feature regions before and after the subjects watched the emotionally-neutral video clip (FT is facial temperature).
Figure 19
Comparisons of eye blinking rate before the subjects watched the emotionally-neutral video clip and during its last 1 min (BR is blinking rate).
Figure 20
Ratios of the delta band to the beta band of the EEG data before and after the subjects watched the emotionally-neutral video clip.
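Figures 8, 16 and 20 all report the delta-to-beta band power ratio of the EEG. A rough FFT-based version of that quantity can be sketched as follows; the band edges (0.5–4 Hz for delta, 13–30 Hz for beta) are conventional choices, and the 128 Hz sampling rate matches the Emotiv EPOC headset but is an assumption here rather than a detail taken from the paper.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of the FFT bins falling within [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

def delta_beta_ratio(eeg, fs=128):
    """Ratio of delta (0.5-4 Hz) to beta (13-30 Hz) band power for
    one EEG channel; a higher ratio means relatively more slow-wave
    activity."""
    return band_power(eeg, fs, 0.5, 4.0) / band_power(eeg, fs, 13.0, 30.0)
```

In practice the ratio would be computed per channel before and after the stimulus and the two distributions compared, which is what the bar charts in these figures visualize.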
