Front Bioeng Biotechnol. 2015 May 26;3:64. doi: 10.3389/fbioe.2015.00064. eCollection 2015.

Can a Humanoid Face be Expressive? A Psychophysiological Investigation

Nicole Lazzeri et al. Front Bioeng Biotechnol. 2015.

Abstract

Non-verbal signals expressed through body language play a crucial role in multi-modal human communication during social relations. Indeed, across cultures, facial expressions are the most universal and direct signs of innate emotional cues. A human face conveys important information in social interactions, helping us to better understand our social partners and establish empathic links. Recent research shows that humanoid and social robots are becoming increasingly similar to humans, both esthetically and expressively. However, their visual expressiveness remains a crucial issue that must be improved for these robots to be perceived by humans, realistically and intuitively, as no different from themselves. This study concerns the capability of a humanoid robot to exhibit emotions through facial expressions. More specifically, emotional signs performed by a humanoid robot were compared with the corresponding human facial expressions in terms of recognition rate and response time. The set of stimuli included standardized human expressions taken from an Ekman-based database and the same facial expressions performed by the robot. Furthermore, participants' psychophysiological responses were recorded to investigate whether interpreting robot rather than human emotional stimuli induces different reactions. Preliminary results show a trend toward better recognition of expressions performed by the robot than of 2D photos or 3D models. Moreover, no significant differences in the subjects' psychophysiological state were found during the discrimination of facial expressions performed by the robot compared with the same task performed with 2D photos and 3D models.

Keywords: affective computing; emotion perception; expression recognition; facial expressions; humanoid robot; psychophysiological signals; social robots.

Figures

Figure 1. (A) AU positions mapped on the robot; (B) major facial muscles involved in the facial expressions; and (C) servo motor positions corresponding to the AUs.
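Figure 1 summarizes the core actuation idea: FACS Action Units (AUs) are mapped onto the robot's servo motor positions. The following is a minimal, purely illustrative sketch of such a mapping; the AU codes are standard FACS, but the servo channels, pulse values, and the au_to_position helper are hypothetical and are not taken from the paper.

# Illustrative sketch of a FACS Action Unit -> servo mapping for an
# expressive robot face. Servo channel numbers and pulse values are
# hypothetical; the paper does not specify the robot's calibration.

# Each AU maps to the servo channel driving it, plus a neutral and a
# fully activated position (e.g., in servo pulse units).
AU_TO_SERVO = {
    "AU1":  {"channel": 0, "neutral": 1500, "max": 1800},  # inner brow raiser
    "AU4":  {"channel": 1, "neutral": 1500, "max": 1200},  # brow lowerer
    "AU12": {"channel": 2, "neutral": 1500, "max": 1900},  # lip corner puller (smile)
    "AU15": {"channel": 3, "neutral": 1500, "max": 1150},  # lip corner depressor
}

def au_to_position(au: str, intensity: float) -> tuple[int, int]:
    """Linearly interpolate a servo position for an AU activated at
    intensity in [0, 1]; returns (channel, position)."""
    servo = AU_TO_SERVO[au]
    pos = servo["neutral"] + intensity * (servo["max"] - servo["neutral"])
    return servo["channel"], round(pos)

# Example: a happiness-like expression driven mainly by AU12.
print(au_to_position("AU12", 0.8))  # -> (2, 1820)

A per-AU linear interpolation of this kind is one simple way to compose graded expressions from independent action units; the actual control scheme used for the robot may differ.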
Figure 2. 2D photos and 3D models used in the experiment: (A) FACE expressions and (B) human expressions.

Figure 3. (A) Recognition rates (in percentage) and (B) response times (in seconds) of human 2D photos, human 3D models, and robot FACE expressions.

Figure 4. (A) Recognition rates (in percentage) and (B) response times (in seconds) of robot 2D photos, robot 3D models, and robot FACE expressions.

Figure 5. (A) Recognition rates (in percentage) and (B) response times (in seconds) of positive/negative human expressions.

Figure 6. (A) Recognition rates (in percentage) and (B) response times (in seconds) of positive/negative robot expressions.

Figure 7. Recognition rates (in percentage) of the FACE expressions.

Figure 8. Statistical analysis of two features extracted from HRV and SCR during the interpretation task based on human 2D photos, human 3D models, and the physical robot. (A) HRV results: example of an intra-subject (subject 1) and an inter-subject statistical analysis result; the mean RR feature represents the mean value of the RR distance (ms). (B) SCR results: example of an intra-subject (subject 1) and an inter-subject statistical analysis result; the mean-phasic feature represents the mean value of the SCR signal (µS).
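The two features in Figure 8 are simple first-order statistics of the physiological signals. The following minimal sketch shows how they could be computed, assuming R-peak timestamps in milliseconds and a phasic SCR trace in microsiemens as inputs; the function names and the synthetic example data are ours, not the study's.

import numpy as np

# Minimal sketch of the two features reported in Figure 8; inputs and
# names are illustrative, not taken from the study.

def mean_rr(r_peak_times_ms: np.ndarray) -> float:
    """Mean RR feature: average distance (ms) between successive R-peaks."""
    return float(np.mean(np.diff(r_peak_times_ms)))

def mean_phasic(scr_phasic_us: np.ndarray) -> float:
    """Mean-phasic feature: average of the phasic SCR signal (µS)."""
    return float(np.mean(scr_phasic_us))

# Example with synthetic data: R-peaks ~810 ms apart, low phasic activity.
r_peaks = np.cumsum(np.full(60, 810.0))
scr = np.abs(np.random.default_rng(0).normal(0.05, 0.02, 1000))
print(mean_rr(r_peaks), mean_phasic(scr))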

