Front Hum Neurosci. 2015 Mar 10;9:112. doi: 10.3389/fnhum.2015.00112. eCollection 2015.

EEVEE: the Empathy-Enhancing Virtual Evolving Environment

Philip L Jackson et al.

Abstract

Empathy is a multifaceted emotional and mental faculty that is often found to be affected in a great number of psychopathologies, such as schizophrenia, yet it remains very difficult to measure in an ecological context. The challenge stems partly from the complexity and fluidity of this social process, but also from its covert nature. One powerful tool to enhance experimental control over such dynamic social interactions has been the use of avatars in virtual reality (VR); information about an individual in such an interaction can be collected through the analysis of his or her neurophysiological and behavioral responses. We have developed a unique platform, the Empathy-Enhancing Virtual Evolving Environment (EEVEE), which is built around three main components: (1) different avatars capable of expressing feelings and emotions at various levels based on the Facial Action Coding System (FACS); (2) systems for measuring the physiological responses of the observer (heart and respiration rate, skin conductance, gaze and eye movements, facial expression); and (3) a multimodal interface linking the avatar's behavior to the observer's neurophysiological response. In this article, we provide a detailed description of the components of this innovative platform and validation data from the first phases of development. Our data show that healthy adults can discriminate different negative emotions, including pain, expressed by avatars at varying intensities. We also provide evidence that masking part of an avatar's face (top or bottom half) does not prevent the detection of different levels of pain. This innovative and flexible platform provides a unique tool to study and even modulate empathy in a comprehensive and ecological manner in various populations, notably individuals suffering from neurological or psychiatric disorders.

Keywords: FACS; affective computing; avatar; emotions; empathy; pain; virtual reality.

Figures

Figure 1
This figure shows that the EEVEE platform works on the basis of an iterative loop between the observer and the avatar. The two protagonists, real and virtual, interact in this paradigm through expressions of their emotions. EEVEE allows new physiological measures to be added depending on their relevance and availability. Currently, EEVEE focuses on Central Nervous System (CNS) and Autonomic Nervous System (ANS) responses via heart rate in beats per minute (BPM), electrocardiography (ECG), respiration (RESP), electrodermal activity (EDA), and automatic facial expression recognition (FACS). The sampling frequency depends on the physiological signal being processed and varies from 10 Hz (BPM) to 2000 Hz (EEG). The main markers extracted include heart beat acceleration and deceleration (BPM); RR intervals and their standard deviation (ECG); respiration acceleration, deceleration, and apnea (RESP); skin conductance level and skin conductance response amplitude and area under the curve (EDA); and facial action unit intensities (FACS). All of these data are gathered by a system for physiological measurement (MP150, Biopac Systems Inc.), and the FaceReader™ software encodes FACS information.
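As a concrete illustration of the ECG markers described in the caption, the sketch below computes RR intervals and their standard deviation (SDNN, a standard heart-rate-variability measure) from a series of R-peak timestamps. This is a minimal, hypothetical example, not EEVEE's actual signal-processing code; the function names and the use of pre-detected R-peak times are assumptions for illustration.

```python
# Hypothetical sketch (not EEVEE's actual code): computing two of the ECG
# markers named in the Figure 1 caption -- RR intervals and their standard
# deviation (SDNN) -- from a list of detected R-peak timestamps in seconds.
import statistics

def rr_intervals(r_peak_times):
    """Inter-beat (RR) intervals in milliseconds from R-peak times in seconds."""
    return [(b - a) * 1000.0 for a, b in zip(r_peak_times, r_peak_times[1:])]

def sdnn(intervals_ms):
    """Sample standard deviation of RR intervals (SDNN), in milliseconds."""
    return statistics.stdev(intervals_ms)

# Example: a heart beating near 60 BPM with slight beat-to-beat variability.
peaks = [0.00, 1.00, 1.98, 3.01, 4.00]
rr = rr_intervals(peaks)            # approximately [1000, 980, 1030, 990] ms
print([round(x) for x in rr], round(sdnn(rr), 1))
```

In practice the R-peak detection step itself would run on the raw ECG channel sampled by the acquisition hardware; only the downstream marker computation is sketched here.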
Figure 2
Examples of avatars' emotional expressions generated using the Facial Action Coding System principles. Top row (left to right): Fear, Sadness, Pain, Anger, Joy, Disgust. Middle row: Neutral. Bottom row (left to right): Joy, Disgust, Fear, Anger, Sadness, Pain.
Figure 3
Dynamic normal maps generated for animating avatars using the Facial Action Coding System principles.
Figure 4
EEVEE mirroring mode. (A) Raw live video feed, (B) mesh analysis and expression intensity levels from the FaceReader™ software, and (C) avatar mirroring of the facial expression generated by EEVEE.
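The mapping step of this mirroring loop, from estimated action-unit (AU) intensities to avatar expression parameters, can be sketched as follows. This is an assumed illustration, not the platform's implementation: the AU-to-blendshape table, function names, and the 0-1 intensity convention are all hypothetical choices for the example.

```python
# Hypothetical sketch of the mirroring step (assumed, not EEVEE's code):
# FACS action-unit intensities estimated from the observer's face (e.g., by
# FaceReader) are mapped to avatar blendshape weights so the avatar can
# reproduce the observed expression.

# Assumed one-to-one mapping from action units to avatar blendshapes.
AU_TO_BLENDSHAPE = {
    "AU4": "brow_lowerer",     # brow lowering (anger, pain)
    "AU6": "cheek_raiser",     # orbital tightening (joy, pain)
    "AU12": "lip_corner_pull", # smiling
}

def mirror_expression(au_intensities, gain=1.0):
    """Map AU intensities (0-1) to blendshape weights, clamped to [0, 1]."""
    weights = {}
    for au, intensity in au_intensities.items():
        shape = AU_TO_BLENDSHAPE.get(au)
        if shape is not None:  # AUs with no avatar counterpart are ignored
            weights[shape] = max(0.0, min(1.0, intensity * gain))
    return weights

print(mirror_expression({"AU6": 0.5, "AU12": 0.9, "AU99": 0.3}))
```

A gain parameter of this kind is one simple way an avatar could amplify or attenuate the mirrored expression, which is in the spirit of the platform's goal of modulating, not just reproducing, emotional displays.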
Figure 5
Facial expressions of emotions displayed by one of the male avatars used in Experiment 1a. Top row (left to right): Neutral, Joy, Pain, Sadness. Bottom row (left to right): Neutral, Fear, Disgust, Anger.
Figure 6
This figure shows the significant interaction between the within-subjects factors Pain Level (5 levels: A, B, C, D, E) and Stimulus Gender (2 levels: female, male) [F(4, 17) = 5.1, p = 0.001]. Asterisks mark the pain levels for which the male avatar stimuli received significantly higher pain evaluations than the female avatar stimuli (5 post-hoc unilateral Bonferroni-corrected t-tests, α = 0.01).
Figure 7
Stimuli used in Experiment 2 showing facial expressions of pain from one of the male avatars. All stimuli were presented in three conditions: No Mask (top row), Eyes Mask (middle row), and Mouth Mask (bottom row).
Figure 8
This figure shows the significant interaction between the within-subjects factors Model Type (2 levels: avatars, humans), Pain Level (3 levels: low pain, medium pain, high pain), and Mask (3 levels: no mask, mouth mask, eyes mask) [F(4, 140) = 6.41, p = 0.001]. Asterisks mark the mask levels for which evaluations at a given pain level differ significantly from pain evaluations at the other mask levels (18 post-hoc unilateral Bonferroni-corrected t-tests, α = 0.0028).
