The Jena Audiovisual Stimuli of Morphed Emotional Pseudospeech (JAVMEPS): A database for emotional auditory-only, visual-only, and congruent and incongruent audiovisual voice and dynamic face stimuli with varying voice intensities
- PMID: 37821750
- PMCID: PMC11289065
- DOI: 10.3758/s13428-023-02249-4
Abstract
We describe JAVMEPS, an audiovisual (AV) database of emotional voice and dynamic face stimuli, with voices varying in emotional intensity. JAVMEPS includes 2256 stimulus files comprising (A) recordings of 12 speakers, speaking four bisyllabic pseudowords with six naturalistically induced basic emotions plus neutral, in auditory-only, visual-only, and congruent AV conditions. It furthermore comprises (B) caricatures (140%), original voices (100%), and anti-caricatures (60%) for happy, fearful, angry, sad, disgusted, and surprised voices for eight speakers and two pseudowords. Crucially, JAVMEPS contains (C) precisely time-synchronized congruent and incongruent AV (and corresponding auditory-only) stimuli with two emotions (anger, surprise), (C1) with original intensity (ten speakers, four pseudowords), and (C2) with graded AV congruence (implemented via five voice morph levels, from caricatures to anti-caricatures; eight speakers, two pseudowords). We collected classification data for Stimulus Set A from 22 normal-hearing listeners and four cochlear implant (CI) users, for two pseudowords, in auditory-only, visual-only, and AV conditions. Normal-hearing individuals showed good classification performance (McorrAV = .59 to .92), with classification rates in the auditory-only condition ≥ .38 correct (surprise: .67, anger: .51). Despite compromised vocal emotion perception, CI users performed above the chance level of .14 for auditory-only stimuli, with best rates for surprise (.31) and anger (.30). We anticipate that JAVMEPS will become a useful open resource for researchers studying auditory emotion perception, especially when adaptive testing or calibration of task difficulty is desirable. With its time-synchronized congruent and incongruent stimuli, JAVMEPS can also help fill a gap in research on dynamic audiovisual integration in emotion perception via behavioral or neurophysiological recordings.
Keywords: Adaptive testing; Audiovisual integration; Cochlear implant; Emotion; Emotion induction; Stimulus database; Voice morphing.
© 2023. The Author(s).
Conflict of interest statement
The authors report no conflicts of interest, financial, or otherwise.