A multimodal dataset for authoring and editing multimedia content: The MAMEM project
- PMID: 29204464
- PMCID: PMC5709300
- DOI: 10.1016/j.dib.2017.10.072
Abstract
We present a dataset that combines multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the scope of the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, as well as during imaginary movement tasks. The presented dataset will contribute to the development and evaluation of modern human-computer interaction systems that foster the reintegration of people with severe motor impairments into society.
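The sketch below illustrates one plausible way to organize the per-participant multimodal recordings described above (EEG, eye tracking, GSR, heart rate) in code. It is not the dataset's actual file format or API: the field names, channel counts, sampling rates, and group labels are assumptions for illustration only, and the released data files should be consulted for the real layout.

```python
# Minimal sketch of a per-subject container for the multimodal streams
# described in the abstract. All shapes and sampling rates are assumed,
# not taken from the actual MAMEM data files.
from dataclasses import dataclass, field
from typing import Dict
import numpy as np


@dataclass
class SubjectRecording:
    """One participant's session: EEG, gaze, GSR, and heart-rate streams."""
    subject_id: str
    group: str  # e.g. "able-bodied" or "motor-impaired" (assumed labels)
    eeg: np.ndarray = field(default_factory=lambda: np.empty((0, 0)))        # channels x samples
    eye_tracking: np.ndarray = field(default_factory=lambda: np.empty((2, 0)))  # (x, y) gaze x samples
    gsr: np.ndarray = field(default_factory=lambda: np.empty(0))             # galvanic skin response
    heart_rate: np.ndarray = field(default_factory=lambda: np.empty(0))      # beats per minute
    sampling_rates: Dict[str, float] = field(default_factory=dict)           # Hz per modality


def summarize(rec: SubjectRecording) -> str:
    """Report the duration covered by each modality, tolerating missing streams."""
    parts = []
    for name in ("eeg", "eye_tracking", "gsr", "heart_rate"):
        signal = getattr(rec, name)
        rate = rec.sampling_rates.get(name)
        n_samples = signal.shape[-1]  # samples are the last axis by convention here
        duration = f"{n_samples / rate:.1f} s" if rate and n_samples else "n/a"
        parts.append(f"{name}: {duration}")
    return f"{rec.subject_id} ({rec.group}) -> " + ", ".join(parts)


if __name__ == "__main__":
    # Synthetic 60-second example standing in for one of the 34 participants.
    demo = SubjectRecording(
        subject_id="S01",
        group="able-bodied",
        eeg=np.random.randn(14, 128 * 60),        # 14 channels at an assumed 128 Hz
        eye_tracking=np.random.rand(2, 60 * 60),  # gaze (x, y) at an assumed 60 Hz
        gsr=np.random.rand(4 * 60),               # assumed 4 Hz
        heart_rate=60 + 5 * np.random.randn(60),  # assumed 1 Hz
        sampling_rates={"eeg": 128.0, "eye_tracking": 60.0, "gsr": 4.0, "heart_rate": 1.0},
    )
    print(summarize(demo))
```

A per-subject container like this makes it straightforward to iterate over the 18 able-bodied and 16 motor-impaired participants and align the biosignal and gaze streams on a common timeline, whatever concrete storage format the released dataset uses.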