Sensors (Basel). 2024 Mar 10;24(6):1781. doi: 10.3390/s24061781.

Exploring Gaze Dynamics in Virtual Reality through Multiscale Entropy Analysis

Sahar Zandi et al. Sensors (Basel). 2024.

Abstract

This study employs Multiscale Entropy (MSE) to analyze 5020 binocular eye movement recordings from 407 college-aged participants, as part of the GazeBaseVR dataset, across various virtual reality (VR) tasks to understand the complexity of user interactions. By evaluating the vertical and horizontal components of eye movements across tasks such as vergence, smooth pursuit, video viewing, reading, and random saccade, collected at 250 Hz using an ET-enabled VR headset, this research provides insights into the predictability and complexity of gaze patterns. Participants were recorded up to six times over a 26-month period, offering a longitudinal perspective on eye movement behavior in VR. MSE's application in this context aims to offer a deeper understanding of user behavior in VR, highlighting potential avenues for interface optimization and user experience enhancement. The results suggest that MSE can be a valuable tool in creating more intuitive and immersive VR environments by adapting to users' gaze behaviors. This paper discusses the implications of these findings for the future of VR technology development, emphasizing the need for intuitive design and the potential for MSE to contribute to more personalized and comfortable VR experiences.
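The MSE procedure the abstract relies on (coarse-graining a time series at successive scales, then estimating sample entropy at each scale) can be sketched as follows. This is a minimal illustration in the common Costa-style formulation, not the authors' code; the function names, the parameter choices m = 2 and r = 0.2·SD, and the synthetic white-noise input standing in for a 250 Hz gaze channel are all assumptions.

```python
import numpy as np

def sample_entropy(x, m, r):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m
    templates within Chebyshev distance r (self-matches excluded)
    and A counts the same for templates of length m+1."""
    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):
            # Chebyshev distance from template i to all later templates
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += np.sum(d < r)
        return c
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, max_scale, m=2, r_frac=0.2):
    """Coarse-grain x at each scale tau (non-overlapping window means),
    then compute sample entropy of each coarse-grained series.
    The tolerance r is fixed from the SD of the original series."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    curve = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        cg = x[:n * tau].reshape(n, tau).mean(axis=1)
        curve.append(sample_entropy(cg, m, r))
    return curve

# Example: MSE curve for a white-noise surrogate of a gaze-position channel
rng = np.random.default_rng(0)
gaze = rng.standard_normal(2000)
curve = multiscale_entropy(gaze, max_scale=5)
```

For white noise the entropy curve typically falls as the scale increases, whereas signals with long-range correlations stay flat or rise; it is this scale-dependent behavior that lets MSE distinguish the complexity of gaze dynamics across the different VR tasks.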

Keywords: eye movements; human sensing; multiscale entropy; time series analysis; user experience; virtual reality.


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1
Flowchart illustrating the MSE calculation process with included equations.
Figure 2
Multiscale Entropy analysis of eye movements in the VRG Task. (a) Horizontal and (b) Vertical Gaze Position. The VRG task involves vergence eye movements, where participants adjust their gaze to focus on objects at varying depths, simulating a 3D environment in VR.
Figure 3
Multiscale Entropy analysis of eye movements in the VID Task. (a) Horizontal and (b) Vertical Gaze Position. The VID task involves participants watching videos, a common activity in VR, focusing on understanding how gaze behavior changes during passive viewing.
Figure 4
Multiscale Entropy analysis of eye movements in the PUR task. (a) Horizontal and (b) Vertical Gaze Position. In the PUR task, subjects engage in smooth pursuit movements, following moving objects with their eyes, which is essential for tracking moving stimuli in VR.
Figure 5
Multiscale Entropy analysis of eye movements in the TEX task. (a) Horizontal and (b) Vertical Gaze Position. The TEX task represents reading text in VR, a scenario that involves distinct eye movement patterns due to the linear nature of text and frequent line shifts.
Figure 6
Multiscale Entropy analysis of eye movements in the RAN task. (a) Horizontal and (b) Vertical Gaze Position. In the RAN task, participants perform random saccades, simulating unpredictable eye movements as they might occur in a dynamic and unstructured VR environment.

