2019 Oct 23;6:98.
doi: 10.3389/frobt.2019.00098. eCollection 2019.

AR DriveSim: An Immersive Driving Simulator for Augmented Reality Head-Up Display Research


Joseph L Gabbard et al. Front Robot AI. 2019.

Abstract

Optical see-through automotive head-up displays (HUDs) are a form of augmented reality (AR) that is quickly gaining penetration into the consumer market. Despite increasing adoption, demand, and competition among manufacturers to deliver higher-quality HUDs with larger fields of view, little work has been done to understand how best to design and assess AR HUD user interfaces, and how to quantify their effects on driver behavior, performance, and ultimately safety. This paper reports on a novel, low-cost, immersive driving simulator built from custom hardware and software technologies specifically to examine basic and applied research questions related to AR HUD usage while driving. We describe our experiences developing the simulator hardware and software and detail a user study that examined driver performance, visual attention, and preferences using two AR navigation interfaces. Results suggest that conformal AR graphics may not be inherently better than other HUD interfaces. We include lessons learned from our simulator development, present results of the user study, and conclude with limitations and future work.

Keywords: augmented reality; conformal graphics; driving simulator; head-up display; human machine interface.


Figures

Figure 1
A bird's eye view of the Mini Cooper half cab with participant and experimenter. While the work presented herein focuses on CG-based driving capabilities, the testbed also supports alternative forms of driving studies (e.g., video-based).
Figure 2
The logical arrangement of the seven AR DriveSim screens, with resolutions and physical connections noted (A); the experimenter's control station (B); and an annotated view from inside the Mini Cooper cab (C). Note that the AR HUD and Center Stack displays are connected to a separate computer dedicated to UI presentation.
Figure 3
The major computing components of our AR DriveSim communicate via UDP (yellow). A set of off-the-shelf and custom microcontrollers (brown) pass driving control inputs read from CAN bus (green) and other sensors to the AR DriveSim computer (orange). A control board (brown) further manages a DC motor to provide force feedback to the steering wheel. A separate computer (blue) renders 3D graphics on an AR HUD by synchronizing its virtual camera position in real-time with the AR DriveSim computer. A set of experimenter controls (black) assist in coordinating experiments.
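As a rough illustration of the UDP-based camera synchronization the caption describes, the sketch below shows how a simulator process might broadcast its virtual camera pose each frame to the separate HUD-rendering computer. The message format, host, and port are hypothetical; the paper does not specify the actual AR DriveSim protocol.

```python
import json
import socket

# Hypothetical endpoint for the HUD-rendering computer; the real
# AR DriveSim wire format is not specified in the paper.
RENDER_HOST, RENDER_PORT = "127.0.0.1", 9099

def pack_pose(position, rotation):
    """Serialize a camera pose (xyz position, xyz Euler rotation) as JSON."""
    return json.dumps({"pos": position, "rot": rotation}).encode("utf-8")

def unpack_pose(datagram):
    """Recover the pose on the receiving (HUD-rendering) side."""
    msg = json.loads(datagram.decode("utf-8"))
    return msg["pos"], msg["rot"]

def send_pose(sock, position, rotation):
    """Broadcast the current virtual camera pose over UDP each frame."""
    sock.sendto(pack_pose(position, rotation), (RENDER_HOST, RENDER_PORT))
```

UDP suits this kind of per-frame pose streaming because a late pose update is useless anyway; the renderer simply uses the most recent datagram received.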
Figure 4
A participant's view as they calibrate the AR HUD 3D graphics by viewing carefully placed 3D shapes and perspective lines overlaid onto a static road scene with known geometry.
Figure 5
The user study examined two HUD display conditions: a conformal arrow integrated into the road scene (top) and a screen-fixed arrow that filled as drivers approached turns (bottom). For each, the graphic first appeared 392 feet from the intersection (left panels) and disappeared after participants traversed the intersection (right panels).
Figure 6
Participants rated the NASA-TLX sub-scores on a scale of 0 (low demand) to 100 (high demand). The average of the sub-scores comprised the Raw TLX score.
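The Raw TLX score described in the caption is the unweighted mean of the six standard NASA-TLX sub-scale ratings. A minimal sketch (the sample ratings below are made up for illustration):

```python
# The six standard NASA-TLX sub-scales, each rated 0 (low demand)
# to 100 (high demand).
SUBSCALES = ("Mental", "Physical", "Temporal",
             "Performance", "Effort", "Frustration")

def raw_tlx(ratings):
    """Raw TLX: the unweighted average of the six sub-scale ratings."""
    missing = [s for s in SUBSCALES if s not in ratings]
    if missing:
        raise ValueError(f"missing sub-scales: {missing}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)
```

Unlike the original weighted TLX, the raw variant skips the pairwise-comparison weighting step, which is why it is simply an average.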
Figure 7
Participants rated the following statements on a scale of 0 (strongly agree) to 100 (strongly disagree): (1) “I did not find this interface distracting”; (2) “Using this interface had a positive impact on my driving”; (3) “It was easy to navigate while using this interface”; (4) “I trusted the information on this interface”; and (5) “The interface was easy to view.”
Figure 8
The percentage of fixations allocated to each AOI differed between Conformal and Screen-fixed displays. While there is no ideal allocation across AOIs, it is interesting to note that the percentages vary, particularly between percentage of time spent looking at and around the HUD Graphic.
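The per-AOI percentages plotted in the figure amount to tallying fixations by area of interest and normalizing by the total. A small illustrative sketch, with made-up fixation labels (the paper's actual AOI names and counts are not reproduced here):

```python
from collections import Counter

def fixation_percentages(fixation_aois):
    """Percentage of fixations falling in each area of interest (AOI).

    `fixation_aois` is a sequence of AOI labels, one per fixation.
    """
    counts = Counter(fixation_aois)
    total = sum(counts.values())
    return {aoi: 100.0 * n / total for aoi, n in counts.items()}
```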
