The overall aim of this research is to reduce risky driving behaviour and driver distraction using innovative 3D Head Up Display (HUD) applications.
More specifically, this research will:
a) investigate how a new generation of 3D HUDs could be used to maintain a driver's cognitive engagement with the primary driving task,
b) design and develop novel 3D HUD applications that improve driver behaviour in real time, and
c) evaluate, in a naturalistic setting, the benefits of the new applications, considering their visual implications, human factors and other safety side effects.
The investigation takes place in the context of SAE Level 3 automated driving. We are currently developing HUD applications that aim to engage drivers and optimise their scanning behaviour, situational awareness and hazard perception in SAE Level 3 automated driving scenarios. We will develop desirable or entertaining content (games, watching videos, etc.) that is:
a) displayed in such a way that the driver/operator can still observe traffic in their peripheral vision, and
b) embeds contextual information into the content delivery in such a way that it further contributes to heightened situation and mode awareness.
Using Seeing Machines' Driver Monitoring System (DMS), we are particularly interested in how to measure the level of engagement in the primary driving task and cognitive distraction from the driving task, as well as the overall fallback readiness of the driver.
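One common gaze-based measure in this space is Percent Road Centre (PRC): the fraction of gaze samples falling within a small angular region around the road centre, where low values tend to indicate visual distraction and unusually high values can accompany cognitive load. The sketch below illustrates the idea in Python; the gaze-sample format and function name are hypothetical assumptions for illustration, not the actual Seeing Machines DMS API.

```python
import math

def percent_road_center(gaze_samples, radius_deg=8.0):
    """Fraction of (yaw, pitch) gaze samples, in degrees, that fall
    within a circular 'road centre' region of the given angular radius,
    where (0, 0) is straight ahead. Hypothetical sketch, not a DMS API."""
    if not gaze_samples:
        return 0.0
    inside = sum(
        1 for yaw, pitch in gaze_samples
        if math.hypot(yaw, pitch) <= radius_deg
    )
    return inside / len(gaze_samples)

# Example: mostly forward gaze with one off-road glance.
samples = [(0.5, -1.0), (2.0, 1.5), (25.0, -10.0), (1.0, 0.0)]
print(percent_road_center(samples))  # 0.75
```

In practice such a metric would be computed over a sliding time window and combined with other DMS signals (eyelid closure, head pose) before inferring engagement or fallback readiness.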
Dr Ronald Schroeter and Michael Gerber won “Best Demonstration Award” at the 2018 Automotive User Interfaces (AutoUI) Conference in Toronto in September 2018.
Funding / Grants
- ARC Linkage (2015 - 2018)
Other Team Members
This project includes researchers from the QUT School of Optometry and Vision Science and the QUT School of Electrical Engineering and Computer Science.