Effects of different non-driving-related-task display modes on drivers’ eye-movement patterns during take-over in an automated vehicle

2020 
Abstract Highly automated vehicles relieve drivers of the driving task, allowing them to engage in non-driving-related tasks (NDRTs). However, drivers are still required to take over control in certain circumstances because of the limitations of highly automated vehicles. This study examined drivers' eye-movement patterns during take-overs when an NDRT (watching videos) was presented via a head-up display (HUD) or a mobile device display (MDD), compared with a no-NDRT baseline. The experiment was conducted in a high-fidelity driving simulator using real-world driving video scenarios. Forty-six participants completed three drives in three counterbalanced conditions (HUD, MDD and baseline). A take-over request was issued towards the end of automated driving, requiring drivers to stop the NDRT and take over control. Eye-movement data including pupil diameter, blinks, glance duration and number of AOIs (Areas of Interest) were collected and analysed. The results show that during automated driving, drivers were more engaged in the MDD NDRT, with smaller pupil diameter and shorter glance duration on the forward scene compared with the HUD and baseline modes. The number of AOIs was reduced during automated driving in both the MDD and HUD modes. The take-over request redirected drivers' visual attention from the NDRT back to the driving task, reflected in increased pupil diameter, glance duration and number of AOIs. However, the effect of the MDD NDRT on pupil diameter and glance duration persisted after the take-over request, even though the NDRT had been terminated. The study demonstrates that the HUD is the better display mode for helping drivers maintain their attention on the road.
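As an illustrative aside (not the authors' analysis code), the glance-based measures named in the abstract can be derived from a raw gaze log in a few lines. The sketch below assumes a hypothetical pandas DataFrame with columns timestamp (seconds), aoi (an Area-of-Interest label such as "forward scene", "HUD" or "MDD") and pupil_mm; the column names and the simple change-of-AOI glance definition are assumptions for illustration only.

import pandas as pd

# Illustrative sketch: per-AOI glance metrics from a hypothetical gaze log
# with columns: timestamp (s), aoi (label), pupil_mm.
def glance_metrics(gaze: pd.DataFrame) -> pd.DataFrame:
    gaze = gaze.sort_values("timestamp").reset_index(drop=True)
    # A new glance starts whenever the fixated AOI changes between samples.
    glance_id = (gaze["aoi"] != gaze["aoi"].shift()).cumsum()
    glances = gaze.groupby(glance_id).agg(
        aoi=("aoi", "first"),
        start=("timestamp", "first"),
        end=("timestamp", "last"),
        mean_pupil_mm=("pupil_mm", "mean"),
    )
    glances["duration_s"] = glances["end"] - glances["start"]
    # Summarise per AOI: mean glance duration, glance count and mean pupil size.
    return glances.groupby("aoi").agg(
        mean_glance_duration_s=("duration_s", "mean"),
        n_glances=("duration_s", "size"),
        mean_pupil_mm=("mean_pupil_mm", "mean"),
    )

Comparing such per-AOI summaries across the HUD, MDD and baseline drives, before and after the take-over request, is one plausible way the reported differences in glance duration and AOI counts could be quantified; the paper's actual analysis pipeline may differ.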