Analysis of Coordination Patterns between Gaze and Control in Human Spatial Search

2019 
Abstract Human spatial search combines visual search and motion control problems. Both have been investigated separately for decades; however, the coordination between visual search and motion control has not been investigated. Analyzing the coordination of sensory-motor behavior through teleoperation could improve our understanding of human search strategies as well as of autonomous search algorithms. This research proposes a novel approach to analyzing the coordination between visual attention, observed via gaze patterns, and motion control. The approach is based on estimating human operators' 3D gaze using a Gaussian mixture model (GMM), a hidden Markov model (HMM), and sparse inverse covariance estimation (SICE). Analysis of the human experimental data demonstrates that fixation is used primarily to look at the target, smooth pursuit is coupled with robot rotation and used to search for new areas to explore, and saccades are coupled with forward motion and likewise used to search for new areas to explore. These insights are used to build a functional model of human teleoperation search.
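To make the three estimation steps named in the abstract concrete, below is a minimal sketch of a GMM / HMM / SICE pipeline on synthetic gaze and control features. The feature definitions, dimensions, and parameters are illustrative assumptions, not the authors' implementation; standard libraries (scikit-learn, hmmlearn) stand in for whatever tooling the paper actually used.

```python
# Hypothetical sketch of the abstract's estimation pipeline: GMM clustering of
# gaze samples, HMM smoothing of gaze-event sequences, and SICE over joint
# gaze/control features. All data below is synthetic and illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.covariance import GraphicalLasso
from sklearn.preprocessing import StandardScaler
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Synthetic per-sample gaze features: [angular velocity (deg/s), dispersion (deg)].
gaze_features = np.vstack([
    rng.normal([2.0, 0.3], [1.0, 0.1], size=(300, 2)),     # fixation-like samples
    rng.normal([25.0, 1.5], [5.0, 0.5], size=(300, 2)),    # smooth-pursuit-like samples
    rng.normal([250.0, 6.0], [50.0, 2.0], size=(300, 2)),  # saccade-like samples
])

# 1) GMM: cluster gaze samples into three candidate event types
#    (fixation, smooth pursuit, saccade).
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
frame_labels = gmm.fit_predict(gaze_features)

# 2) HMM: model the temporal sequence of gaze events, so that transition
#    probabilities smooth the frame-by-frame classification.
hmm = GaussianHMM(n_components=3, covariance_type="full", n_iter=50, random_state=0)
hmm.fit(gaze_features)
event_sequence = hmm.predict(gaze_features)

# 3) SICE: estimate a sparse precision matrix over joint gaze/control features;
#    zero off-diagonal entries suggest conditional independence, non-zero entries
#    suggest coupling (e.g. between smooth pursuit and robot rotation).
control_features = np.column_stack([
    rng.normal(size=len(gaze_features)),  # e.g. robot forward velocity (synthetic)
    rng.normal(size=len(gaze_features)),  # e.g. robot rotation rate (synthetic)
])
joint = StandardScaler().fit_transform(
    np.column_stack([gaze_features, control_features])
)
sice = GraphicalLasso(alpha=0.05).fit(joint)
print("Estimated precision matrix:\n", np.round(sice.precision_, 3))
```

The printed precision matrix is the object SICE produces: its sparsity pattern is what an analysis like the paper's would inspect for gaze-control coupling, though the actual coupling results reported in the abstract come from the authors' experimental data, not from this synthetic example.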