EXPRESS: Updating Headings in 3D Navigation

2020 
The current study investigated to what extent humans can encode spatial relations between different surfaces (i.e., floor, walls, and ceiling) of a 3D space and extend their headings on the floor to other surfaces when locomoting to walls (pitch 90°) and the ceiling (pitch 180°). In immersive virtual reality environments, participants first learned a layout of objects on the ground. They then navigated to testing planes: the south (or north) wall facing up, or the ceiling (reached via a wall) facing north (or south). Participants locomoted to the walls with pitch rotations indicated by both visual and idiothetic cues (Experiment 1) or by visual cues only (Experiment 2), and to the ceiling with visual pitch rotations only (Experiment 3). Using their memory of the objects' locations, they either reproduced the object layout on the testing plane or performed a Judgements of Relative Direction (JRD) task ("imagine standing at object A, facing B; point to C") with imagined headings of south and north on the ground. The results showed that participants who locomoted onto the wall with idiothetic cues performed better in the JRD task for the imagined heading from which their physical heading was extended (e.g., an imagined heading of north at the north wall). In addition, participants who reproduced the layout of objects on the ceiling from a perspective extended from the ground also showed a sensorimotor alignment effect predicted by an extended heading. These results indicate that humans encode spatial relations between different surfaces and extend headings three-dimensionally via pitch rotations, especially with idiothetic cues.