Object representations in visual cortex are scaled to account for viewing distance during visual search

2021 
Humans are remarkably proficient at finding objects within a complex visual world. It has been proposed that observers strategically increase their visual system's responsivity to any object of interest by pre-activating a visual representation of the target object during search preparation. Despite being widely accepted, this mechanism fails to account for an inherent property of real-world vision: the image that any given object will project onto the retinae is unknown, as it depends on the object's eventual location. For instance, the color and shape of the retinal image are determined by the illumination and viewpoint on the object, and, most dramatically, its size can vary by orders of magnitude depending on the distance to the object. How can preparatory activity in visual cortex benefit search in the real world, where the retinal image of an object is context-dependent? We addressed this question by testing whether human observers generate visual object representations during search preparation and scale those representations to account for viewing distance. In two fMRI experiments (N = 58), participants were cued to search for real-world objects at different distances within naturalistic scenes. We measured BOLD responses following the onset of the scene, from which the viewing distance could be inferred, and analyzed a subset of trials in which, unexpectedly, no array of objects appeared. This allowed us to isolate brain activity related to search preparation alone. Using multivariate pattern analysis, we related the patterns of brain activity evoked during search preparation to those evoked by viewing isolated objects of different sizes. The data show that (1) observers generate visual representations of their target object during search preparation in object-selective regions, and (2) scale these representations to flexibly account for search distance. These findings reconcile current theories of visual selection with the functional demands of real-world vision.
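
The abstract describes a pattern-similarity form of multivariate pattern analysis: preparation-period voxel patterns are related to patterns evoked by viewing isolated objects at different retinal sizes. The sketch below illustrates the general logic of such an analysis in numpy; the data shapes, variable names, simulated inputs, and the Pearson-correlation similarity measure are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def pattern_similarity(prep_patterns, template_patterns):
    """Correlate each preparation-trial voxel pattern with each
    object-viewing template pattern (Pearson r across voxels).

    prep_patterns:     (n_trials, n_voxels) BOLD patterns from
                       target-absent search-preparation trials.
    template_patterns: (n_sizes, n_voxels) patterns evoked by viewing
                       isolated objects at different retinal sizes.
    Returns an (n_trials, n_sizes) similarity matrix.
    """
    # z-score across voxels so the scaled dot product equals Pearson r
    zp = (prep_patterns - prep_patterns.mean(1, keepdims=True)) / prep_patterns.std(1, keepdims=True)
    zt = (template_patterns - template_patterns.mean(1, keepdims=True)) / template_patterns.std(1, keepdims=True)
    return zp @ zt.T / prep_patterns.shape[1]

# Hypothetical usage: do far-distance preparation trials resemble the
# small-retinal-size object template more than the large-size template?
rng = np.random.default_rng(0)
prep_far = rng.standard_normal((40, 500))    # 40 trials x 500 voxels (simulated)
templates = rng.standard_normal((2, 500))    # row 0: small object, row 1: large object
sim = pattern_similarity(prep_far, templates)
print("mean r (small, large):", sim.mean(axis=0))
```

Under this logic, distance-dependent scaling would show up as higher similarity between preparation patterns and the object template whose retinal size matches the cued viewing distance.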