Visual search and location probability learning from variable perspectives

Yuhong V. Jiang, Khena M. Swallow, Christian G. Capistrano

    Research output: Contribution to journal › Article › peer-review


    Abstract

    Do moving observers code attended locations relative to the external world or relative to themselves? To address this question, we asked participants to conduct visual search on a tabletop. The search target was more likely to occur in some locations than in others. Participants walked to a different side of the table from trial to trial, changing their perspective. The high-probability locations were stable on the tabletop but variable relative to the viewer. When participants were informed of the high-probability locations, search was faster when the target appeared in those locations, demonstrating probability cuing. However, in the absence of explicit instructions and awareness, participants failed to acquire an attentional bias toward the high-probability locations, even when the search items were displayed over an invariant natural scene. Additional experiments showed that locomotion itself did not interfere with incidental learning; rather, it was the lack of a consistent perspective that prevented participants from acquiring probability cuing incidentally. We conclude that spatial biases toward target-rich locations are supported by two mechanisms: incidental learning and goal-driven attention. Incidental learning codes attended locations in a viewer-centered reference frame and is not updated with viewer movement. Goal-driven attention can be deployed to prioritize a target-rich region defined relative to the environment.

    Original language: English (US)
    Article number: 13
    Journal: Journal of Vision
    Volume: 13
    Issue number: 6
    State: Published - 2013

    Keywords

    • Attention
    • Incidental learning
    • Spatial reference frame
    • Visual search

