Mental scanning of sonifications reveals flexible encoding of nonspeech sounds and a universal per-item scanning cost
2011
Abstract
A mental scanning paradigm was used to examine the representation of nonspeech sounds in working memory. Participants encoded sonifications – nonspeech auditory representations of quantitative data – as verbal lists, visuospatial images, or auditory images. The number of tones and the overall frequency change in the sonifications were also manipulated to test for different hypothesized patterns of reaction times across encoding strategies. Mental scanning times revealed distinct patterns of reaction times across encoding strategies, even though all internal representations were constructed from the same nonspeech sound stimuli. Scanning times for the verbal encoding strategy increased linearly with the number of items in the verbal representation. Scanning times for the visuospatial encoding strategy were generally slower and increased with the metric distance (derived metaphorically from frequency change) in the mental image. Scanning times for the auditory imagery strategy were faster and closest to the veridical durations of the original stimuli. Interestingly, the number of items traversed in scanning a representation significantly affected scanning times across all encoding strategies. The results suggest that nonspeech sounds can be flexibly represented, and that a universal per-item scanning cost persists across encoding strategies. Implications for cognitive theory, the mental scanning paradigm, and practical applications are discussed.
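To make the stimulus concept concrete, the sketch below shows one way a sonification might render a quantitative data series as a sequence of tones, with each value mapped to a pitch. The frequency range, tone duration, linear mapping, and file name are illustrative assumptions only; they are not the stimuli or parameters used in the study.

```python
import math
import struct
import wave

# Minimal sonification sketch: map data values to sine-tone frequencies.
# All parameters below are assumptions for illustration, not the study's.
SAMPLE_RATE = 44100                  # samples per second
TONE_DURATION = 0.5                  # seconds per data point (assumed)
FREQ_LOW, FREQ_HIGH = 220.0, 880.0   # assumed pitch range (A3 to A5)


def data_to_frequencies(values):
    """Linearly map each data value into the assumed frequency range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [FREQ_LOW + (v - lo) / span * (FREQ_HIGH - FREQ_LOW) for v in values]


def write_sonification(values, path="sonification.wav"):
    """Render one sine tone per data point and save the sequence as a WAV file."""
    frames = bytearray()
    for freq in data_to_frequencies(values):
        for i in range(int(SAMPLE_RATE * TONE_DURATION)):
            sample = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))  # 16-bit PCM
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)          # mono
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))


if __name__ == "__main__":
    # Example: a short rising-then-falling data series rendered as five tones.
    write_sonification([1, 3, 5, 4, 2])
```

In this toy mapping, the number of tones corresponds to the number of data points and the overall frequency change corresponds to the range of the data, the two stimulus properties the abstract describes as manipulated.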