Sensory Integration from an Impossible Source: Perceiving Simulated Faces
2017
Recent research has shown that aero-tactile cues influence speech perception even without an acoustic signal (Bicevskis, Derrick & Gick, 2016): when participants viewed a bilabial articulation that co-occurred with a puff of air felt on the skin, they were significantly more likely to perceive it as aspirated. These results and others (e.g., Gick & Derrick, 2009) suggest that this integration is relatively automatic, enough so that it does not require the physical presence of the source to arise. However, it may be that perceivers are willing to attribute physical capabilities to these non-present sources because they are human and therefore possible sources of the aero-tactile cue. The current study examines whether aero-tactile information from an impossible source—a computer-animated face on a computer monitor—can affect perception of aspirated consonants. Sixteen native English speakers are shown a video of a computer-animated head performing a bilabial plosive but hear only babble noise through headphones. Some presentations are accompanied by a light, synchronous puff of air on the neck. Participants are asked to identify the syllable as either /ba/ or /pa/. Analysis of this two-alternative forced choice response task will be presented. Evidence of integration from an impossible source would support the idea that visual-tactile integration is an automatic process that occurs even in the absence of an interlocutor capable of producing the stimuli.
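The two-alternative forced choice design described above yields, for each condition, a count of /pa/ versus /ba/ responses. As an illustration only, the sketch below compares /pa/ response rates between puff and no-puff trials with a simple two-proportion z-test on invented counts; the counts, variable names, and test choice are all assumptions for demonstration, not the study's actual data or analysis.

```python
import math

# Hypothetical 2AFC response counts (illustrative only; NOT the study's data).
# Each trial: the participant sees the animated face, hears only babble noise,
# and labels the syllable "ba" or "pa". Trials with a synchronous air puff are
# predicted to draw more "pa" (aspirated) responses.
puff = {"pa": 88, "ba": 40}      # trials with an air puff on the neck
no_puff = {"pa": 55, "ba": 73}   # trials without an air puff

def prop_pa(counts):
    """Proportion of /pa/ responses in one condition."""
    return counts["pa"] / (counts["pa"] + counts["ba"])

def two_proportion_z(c1, c2):
    """Two-proportion z-test on /pa/ response rates (pooled variance)."""
    n1 = c1["pa"] + c1["ba"]
    n2 = c2["pa"] + c2["ba"]
    p1, p2 = c1["pa"] / n1, c2["pa"] / n2
    pooled = (c1["pa"] + c2["pa"]) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-tailed p from z
    return z, p_value

z, p = two_proportion_z(puff, no_puff)
print(f"P(/pa/ | puff)    = {prop_pa(puff):.3f}")
print(f"P(/pa/ | no puff) = {prop_pa(no_puff):.3f}")
print(f"z = {z:.2f}, two-tailed p = {p:.4f}")
```

In practice, repeated-measures 2AFC data like this is more commonly modeled with mixed-effects logistic regression (random intercepts per participant); the z-test here is just the simplest sketch of the comparison.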