Cortical Tracking of Vocoded Speech Streams with a Competing Speaker Based on Attentional Selection.

2019 
The perception of attended speech in the presence of multiple competing speakers has been a longstanding challenge for cochlear implant (CI) users. Since the brain can automatically track attended speech through auditory selective attention, this offers an effective strategy to incorporate a listener's intention and improve the processing of the attended speech stream for CI users in competing-talker environments. In this study, two speech streams (two stories narrated by talkers of different genders), processed by a vocoder simulating CI speech processing, were presented concurrently, and subjects were instructed to attend to one target stream. Four conditions at 3 dB, 0 dB, -3 dB and -6 dB target-to-masker ratios were presented to investigate the effect of the relative intensity of a competing speaker on the cortical tracking of attended vocoded speech. Electroencephalographic responses were measured and related to the slow amplitude fluctuations of the speech signals via a temporal response function. Results showed that the neural responses to attended speech stimuli differed from those to unattended stimuli, indicating the possibility of using the robust cortical representation of attentional modulation to classify attended and unattended vocoded speech streams.
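As a rough illustration of the analysis described above (not the authors' exact pipeline), a forward temporal response function can be estimated by ridge regression of time-lagged copies of a speech envelope onto an EEG channel; the stream whose envelope predicts the EEG best is taken as the attended one. The sampling rate, lag window, and regularization value below are illustrative assumptions.

```python
# Minimal sketch of envelope extraction and a forward TRF via ridge regression.
# Parameter values (64 Hz EEG rate, 0-400 ms lags, lam=1e2) are assumptions, not from the paper.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def speech_envelope(audio, fs_audio, fs_eeg=64, cutoff=8.0):
    """Slow amplitude envelope: Hilbert magnitude, low-pass filtered, downsampled to the EEG rate."""
    env = np.abs(hilbert(audio))
    b, a = butter(2, cutoff / (fs_audio / 2), btype="low")
    env = filtfilt(b, a, env)
    step = int(round(fs_audio / fs_eeg))
    return env[::step]

def lagged_design(env, lags):
    """Stack time-lagged copies of the envelope into a design matrix (samples x lags)."""
    n = len(env)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = env[:n - lag]
        else:
            X[:n + lag, j] = env[-lag:]
    return X

def fit_trf(env, eeg, fs_eeg=64, tmin=0.0, tmax=0.4, lam=1e2):
    """Ridge-regression TRF mapping one speech envelope to one EEG channel."""
    lags = np.arange(int(tmin * fs_eeg), int(tmax * fs_eeg) + 1)
    X = lagged_design(env, lags)
    w = np.linalg.solve(X.T @ X + lam * np.eye(len(lags)), X.T @ eeg)
    pred = X @ w                              # EEG predicted from this envelope
    r = np.corrcoef(pred, eeg)[0, 1]          # cortical tracking strength for this stream
    return lags / fs_eeg, w, r

# Usage sketch: with env_attended, env_unattended, and eeg at the same rate and length,
# compare tracking strength to decide which stream was attended.
# _, _, r_att = fit_trf(env_attended, eeg)
# _, _, r_unatt = fit_trf(env_unattended, eeg)
# attended_stream = 0 if r_att > r_unatt else 1
```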