Decoding Lip Movements During Continuous Speech using Electrocorticography.
2019
Recent work has shown that it is possible to decode aspects of continuously spoken speech from electrocorticographic (ECoG) signals recorded on the cortical surface. The ultimate objective is to develop a speech neuroprosthetic that can provide seamless, real-time synthesis of continuous speech directly from brain activity. Instead of decoding acoustic properties or classes of speech, such a neuroprosthetic might be realized by decoding the articulator movements associated with speech production, as recent work highlights a representation of articulator movement in ECoG signals. The aim of this work is to investigate the neural correlates of speech-related lip movements extracted from video recordings. We show how characteristics of lip movement can be decoded from ECoG signals and how lip-landmark positions can be predicted.
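The abstract does not specify the decoding model, so the following is only a minimal sketch of one plausible setup: predicting lip-landmark coordinates from lagged ECoG high-gamma features with a ridge-regression decoder. The channel count, landmark count, lag window, and regularization strength are illustrative assumptions, and synthetic arrays stand in for the actual ECoG and video-derived landmark data.

```python
# Hypothetical sketch (not the authors' implementation): decode lip-landmark
# trajectories from ECoG high-gamma features with a lagged linear (ridge) model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic stand-ins: 2000 time frames, 64 ECoG channels (high-gamma power),
# 20 lip landmarks with x/y coordinates (40 outputs per frame).
n_frames, n_channels, n_outputs = 2000, 64, 40
ecog = rng.standard_normal((n_frames, n_channels))
lips = rng.standard_normal((n_frames, n_outputs))

def lagged_features(x, lags):
    """Stack the current frame with `lags` preceding frames into one feature row."""
    rows = [np.roll(x, shift=k, axis=0) for k in range(lags + 1)]
    stacked = np.concatenate(rows, axis=1)
    return stacked[lags:]  # drop rows contaminated by the wrap-around of np.roll

lags = 10  # assumed ~100 ms of neural context at a 100 Hz frame rate
X = lagged_features(ecog, lags)
Y = lips[lags:]

# Chronological train/test split (no shuffling, to avoid leaking future frames).
split = int(0.8 * len(X))
model = Ridge(alpha=1.0).fit(X[:split], Y[:split])
pred = model.predict(X[split:])

# Per-coordinate correlation between predicted and true landmark trajectories.
r = [np.corrcoef(pred[:, i], Y[split:, i])[0, 1] for i in range(n_outputs)]
print(f"mean correlation across landmark coordinates: {np.mean(r):.3f}")
```

With real data, the synthetic arrays would be replaced by aligned ECoG feature matrices and lip landmarks tracked from the video recordings; the chronological split and lagged design matrix are standard choices for this kind of trajectory decoding.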