Read What You Touch with Intelligent Audio System for Non-Visual Interaction

2016 
Slate-type devices allow Individuals with Blindness or Severe Visual Impairment (IBSVI) to read in place with the touch of their fingertip by audio-rendering the words they touch. Such technologies support spatial cognition while reading. However, users have to move their fingers slowly or they may lose their place on screen, and IBSVI may wander between lines without realizing it. We addressed these two interaction problems by introducing a dynamic speech-touch interaction model and an intelligent reading support system. With this model, the speed of the speech changes dynamically with the user’s finger speed. The proposed model is composed of (1) an Audio Dynamics Model and (2) an Off-line Speech Synthesis Technique. The intelligent reading support system predicts the direction of reading, corrects the rendered word if the user drifts, and alerts the user with a sonic gutter to keep them from straying off the reading line. We tested the new audio dynamics model, the sonic gutter, and the reading support model in two user studies. The participants’ feedback helped us fine-tune the parameters of the two models. A decomposition study was conducted to evaluate the main components of the system. The results showed that both intelligent reading support and tactile feedback are required to achieve the best performance in terms of efficiency and effectiveness. Finally, we ran an evaluation study in which the reading support system was compared to VoiceOver technologies. The results favored the reading support system with its audio dynamics and intelligent reading support components.
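The core of the audio dynamics model is a coupling between finger speed and speech rate. A minimal sketch of one such coupling is below; the function name, gain, and clamping range are illustrative assumptions, not details taken from the paper.

```python
def speech_rate(finger_speed, base_rate=1.0, gain=0.5,
                min_rate=0.5, max_rate=2.0):
    """Map the user's finger speed (relative to a nominal reading
    speed of 1.0) to a text-to-speech playback rate.

    A faster-moving finger raises the rate, a slower one lowers it,
    and the result is clamped to an intelligible range so speech
    never becomes too slow or too fast to follow. All constants here
    are hypothetical tuning parameters.
    """
    rate = base_rate + gain * (finger_speed - 1.0)
    return max(min_rate, min(max_rate, rate))
```

For example, a finger moving at the nominal speed yields the base rate of 1.0, while a finger moving three times faster is clamped at the 2.0 ceiling rather than producing unintelligibly fast speech.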