Data-driven Texture Modeling and Rendering on Electrovibration Display

2019 
With the introduction of variable friction displays, new possibilities have emerged for haptic texture rendering on flat surfaces. In this work, we propose a data-driven method for realistic texture rendering on an electrovibration display. We first describe a motorized linear tribometer designed to collect lateral frictional forces from textured surfaces under various scanning velocities and normal forces. We then propose an inverse dynamics model of the display that describes its output-to-input relationship using nonlinear autoregressive neural networks with exogenous (external) input. Forces measured while applying a pseudo-random binary signal to the display are used to train each network under the corresponding experimental condition. In addition, we propose a two-step interpolation scheme to estimate actuation signals for arbitrary conditions under which no prior data have been collected. A comparison between real and virtual forces in the frequency domain shows promising results for recreating virtual textures similar to the real ones, while also revealing the capabilities and limitations of the proposed method. We also conducted a human user study to compare the performance of our neural-network-based method with that of a record-and-playback method. The results showed that the similarity between the real and virtual textures generated by our approach was significantly higher.
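To make the modeling idea concrete, the sketch below illustrates (in Python) the general form of a series-parallel NARX-style inverse model: lagged measured forces and past actuation samples are regressed onto the current actuation value, using a pseudo-random binary excitation for training. This is not the authors' implementation; the lag orders, network size, toy "display" dynamics, and use of scikit-learn's MLPRegressor are illustrative assumptions only.

```python
# Minimal sketch of a NARX-like inverse dynamics model (assumed setup, not the
# paper's code): predict the actuation signal v[k] from recent forces f[.] and
# past actuations v[.], after exciting the system with a PRBS.
import numpy as np
from sklearn.neural_network import MLPRegressor

def build_regressors(force, voltage, n_f=5, n_v=5):
    """Stack lagged forces and lagged voltages into NARX feature vectors."""
    start = max(n_f, n_v)
    X, y = [], []
    for k in range(start, len(force)):
        feats = np.concatenate([force[k - n_f:k + 1],    # f[k-n_f] .. f[k]
                                voltage[k - n_v:k]])      # v[k-n_v] .. v[k-1]
        X.append(feats)
        y.append(voltage[k])                              # target: current input v[k]
    return np.asarray(X), np.asarray(y)

# Synthetic training data: PRBS actuation driving a toy nonlinear "display".
rng = np.random.default_rng(0)
N = 5000
prbs = rng.integers(0, 2, N).astype(float)                # pseudo-random binary signal
force = np.zeros(N)
for k in range(2, N):                                     # illustrative dynamics only
    force[k] = 0.5 * force[k - 1] - 0.1 * force[k - 2] + 0.8 * np.tanh(prbs[k - 1])

X, y = build_regressors(force, prbs)
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
net.fit(X, y)                                             # learn the inverse mapping

# At rendering time, such a model would map a desired (recorded) force
# trajectory back to an actuation signal for the display.
print("training fit R^2:", net.score(X, y))
```

One model of this kind would be trained per recorded scanning-velocity/normal-force condition, with the paper's two-step interpolation scheme estimating actuation signals for intermediate conditions.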