Humans Can Integrate Augmented Reality Feedback in Their Sensorimotor Control of a Robotic Hand

2017 
Tactile feedback is pivotal for grasping and manipulation in humans, yet providing functionally effective sensory feedback to prosthesis users remains an open challenge. Past paradigms were mostly based on vibro- or electrotactile stimulation. However, the tactile sensitivity of the targeted body parts (usually the forearm) is much lower than that of the hand and fingertips, restricting the amount of information that can be conveyed through this channel. Visual feedback is the most investigated modality in motor-learning studies, where it has shown positive effects on learning both simple and complex tasks; however, it has not been exploited in prosthetics due to technological limitations. Here, we investigated whether visual information provided in the form of augmented reality (AR) feedback can be integrated by able-bodied participants into their sensorimotor control of a pick-and-lift task performed with a robotic hand. To this end, we provided continuous visual feedback of grip force and hand closure: each variable was mapped to the length of one of the two axes of an ellipse visualized on the screen of wearable single-eye AR glasses. We observed changes in behavior when subtle (i.e., not announced to the participants) manipulations of the AR feedback were introduced, indicating that the participants integrated the artificial feedback into their sensorimotor control of the task. These results demonstrate that effective information can be delivered through AR feedback in a compact and wearable fashion. This feedback modality may be exploited to deliver sensory feedback to amputees in a clinical scenario.
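
The abstract describes mapping grip force and hand closure each to the length of one ellipse axis on the AR display. The sketch below illustrates one way such a mapping could be implemented; it is not taken from the paper, and the value ranges, pixel limits, and function names are assumptions chosen for illustration.

```python
# Minimal sketch (assumed, not from the paper) of mapping two sensor signals
# to the axis lengths of an ellipse rendered on an AR display.

def to_axis_length(value, value_min, value_max, axis_min_px=10, axis_max_px=200):
    """Linearly map a sensor value to an ellipse axis length in pixels."""
    # Clamp to the expected sensor range, then rescale to pixel units.
    clamped = min(max(value, value_min), value_max)
    fraction = (clamped - value_min) / (value_max - value_min)
    return axis_min_px + fraction * (axis_max_px - axis_min_px)

def ellipse_axes(grip_force_n, hand_closure_frac):
    """Return (horizontal, vertical) ellipse axis lengths for the AR overlay.

    grip_force_n: measured grip force in newtons (0-15 N range assumed).
    hand_closure_frac: hand aperture closure, 0.0 (open) to 1.0 (fully closed).
    """
    horizontal = to_axis_length(grip_force_n, 0.0, 15.0)
    vertical = to_axis_length(hand_closure_frac, 0.0, 1.0)
    return horizontal, vertical

if __name__ == "__main__":
    # Example frame: moderate grip force, hand half closed.
    print(ellipse_axes(grip_force_n=5.0, hand_closure_frac=0.5))
```

With a linear mapping like this, each new sensor sample updates one axis independently, so the participant can read both variables from the ellipse shape at a glance.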