Sensor prediction and grasp stability evaluation for in-hand manipulation

2013 
Handling objects with a single hand without dropping them is challenging for a robot. A possible way to aid motion planning is to predict the sensory outcome of different motions. Sequences of movements can be performed as an offline simulation, and the predicted sensory results can be used to evaluate whether the desired goal is achieved. In particular, the task in this paper is to roll a sphere between the fingertips of the dexterous hand of the humanoid robot TWENDY-ONE. First, a forward model is developed that predicts the touch state resulting from the in-hand manipulation. As such a model is difficult to derive analytically, it is obtained through machine learning. To collect real-world training data, a dataglove is used to control the robot in a master-slave fashion. The learned model accurately predicts the course of the touch state during both successful and unsuccessful in-hand manipulations. In a second step, it is shown that this simulated sequence of sensor states can serve as input to a stability assessment model, which accurately predicts whether a grasp is stable or will result in dropping the object. In a final step, a more powerful grasp stability evaluator is introduced, which works for our task regardless of the sphere diameter.
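The two-stage evaluation described above (a learned forward model rolled out over a candidate motion sequence, whose predicted touch states are then scored by a stability classifier) could be sketched roughly as follows. This is a minimal illustration under assumed model types, feature shapes, and function names (forward_model, stability_classifier, rollout_touch_states are not taken from the paper).

```python
# Minimal sketch of the two-stage evaluation described in the abstract.
# All names, shapes, and model choices are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVC

# Stage 1: forward model mapping (current touch state, commanded finger motion)
# to the predicted next touch state, trained on dataglove tele-operation data.
forward_model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)

# Stage 2: stability classifier mapping a whole predicted touch-state sequence
# to a stable / dropped label.
stability_classifier = SVC(kernel="rbf")

# Both models would be fit on logged data before use, e.g.:
# forward_model.fit(X_train, next_touch_train)
# stability_classifier.fit(trajectory_features_train, stable_labels_train)

def rollout_touch_states(touch_0, motion_sequence):
    """Simulate the touch-state trajectory for a candidate motion sequence
    by repeatedly applying the learned forward model (offline simulation)."""
    touch = np.asarray(touch_0, dtype=float)
    trajectory = [touch]
    for motion in motion_sequence:
        x = np.concatenate([touch, motion]).reshape(1, -1)
        touch = forward_model.predict(x)[0]
        trajectory.append(touch)
    return np.stack(trajectory)

def is_grasp_stable(touch_0, motion_sequence):
    """Evaluate a candidate in-hand manipulation without executing it:
    predict the sensor sequence, then classify it as stable or not."""
    trajectory = rollout_touch_states(touch_0, motion_sequence)
    features = trajectory.flatten().reshape(1, -1)
    return bool(stability_classifier.predict(features)[0])
```

In this sketch, stability is assessed from the whole predicted trajectory rather than a single sensor snapshot, mirroring the paper's idea of feeding the simulated sequence of sensor states into the stability evaluator.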