Robust Regrasping against Error of Grasping for Bin-picking and Kitting

2021 
This study proposes a method for robust regrasping of an object by a dual-arm robot with general-purpose hands, designed to tolerate grasping error. One arm, the hand-over arm, passes the object to the other arm, the receiver arm. Because the hand-over arm first picks up the object with a general-purpose hand, the grasping error must be accounted for to raise the success rate of regrasping. In the online phase, the proposed method positions the object at an optimal regrasping pose using an image-based visual servoing (IBVS) approach, which reduces the effect of the grasping error. In the planning phase, the method computes this optimal pose from a 3D model of the target object by maximizing the minimum singular value of the IBVS image Jacobian, yielding high positioning accuracy. To regrasp objects of various shapes robustly against image noise and changes in lighting, the image Jacobian is computed by numerical differentiation over an actual data set. Ordinarily, a large number of data sets, one per candidate grasp, would be required to compute the image Jacobian; to reduce this number, we propose a conversion method for the image Jacobian that requires only one data set corresponding to a single representative grasp. Experimental results show that the proposed method regrasps target objects with general-purpose hands at high success rates and positions the target object with less than 0.7 mm of positioning error.
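To illustrate the two ideas the abstract names, estimating the image Jacobian by numerical differentiation from recorded data and selecting the regrasping pose that maximizes its minimum singular value, the sketch below shows one possible formulation. It is not the authors' implementation: the function names, the data layout (pose perturbations paired with image-feature vectors), and the least-squares fit are assumptions made for illustration only.

```python
import numpy as np

def estimate_image_jacobian(poses, features):
    """Estimate an image Jacobian by numerical differentiation (least-squares
    fit over recorded data). Hypothetical data layout, not from the paper:
      poses:    (N, 6) pose perturbations around the nominal regrasp pose
      features: (N, M) corresponding image-feature vectors
    Returns an (M, 6) Jacobian mapping pose deviations to feature deviations."""
    dq = poses - poses.mean(axis=0)         # pose deviations
    df = features - features.mean(axis=0)   # feature deviations
    # Solve df ≈ dq @ J.T for J in a least-squares sense
    J_T, *_ = np.linalg.lstsq(dq, df, rcond=None)
    return J_T.T

def min_singular_value(J):
    """Smallest singular value of the Jacobian; a larger value indicates a
    better-conditioned visual-servoing configuration."""
    return np.linalg.svd(J, compute_uv=False).min()

def select_regrasp_pose(candidate_data):
    """Pick the candidate regrasping pose whose estimated image Jacobian has
    the largest minimum singular value (candidate_data maps a pose label to
    its recorded (poses, features) arrays; hypothetical structure)."""
    best_pose, best_score = None, -np.inf
    for pose, (poses, features) in candidate_data.items():
        score = min_singular_value(estimate_image_jacobian(poses, features))
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score
```

The paper's conversion method, which reuses a single data set from one representative grasp for all grasp candidates, is not reproduced here; the sketch only conveys the singular-value criterion and the data-driven Jacobian estimate.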