Modeling of Human Welders' Operations in Virtual Reality Human-Robot Interaction

2019 
This letter presents a virtual reality (VR) human–robot interaction welding system that allows human welders to manipulate a welding robot and perform welding tasks naturally and intuitively via consumer-grade VR hardware (HTC Vive). In this system, human welders' operations are captured by motion-tracked handle controllers and used as commands to teleoperate a 6-DoF industrial robot (UR-5) and to request welding current from a controllable welding power supply (Liburdi Pulsweld P200). The three-dimensional (3-D) working scene is rendered in real time from feedback information and presented to the human welder through a motion-tracked head-mounted display. To compensate for the time delay between the commanded motion and the real motion of the robot, a hidden Markov model is proposed to model and predict human welders' operations. The K-means clustering algorithm is applied to cluster human welders' operation data (traveling speed) into latent states. Based on the developed prediction algorithm, the motion of human welders is predicted with a root mean square error (RMSE) between 2.1 and 4.6 mm/s. The position data used as the final commands to teleoperate the robot are predicted with an RMSE between 1.1 and 2.3 mm. This letter presents a general cyber-physical model for human–robot interactive welding based on VR, building a foundation for welding robot teleoperation.
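As a rough illustration of the delay-compensation idea described above, the sketch below clusters logged traveling-speed samples into latent states with K-means, estimates an HMM-style state transition matrix from the resulting state sequence, and predicts the next speed one step ahead. This is not the authors' implementation; the number of states, control period, and all function names are illustrative assumptions.

```python
# Hedged sketch of the K-means + HMM prediction pipeline (assumptions noted inline).
import numpy as np
from sklearn.cluster import KMeans

def fit_speed_model(speeds, n_states=5):
    """Cluster 1-D traveling speeds (mm/s) into latent states and
    estimate state-to-state transition probabilities from the labels."""
    speeds = np.asarray(speeds, dtype=float).reshape(-1, 1)
    km = KMeans(n_clusters=n_states, n_init=10, random_state=0)
    states = km.fit_predict(speeds)                    # latent-state labels
    trans = np.full((n_states, n_states), 1e-6)        # smoothed transition counts
    for s_prev, s_next in zip(states[:-1], states[1:]):
        trans[s_prev, s_next] += 1.0
    trans /= trans.sum(axis=1, keepdims=True)          # row-normalize to probabilities
    return km, trans

def predict_next_speed(km, trans, current_speed):
    """Predict the next traveling speed as the centroid of the most
    likely successor state of the current observation."""
    state = int(km.predict(np.array([[current_speed]]))[0])
    next_state = int(np.argmax(trans[state]))
    return float(km.cluster_centers_[next_state, 0])

if __name__ == "__main__":
    # Synthetic speed log (mm/s); real data would come from the VR controllers.
    rng = np.random.default_rng(0)
    logged_speeds = 5.0 + 2.0 * np.sin(np.linspace(0, 6, 400)) + rng.normal(0, 0.3, 400)
    km, trans = fit_speed_model(logged_speeds)
    v_next = predict_next_speed(km, trans, current_speed=6.2)
    dt = 0.05                                          # assumed 50 ms control period
    delta_position_mm = v_next * dt                    # predicted position increment
    print(f"predicted speed: {v_next:.2f} mm/s, position step: {delta_position_mm:.3f} mm")
```

In this sketch the predicted speed is integrated over one control period to produce the position increment sent to the robot, which is how a speed prediction could pre-compensate the command-to-motion delay; the paper's actual HMM formulation and parameters are not reproduced here.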