Maneuvers Under Estimation of Human Postures for Autonomous Navigation of Robot KUKA YouBot

2021 
We present a successful demonstration of autonomous navigation based on maneuvers triggered by specific human postures for an omnidirectional KUKA YouBot robot. The integration of human posture detection and navigation capabilities in the robot was accomplished by combining the Robot Operating System (ROS) with the open-source computer vision library OpenCV. ROS allows the algorithms to run on both real-time and simulated platforms, while OpenCV enables the recognition of human posture signals through the Faster R-CNN (regions with convolutional neural networks) deep learning approach, which, for its application in OpenCV, is translated to SURF (speeded-up robust features), one of the most widely used algorithms for extracting points of interest in image recognition. The main contribution of this work is to show that human posture estimation is a promising method for providing intelligence in the autonomous navigation of the KUKA YouBot, since the robot learns from human postures and is capable of performing a desired task during navigation or any other activity.
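To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch of a posture-triggered navigation node, assuming ROS 1 with rospy, a camera topic /camera/image_raw, a velocity topic /cmd_vel, and an OpenCV build that provides SURF (cv2.xfeatures2d). The posture template images, matching threshold, and velocity values are illustrative assumptions, not details taken from the paper.

```python
#!/usr/bin/env python
# Sketch: SURF-based posture recognition driving an omnidirectional base.
# Assumptions (not from the paper): ROS 1, opencv-contrib-python with SURF,
# images on /camera/image_raw, velocity commands on /cmd_vel.

import cv2
import rospy
from cv_bridge import CvBridge
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Image

# Hypothetical grayscale template images, one per posture the robot reacts to.
TEMPLATES = {
    "stop": cv2.imread("stop_posture.png", cv2.IMREAD_GRAYSCALE),
    "advance": cv2.imread("advance_posture.png", cv2.IMREAD_GRAYSCALE),
}

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
matcher = cv2.BFMatcher(cv2.NORM_L2)
bridge = CvBridge()

# Pre-compute SURF descriptors for each posture template.
template_desc = {}
for name, img in TEMPLATES.items():
    _, desc = surf.detectAndCompute(img, None)
    template_desc[name] = desc


def classify_posture(gray):
    """Return the template name with the most good SURF matches, or None."""
    _, desc = surf.detectAndCompute(gray, None)
    if desc is None:
        return None
    best_name, best_count = None, 0
    for name, tdesc in template_desc.items():
        if tdesc is None:
            continue
        matches = matcher.knnMatch(desc, tdesc, k=2)
        # Lowe's ratio test keeps only distinctive matches.
        good = [p[0] for p in matches
                if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
        if len(good) > best_count:
            best_name, best_count = name, len(good)
    return best_name if best_count > 20 else None  # illustrative threshold


def image_callback(msg):
    gray = bridge.imgmsg_to_cv2(msg, desired_encoding="mono8")
    posture = classify_posture(gray)
    cmd = Twist()
    if posture == "advance":
        cmd.linear.x = 0.2  # forward; lateral motion would use linear.y
    # "stop" or no detection leaves the command at zero velocity.
    cmd_pub.publish(cmd)


rospy.init_node("posture_navigation_sketch")
cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
rospy.Subscriber("/camera/image_raw", Image, image_callback, queue_size=1)
rospy.spin()
```

In a real setup, the template-matching step would be replaced or complemented by the Faster R-CNN posture detector mentioned above, with the recognized posture mapped to the same velocity-command interface.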