Autonomous UAV Navigation for active perception of targets in uncertain and cluttered environments

2020 
The use of Small Unmanned Aerial Vehicles (sUAVs) has grown exponentially owing to an increasing number of autonomous capabilities. Automated functions include return to home at critical energy levels, collision avoidance, take-off and landing, and target tracking. However, the application of sUAVs in real-world and time-critical domains, such as Search and Rescue (SAR), is still limited. Operators of sUAVs are usually tasked with deciding on the next sequence of navigation commands to interact with the environment using visual telemetry. However, prolonged use of these interactive systems is claimed to produce fatigue and sensory overload. In SAR applications, the overarching aim of autonomous sUAV navigation is the quick localisation, identification and quantification of victims to prioritise emergency response in affected zones. Nevertheless, target detection onboard a sUAV is challenging because of noise in the data, low image resolution, illumination conditions, and partial (or full) occlusion of the victims. This paper presents an autonomous Sequential Decision Process (SDP) for sUAV navigation that incorporates target detection uncertainty from vision-based cameras. The SDP is modelled as a Partially Observable Markov Decision Process (POMDP) and solved online using the Adaptive Belief Tree (ABT) algorithm. In particular, a detailed model of target detection uncertainty from deep learning-based models is shown. The presented formulation is tested under Software in the Loop (SITL) through Gazebo, Robot Operating System (ROS), and PX4 firmware. A Hardware in the Loop (HITL) implementation is also presented using an Intel Myriad Vision Processing Unit (VPU) device and ROS. Tests are conducted in a simulated SAR GPS-denied scenario, aimed at finding a person at different levels of location and pose uncertainty.
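To illustrate the idea of incorporating detection uncertainty into a POMDP belief, the sketch below shows a minimal discrete Bayes filter over candidate target cells. This is not the paper's actual model: the grid-cell state space, the true-positive rate `p_tp`, and the false-positive rate `p_fp` are all hypothetical stand-ins for the detailed deep-learning-based detection uncertainty model the paper describes.

```python
def detection_likelihood(cell, uav_cell, detected, p_tp=0.8, p_fp=0.05):
    """Hypothetical sensor model: probability of the observation
    (`detected` True/False) given the target occupies `cell` while
    the UAV observes `uav_cell`. A true detection is only possible
    when the UAV looks at the target's cell; elsewhere only false
    positives can occur."""
    in_view = (cell == uav_cell)
    p_detect = p_tp if in_view else p_fp
    return p_detect if detected else 1.0 - p_detect

def belief_update(belief, uav_cell, detected):
    """Discrete Bayes filter: reweight each candidate target cell by
    the likelihood of the new observation, then renormalise."""
    posterior = {c: p * detection_likelihood(c, uav_cell, detected)
                 for c, p in belief.items()}
    z = sum(posterior.values())
    return {c: p / z for c, p in posterior.items()}

# Usage: uniform prior over four cells; a positive detection while
# observing cell 0 concentrates the belief on that cell.
belief = {c: 0.25 for c in range(4)}
belief = belief_update(belief, uav_cell=0, detected=True)
```

An online solver such as ABT would use beliefs like this to choose the next navigation action, trading off information gathering against reaching the target.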