A 64mW DNN-based Visual Navigation Engine for Autonomous Nano-Drones

2019 
Fully-autonomous miniaturized robots (e.g., drones) with artificial intelligence (AI) based visual navigation capabilities are extremely challenging drivers of Internet-of-Things edge intelligence. Visual navigation based on AI approaches, such as deep neural networks (DNNs), is becoming pervasive for standard-size drones, but is considered out of reach for nano-drones with a size of a few cm². In this work, we present the first (to the best of our knowledge) demonstration of a navigation engine for autonomous nano-drones capable of closed-loop end-to-end DNN-based visual navigation. To achieve this goal, we developed a complete methodology for the parallel execution of complex DNNs directly on board resource-constrained, milliwatt-scale nodes. Our system is based on GAP8, a novel parallel ultra-low-power computing platform, and a 27-g commercial, open-source CrazyFlie 2.0 nano-quadrotor. As part of our general methodology, we discuss the software mapping techniques that enable the state-of-the-art deep convolutional neural network presented in DroNet to be fully executed aboard within a strict real-time constraint, with no compromise in terms of flight results, while all processing is done with only 64 mW on average. Our navigation engine is flexible and can be used to span a wide performance range: at its peak performance corner, it achieves 18 frames/s while still consuming on average just 3.5% of the power envelope of the deployed nano-aircraft. To share our key findings with the embedded and robotics communities and foster further developments in autonomous nano-UAVs, we publicly release all our code, datasets, and trained networks.
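
The closed-loop navigation summarized above can be sketched as follows: a DroNet-style convolutional network produces a steering angle and a collision probability for each camera frame, and these two outputs are filtered and mapped to forward-velocity and yaw-rate setpoints for the nano-quadrotor. The Python outline below is only an illustration under these assumptions, not the authors' on-board GAP8 implementation; all function names, variable names, and constants are hypothetical.

    # Illustrative sketch (hypothetical names and constants) of the closed-loop
    # visual navigation described in the abstract: a DroNet-style CNN yields a
    # steering angle and a collision probability, which are low-pass filtered
    # and mapped to forward-velocity and yaw-rate setpoints.

    ALPHA = 0.7    # low-pass filter coefficient (assumed value)
    V_MAX = 1.0    # maximum forward velocity in m/s (assumed value)

    def control_step(frame, cnn, state):
        """One perception-to-control iteration of the closed loop."""
        steering, p_collision = cnn(frame)   # the two network outputs
        # Smooth both outputs so the commanded setpoints do not jump between frames.
        state["steer"] = ALPHA * state["steer"] + (1 - ALPHA) * steering
        state["p_coll"] = ALPHA * state["p_coll"] + (1 - ALPHA) * p_collision
        # Slow down as the predicted collision probability rises; steer via yaw rate.
        forward_velocity = V_MAX * (1.0 - state["p_coll"])
        yaw_rate = state["steer"]
        return forward_velocity, yaw_rate

In the system described in the paper, the equivalent loop executes entirely aboard the nano-drone on the GAP8 platform, within the real-time and power constraints quoted above.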