Collaborative Semantic Understanding and Mapping Framework for Autonomous Systems

2020 
Performing collaborative semantic mapping is a critical challenge for cooperative robots seeking a comprehensive contextual understanding of their surroundings. This paper bridges the gap between advances in collaborative geometric mapping, which rely on fusing pure geometric information, and single-robot semantic mapping, which focuses on integrating continuous raw sensor data. A novel hierarchical collaborative probabilistic semantic mapping framework is proposed, with the problem formulated in a distributed setting. The key novelty of this work is the modelling of the hierarchical semantic map fusion framework and the mathematical derivation of its probability decomposition. At the single-robot level, a semantic point cloud is obtained by combining information from heterogeneous sensors and used to generate local semantic maps. At the collaborative level, local maps are shared among robots for global semantic map fusion. Since the voxel correspondence between local maps is unknown, an Expectation-Maximization approach is proposed to estimate the hidden data association. Bayes' rule is then applied to update the semantic and occupancy probabilities. Experimental results on UAV (Unmanned Aerial Vehicle) and UGV (Unmanned Ground Vehicle) platforms show high-quality global semantic maps, demonstrating the framework's accuracy and utility in practical missions.
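The abstract's Bayesian update step can be illustrated with a minimal sketch. Assuming (hypothetically) that each matched voxel pair carries a categorical distribution over semantic labels and a log-odds occupancy value, and that the two robots' observations are conditionally independent, Bayes' rule reduces to an element-wise product (renormalized) for the label distribution and an addition for the log-odds. The function names and the three-label example below are illustrative, not taken from the paper.

```python
import numpy as np

def fuse_semantic(p_a, p_b):
    """Bayesian fusion of two independent semantic label distributions
    for a matched voxel pair: element-wise product, then renormalize."""
    fused = np.asarray(p_a, dtype=float) * np.asarray(p_b, dtype=float)
    return fused / fused.sum()

def fuse_occupancy(logodds_a, logodds_b):
    """Occupancy fusion in log-odds form: under a uniform prior
    (log-odds 0), independent observations simply add."""
    return logodds_a + logodds_b

# Example: robot A believes label 1 ("building", say) is likely,
# robot B mildly agrees; fusion sharpens the shared estimate.
p = fuse_semantic([0.2, 0.7, 0.1], [0.3, 0.5, 0.2])
occ = fuse_occupancy(0.4, 0.3)
```

In a full pipeline this per-voxel update would run only on correspondences weighted by the EM data-association step, since voxel matches between local maps are not known a priori.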