Augmented reality integration of fused LiDAR and spatial mapping

2018 
Fusing 3D scenes generated from multiple, spatially distributed sensors produces a higher quality data product with fewer shadows or islands in the data. As an example, while an airborne LiDAR system scans the exterior of a structure, a spatial mapping system generates a high-resolution scan of the interior. Fusing the exterior and interior scanned data streams allows the construction of a fully realized 3D representation of the environment by asserting an absolute reference frame. The implementation of this fused system allows simultaneous real-time streaming of point clouds from multiple assets, tracking of personnel and assets in the fused 3D space, and visualization of the result on a mixed-reality device. Several challenges were solved: 1) tracking and synchronization of multiple independent assets; 2) identification of the network throughput required for large data sets; 3) coordinate transformation of the collected point cloud data into a common reference frame; and 4) fused representation of all collected data. We leveraged our advancements in real-time point cloud processing to allow a user to view the single fused 3D image on a HoloLens. The user can show or hide the fused features of the image, as well as manipulate it in six degrees of freedom and scale it. This fused 3D image lets a user see a virtual representation of their immediate surroundings, or allows remote users to gain knowledge of a distant location.
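The abstract does not detail the fusion pipeline itself; the sketch below is only an illustrative outline, under assumed details, of the third challenge: transforming each sensor's point cloud into a common absolute reference frame and stacking the results. The sensor poses, helper names, and placeholder data are hypothetical and are not taken from the paper.

```python
import numpy as np


def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


def to_world_frame(points: np.ndarray, sensor_to_world: np.ndarray) -> np.ndarray:
    """Map an (N, 3) point cloud from a sensor's local frame into the shared world frame."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
    return (sensor_to_world @ homogeneous.T).T[:, :3]


def fuse_point_clouds(clouds_with_poses):
    """Fuse clouds from multiple sensors by transforming each into the common frame and stacking."""
    return np.vstack([to_world_frame(pts, pose) for pts, pose in clouds_with_poses])


# Hypothetical example: an exterior (airborne LiDAR) scan and an interior (spatial
# mapping) scan, each with an assumed pose relative to the absolute reference frame.
exterior_scan = np.random.rand(1000, 3) * 50.0   # placeholder exterior points (meters)
interior_scan = np.random.rand(2000, 3) * 10.0   # placeholder interior points (meters)
exterior_pose = make_transform(np.eye(3), np.array([0.0, 0.0, 0.0]))
interior_pose = make_transform(np.eye(3), np.array([12.0, 4.0, 0.0]))  # assumed interior offset

fused = fuse_point_clouds([(exterior_scan, exterior_pose), (interior_scan, interior_pose)])
print(fused.shape)  # combined point cloud expressed in the common reference frame
```

In practice the sensor-to-world poses would come from the tracking and synchronization step rather than being fixed constants, and the fused cloud would then be streamed to the mixed-reality viewer.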