Near-Infrared, Depth, Material: Towards a Trimodal Time-of-Flight Camera

2021 
Time-of-Flight (ToF) cameras are active sensors that capture both the light intensity reflected by each observed point in the scene and the distance between these points and the camera. Enhancing intensity images with a depth modality enables capturing surfaces in 3D and broadens the applicability of these sensors. Nevertheless, high-level information still needs to be extracted from the data stream in order to accomplish tasks such as recognition or classification. Ideally, the semantic gap between sensor output and high-level requirements should be as small as possible, in order to reduce both computational cost and failure probability. An additional depth modality helps in this regard, but there are further cues visible to a ToF sensor that have so far remained underexploited. In this paper, we take the first steps towards a trimodal ToF camera, which adds a valuable material modality to the classical intensity and depth modalities. To this end, ToF raw data are used to obtain Fourier samples of the material impulse response function (MIRF) to modulated illumination. The MIRF depends on the surface- and sub-surface-level scattering mechanisms of the material and can therefore be used to distinguish materials of different nature. Consequently, distinctive feature vectors can be obtained from the Fourier measurements. Additionally, including the incidence angle in the feature vectors captures the dependency of the MIRF on this parameter. Experimental validation confirms the feasibility of this approach. We also constructed a live demonstrator of our trimodal ToF camera.
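As an illustration of the measurement principle described above, the sketch below shows how a complex Fourier sample can be demodulated from standard 4-phase ToF raw correlation data, and how such samples at several modulation frequencies might be assembled into a material feature vector that also includes the incidence angle. The function names, the 4-phase demodulation scheme, and the normalization choices are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def fourier_sample(raw4):
    """Demodulate one complex Fourier sample from 4-phase ToF raw data.

    raw4: correlation samples at phase offsets 0, 90, 180, 270 degrees
    (a common ToF readout scheme; assumed here for illustration).
    """
    a0, a1, a2, a3 = raw4
    i = a0 - a2               # in-phase component (offset cancels)
    q = a1 - a3               # quadrature component
    amp = 0.5 * np.hypot(i, q)
    phase = np.arctan2(q, i)
    return amp * np.exp(1j * phase)

def material_feature(raw_per_freq, incidence_angle_rad):
    """Build a feature vector from Fourier samples at several modulation
    frequencies, plus the incidence angle (hypothetical construction)."""
    samples = np.array([fourier_sample(r) for r in raw_per_freq])
    amps = np.abs(samples)
    amps = amps / amps.max()            # normalize away albedo/distance scale
    phases = np.unwrap(np.angle(samples))
    phases = phases - phases[0]         # relative phase removes common delay
    return np.concatenate([amps, phases, [incidence_angle_rad]])
```

For a synthetic correlation signal `B + A*cos(phi - theta)`, `fourier_sample` recovers amplitude `A` and phase `phi` exactly, which is the Fourier measurement the material classification would build on.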