Siamese CNNs for RGB-LWIR Disparity Estimation
2019
Currently, for the task of color (RGB) and thermal infrared (LWIR) disparity estimation, handcrafted feature descriptors such as mutual information are the best-performing methods. In this work, we aim to assess whether convolutional neural networks (CNNs) can achieve competitive performance on this task. We developed an architecture made of two subnetworks, each consisting of the same siamese network but taking different image patches as input. Each siamese network searches, in feature space, for the disparity between the left and right patch. The outputs of the two subnetworks are summed so that we can be more confident in the predicted disparity by enforcing left-right consistency. We show that having two subnetworks working together in parallel to produce the final prediction achieves better performance than a single subnetwork by itself. We tested our method on the LITIV dataset and found the results competitive with handcrafted feature descriptors. The source code of our method will be available online upon publication.
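A minimal sketch of the kind of two-stream patch-matching architecture the abstract describes, not the authors' actual code: a small CNN embeds a reference patch and a set of disparity-shifted candidate patches from the other modality, similarity scores are computed per candidate disparity, and the scores from an RGB-to-LWIR subnetwork and an LWIR-to-RGB subnetwork are summed to enforce left-right consistency. All layer sizes, the cosine-similarity score, and the modality-specific branches (used here because RGB and LWIR have different channel counts, a loosening of strict weight sharing) are illustrative assumptions.

```python
# Illustrative sketch only; class names, layer sizes, and the scoring
# function are assumptions, not the published architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureExtractor(nn.Module):
    """Small CNN mapping an image patch to a unit-norm feature vector."""
    def __init__(self, in_channels, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pool spatial dims to one vector
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, patch):
        f = self.net(patch).flatten(1)
        return F.normalize(self.fc(f), dim=1)  # cosine-comparable features


class SiameseDisparityNet(nn.Module):
    """Scores every candidate disparity by comparing a reference patch
    against the corresponding shifted patches from the other modality."""
    def __init__(self, ref_channels, other_channels, feat_dim=64):
        super().__init__()
        self.ref_branch = FeatureExtractor(ref_channels, feat_dim)
        self.other_branch = FeatureExtractor(other_channels, feat_dim)

    def forward(self, ref_patch, candidate_patches):
        # ref_patch:         (B, C_ref, H, W)
        # candidate_patches: (B, D, C_other, H, W), one patch per disparity
        B, D = candidate_patches.shape[:2]
        f_ref = self.ref_branch(ref_patch)                    # (B, F)
        cands = candidate_patches.flatten(0, 1)               # (B*D, C, H, W)
        f_cand = self.other_branch(cands).view(B, D, -1)      # (B, D, F)
        return torch.einsum('bf,bdf->bd', f_ref, f_cand)      # (B, D) scores


class TwoStreamDisparityNet(nn.Module):
    """Two parallel subnetworks (RGB->LWIR and LWIR->RGB); their disparity
    scores are summed so the prediction is consistent in both directions."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.rgb_to_lwir = SiameseDisparityNet(3, 1, feat_dim)
        self.lwir_to_rgb = SiameseDisparityNet(1, 3, feat_dim)

    def forward(self, rgb_patch, lwir_candidates, lwir_patch, rgb_candidates):
        scores = (self.rgb_to_lwir(rgb_patch, lwir_candidates)
                  + self.lwir_to_rgb(lwir_patch, rgb_candidates))
        return F.softmax(scores, dim=1)  # distribution over candidate disparities
```

Under this sketch, the predicted disparity would be the argmax of the summed score vector, and training could use a cross-entropy loss against the ground-truth disparity index; whether this matches the paper's exact output head and loss is an assumption.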