A Spatio-Angular Binary Descriptor for Fast Light Field Inter View Matching
7th February 2020
Abstract
Light fields capture light rays arriving from a scene at different angles, effectively creating multiple perspective views of the same scene. One of the flagship applications of light fields is therefore to estimate the captured scene geometry, which can notably be achieved by establishing correspondences between the perspective views, usually in the form of a disparity map. Such correspondence estimation has been a long-standing research topic in computer vision, with applications to stereo vision and optical flow, and research in this area has shown the importance of well-designed descriptors for fast and accurate matching. In this paper we propose a binary descriptor that exploits the light field gradient over both the spatial and the angular dimensions in order to improve inter-view matching. In a disparity estimation application, we demonstrate that it achieves accuracy comparable to existing descriptors while being faster to compute.
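To illustrate the general idea of a descriptor built from spatio-angular gradients, here is a minimal sketch. This is an assumed, simplified construction for illustration only, not the paper's exact SABOM algorithm: it quantises the spatial gradient orientation in the central view into binary orientation bins and appends the signs of the angular (view-to-view) gradients.

```python
import numpy as np

def binary_orientation_descriptor(lf, y, x, n_bins=8):
    """Illustrative sketch (not the paper's exact SABOM algorithm).

    lf is a 4D light field indexed as lf[v, u, y, x], with (v, u) the
    angular (view) coordinates and (y, x) the spatial pixel coordinates.
    Returns a small binary vector combining spatial and angular cues.
    """
    # Spatial gradients in the central view
    cv, cu = lf.shape[0] // 2, lf.shape[1] // 2
    gy, gx = np.gradient(lf[cv, cu])
    # Angular gradients at the pixel of interest (changes across views)
    gv, gu = np.gradient(lf[:, :, y, x])
    # Quantise the spatial gradient orientation into n_bins sectors,
    # setting a single bit for the active sector
    angle = np.arctan2(gy[y, x], gx[y, x]) % (2 * np.pi)
    spatial_bits = np.zeros(n_bins, dtype=np.uint8)
    spatial_bits[int(angle / (2 * np.pi / n_bins)) % n_bins] = 1
    # Binarise the signs of the angular gradients at the central view
    angular_bits = (np.array([gv[cv, cu], gu[cv, cu]]) > 0).astype(np.uint8)
    return np.concatenate([spatial_bits, angular_bits])
```

The key property shared with the actual descriptor is that the output is a compact bit string, so descriptors can be compared with cheap bitwise operations rather than floating-point distances.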
This paper has now been accepted to ICIP 2020.
The proposed Spatio-Angular Binarised Orientation Maps (SABOM) descriptor is compared to SIFT, DAISY, and the original BOOM descriptor.
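The speed advantage of binary descriptors over float descriptors such as SIFT or DAISY comes from the matching step. The toy example below (an assumed setup, not the paper's evaluation pipeline) shows how matching binary descriptors reduces to XOR plus popcount on packed bits, instead of Euclidean distances on float vectors.

```python
import numpy as np

def hamming_match(desc_a, descs_b):
    """Match one binary descriptor against a set of candidates using
    the Hamming distance. desc_a is a 1D 0/1 array; descs_b is a 2D
    array with one candidate descriptor per row (same bit length).
    Returns the index of the best match and all distances.
    """
    # Pack the bit vectors into bytes so XOR operates on whole words
    a = np.packbits(desc_a)
    b = np.packbits(descs_b, axis=1)
    # Popcount of the XOR gives the number of differing bits
    dists = np.unpackbits(a ^ b, axis=1).sum(axis=1)
    return int(np.argmin(dists)), dists
```

On packed bits, one XOR-and-popcount per word replaces many floating-point multiply-adds, which is why binary descriptors enable much faster inter-view matching.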
ROC performance
(Figures: ROC curves on the HCI dataset, the INRIA Dense dataset, and the INRIA Sparse dataset.)
Disparity estimation
(Figures: results with CPM and with CPM+PF, each on the HCI and INRIA datasets.)
Visual results
We show here comparisons between the disparity maps obtained with the SIFT descriptor and with the proposed SABOM descriptor on the HCI benchmark, the INRIA synthetic dataset, and the Stanford Lego Gantry dataset. As mentioned in the paper, the disparity estimation pipeline is used to compare existing descriptors and would require further improvement to compete with the best state-of-the-art methods for disparity estimation. The disparity maps are color-coded with the Turbo colormap. Overall, both descriptors yield similar quality, while SABOM is much faster to compute.
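For readers reproducing such visualisations, a disparity map can be colour-coded with the Turbo colormap as sketched below. This is a generic assumed recipe using Matplotlib (Turbo is available as `turbo` in Matplotlib 3.3 and later), not the authors' exact rendering code.

```python
import numpy as np
import matplotlib

def colorize_disparity(disp):
    """Map a 2D float disparity array to an H x W x 3 uint8 RGB image
    using the Turbo colormap."""
    # Normalise disparities to [0, 1] before applying the colormap
    d = (disp - disp.min()) / max(disp.max() - disp.min(), 1e-8)
    rgba = matplotlib.colormaps["turbo"](d)  # H x W x 4 floats in [0, 1]
    # Drop the alpha channel and convert to 8-bit RGB
    return (rgba[..., :3] * 255).astype(np.uint8)
```

Normalising per map makes the full colour range span each image's own disparity range, which is convenient for side-by-side visual comparison.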