Robust global and local color matching in stereoscopic omnidirectional content

24th December 2019

Abstract

Shooting a live-action immersive 360-degree experience, i.e., omnidirectional content (ODC), is a technological challenge: many technical limitations need to be overcome, especially for capturing and post-processing in stereoscopic 3D (S3D). In this paper, we introduce a novel approach and a complete system for stitching and for color mismatch detection and correction in S3D omnidirectional content, consisting of three main modules: pre-processing, spherical color correction, and color mismatch evaluation. The system and its individual modules are evaluated on two datasets, including a new dataset that will be made publicly available with this paper. We show that our system outperforms the state of the art in color correction of S3D ODC and demonstrate that our spherical color correction module further improves the results of state-of-the-art approaches.
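To make the three-stage data flow described above concrete, here is a minimal, hypothetical Python sketch of such a pipeline. The function names and the simple per-channel mean/variance color transfer are our own illustrative assumptions for exposition; they are a stand-in skeleton, not the method proposed in the paper.

```python
import numpy as np

def preprocess(left_eye, right_eye):
    """Pre-processing stage (illustrative stand-in): in the paper this covers
    preparation of the stereoscopic equirectangular views; here we only
    normalize the inputs to floating point in [0, 1]."""
    return left_eye.astype(np.float32) / 255.0, right_eye.astype(np.float32) / 255.0

def spherical_color_correction(left_eye, right_eye):
    """Color correction stage (illustrative stand-in): align the global color
    statistics of the right view to the left view with a simple per-channel
    mean/variance transfer, in the spirit of color mismatch correction."""
    corrected = right_eye.copy()
    for c in range(3):
        mu_l, sigma_l = left_eye[..., c].mean(), left_eye[..., c].std()
        mu_r, sigma_r = right_eye[..., c].mean(), right_eye[..., c].std()
        corrected[..., c] = (right_eye[..., c] - mu_r) * (sigma_l / (sigma_r + 1e-8)) + mu_l
    return np.clip(corrected, 0.0, 1.0)

def color_mismatch_evaluation(left_eye, right_eye):
    """Evaluation stage (illustrative stand-in): report the mean absolute
    difference between the views as a crude mismatch score."""
    return float(np.mean(np.abs(left_eye - right_eye)))

def run_pipeline(left_eye, right_eye):
    """Chain the three modules: pre-process, correct, then evaluate."""
    left, right = preprocess(left_eye, right_eye)
    right = spherical_color_correction(left, right)
    return color_mismatch_evaluation(left, right)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.integers(0, 256, size=(64, 128, 3), dtype=np.uint8)
    # Simulate a global color mismatch by brightening the right view.
    right = np.clip(left.astype(np.int32) + 20, 0, 255).astype(np.uint8)
    print(f"residual mismatch after correction: {run_pipeline(left, right):.4f}")
```

The point of the sketch is the module structure: correction sits between pre-processing and evaluation, so the evaluation score can be compared before and after correction to quantify the improvement.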

Citation

Paper accepted for publication in the Elsevier journal Signal Processing: Image Communication.

Please cite our paper in your publications if it helps your research.

Downloads

Paper

Acknowledgment

This publication has emanated from research conducted with the financial support of Science Foundation Ireland (SFI) under the Grant Number 15/RP/2776.

Contact

If you have any questions, send an e-mail to crocis@scss.tcd.ie or roman.dudek101@alu.ulpgc.es.