Deep style transfer for light fields

18th October 2018

Proposed by: Martin Alain – alainm at scss.tcd.ie
Mairead Grogan – groganma at scss.tcd.ie
Yang Chen – cheny5 at scss.tcd.ie

Style transfer [1,2], the ability to automatically edit an image in the style of a reference image or painting, has gained considerable interest in the past few years, yielding impressive results thanks to recent advances in deep learning.
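For illustration, the sketch below shows the content and style losses commonly used in neural style transfer, with style captured by Gram matrices of deep feature maps. The feature extractor, layer choices and loss weights are placeholder assumptions, not the exact setup of [1] or [2].

```python
# Minimal sketch of the content/style losses used in neural style transfer.
# Feature maps are assumed to come from a pretrained network (e.g. VGG);
# the loss weights below are illustrative, not tuned values.
import torch
import torch.nn.functional as F

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a (B, C, H, W) feature map."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    return flat @ flat.transpose(1, 2) / (c * h * w)

def style_transfer_loss(stylized_feats, content_feats, style_feats,
                        content_weight=1.0, style_weight=1e4):
    """Content loss: feature distance to the content image.
    Style loss: Gram matrix distance to the style image."""
    content_loss = sum(F.mse_loss(s, c)
                       for s, c in zip(stylized_feats, content_feats))
    style_loss = sum(F.mse_loss(gram_matrix(s), gram_matrix(t))
                     for s, t in zip(stylized_feats, style_feats))
    return content_weight * content_loss + style_weight * style_loss
```

In practice, the stylized image (or a feed-forward generator network, as in [1]) is optimized to minimize such a loss.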

The goal of this project is to apply style transfer to light fields. Compared to traditional 2D imaging systems, which capture only the spatial intensity of light rays, 4D light fields also capture the angular direction of the light rays, and usually consist of a collection of 2D images arranged on a 2D grid. The focus of this work will therefore be not only to render convincing style transfer for each 2D image of the light field, but also to preserve the angular structure of the 4D light field. For that purpose, novel deep network architectures need to be explored.
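As a simple starting point, a light field can be stored as a 5D tensor of views on the angular grid, and angular consistency can be encouraged by penalizing differences between stylized neighbouring views. The tensor layout and the naive neighbour-difference regularizer below are illustrative assumptions, not a proposed solution.

```python
# Illustrative sketch: a light field as a 5D tensor of views
# (U, V angular grid; C channels; H, W spatial), plus a naive angular
# consistency penalty between stylized neighbouring views.
import torch

def angular_consistency_loss(stylized_lf: torch.Tensor) -> torch.Tensor:
    """stylized_lf: (U, V, C, H, W) tensor of stylized views.
    Penalizes differences between horizontally and vertically adjacent
    views; a full solution would need to account for the disparity
    between views rather than comparing them pixel-to-pixel."""
    horiz = (stylized_lf[:, 1:] - stylized_lf[:, :-1]).abs().mean()
    vert = (stylized_lf[1:, :] - stylized_lf[:-1, :]).abs().mean()
    return horiz + vert

# Example: a 9x9 light field of 256x256 RGB views.
lf = torch.rand(9, 9, 3, 256, 256)
print(angular_consistency_loss(lf))
```

Such a term could be combined with the per-view style transfer loss above, though preserving the epipolar structure of the light field will likely require disparity-aware comparisons or a network operating on the full 4D volume.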

[1] Sanakoyeu, Artsiom; Kotovenko, Dmytro; Lang, Sabine; Ommer, Björn; A Style-Aware Content Loss for Real-time HD Style Transfer, arXiv:1807.10201, 2018
[2] Liu, Xiao-Chang; Cheng, Ming-Ming; Lai, Yu-Kun; Rosin, Paul L.; Depth-aware neural style transfer, NPAR 2017

Related links:
https://compvis.github.io/adaptive-style-transfer/
https://github.com/ycjing/Neural-Style-Transfer-Papers