Style Transfer for 360 Images

10th October 2019

Proposed by:

Koustav Ghosal: ghosalk at scss.tcd.ie

Sebastian Lutz: lutzs at scss.tcd.ie

Pierre Matysiak: matysiap at scss.tcd.ie

 

The aim of this project is to extend neural style transfer algorithms to the domain of 360 images. Style transfer refers to creating a new output image that combines the content of one input image with the style of another. However, while style transfer has been widely studied in the context of natural images, it has seldom been explored for 360 images. Emerging technologies such as 360 imagery and video have many potential applications, especially in creative industries such as gaming, animation and virtual reality.

In this project, the tasks will include, but are not limited to:

  1. Reading existing research on Neural Style Transfer and 360 image processing
  2. Implementing a deep learning based framework using PyTorch/TensorFlow (a minimal sketch of the core losses follows this list)
  3. A thorough evaluation and analysis of the proposed framework
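
As a starting point for task 2, the sketch below illustrates the Gatys-style content and style losses from reference [1], assuming PyTorch and torchvision are available. The VGG-19 layer indices and loss weights are illustrative choices for this sketch, not part of the proposal.

import torch
import torch.nn.functional as F
from torchvision.models import vgg19

def gram_matrix(features):
    # features: (batch, channels, height, width)
    b, c, h, w = features.size()
    f = features.view(b, c, h * w)
    # Normalised Gram matrix capturing feature correlations (style)
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

# Pretrained VGG-19 used as a fixed feature extractor
vgg = vgg19(pretrained=True).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYER = 21                 # conv4_2 (illustrative choice)
STYLE_LAYERS = {0, 5, 10, 19, 28}  # conv1_1 .. conv5_1 (illustrative)

def extract_features(image):
    content_feat, style_feats = None, []
    x = image
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i == CONTENT_LAYER:
            content_feat = x
        if i in STYLE_LAYERS:
            style_feats.append(x)
    return content_feat, style_feats

def style_transfer_loss(output, content_img, style_img,
                        content_weight=1.0, style_weight=1e5):
    # Content loss matches deep features of the content image;
    # style loss matches Gram matrices of the style image.
    out_c, out_s = extract_features(output)
    tgt_c, _ = extract_features(content_img)
    _, tgt_s = extract_features(style_img)
    content_loss = F.mse_loss(out_c, tgt_c.detach())
    style_loss = sum(F.mse_loss(gram_matrix(o), gram_matrix(t).detach())
                     for o, t in zip(out_s, tgt_s))
    return content_weight * content_loss + style_weight * style_loss

For 360 inputs, these losses would likely need to be adapted along the lines of reference [3], for example to account for the spherical projection and seam consistency of equirectangular images.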

References

  1. Gatys, Leon A., Alexander S. Ecker, and Matthias Bethge. “A neural algorithm of artistic style.” arXiv preprint arXiv:1508.06576 (2015).
  2. Jing, Yongcheng, et al. “Neural style transfer: A review.” IEEE Transactions on Visualization and Computer Graphics (2019).
  3. Ruder, Manuel, Alexey Dosovitskiy, and Thomas Brox. “Artistic style transfer for videos and spherical images.” International Journal of Computer Vision 126.11 (2018): 1199-1219.