User behaviour analysis and modelling for augmented reality and virtual reality multimedia

18th October 2018

Proposed by Emin Zerman – zermane at scss.tcd.ie

Capturing and displaying volumetric video is becoming easier as the technology behind augmented reality (AR), mixed reality (MR), and virtual reality (VR) applications matures. Until recently, visual (multi-)media were presented on a 2D plane, e.g. a television or cinema screen, and viewers were assumed to be passive for both photography and motion pictures because of the limitations of these media. AR and VR multimedia, however, bring new challenges for content creators, who have to consider how users will interact with this new type of multimedia.

The goal of this project is to understand, analyse, and model user behaviour during AR and VR multimedia consumption with a head-mounted display (HMD). Understanding user behaviour is important for both content creators and engineers, as the way the content is presented and/or compressed can be adapted to different user models. For this purpose, the position and orientation of the HMD need to be tracked and recorded for each user during stimulus presentation, and these recordings will then be analysed using statistical methods.
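As an illustration, the analysis step could start from something like the Python sketch below, which summarises one recorded HMD trace. The log format (a CSV file with a timestamp, head position, and an orientation quaternion per sample), the column names, and the file name are assumptions made for this example only, not a prescribed format for the project.

import numpy as np
import pandas as pd

def quaternion_to_yaw_pitch(qw, qx, qy, qz):
    # Convert unit quaternions (w, x, y, z) to yaw (azimuth) and pitch (elevation)
    # in degrees, assuming a ZYX (yaw-pitch-roll) rotation convention.
    yaw = np.degrees(np.arctan2(2.0 * (qw * qz + qx * qy),
                                1.0 - 2.0 * (qy ** 2 + qz ** 2)))
    pitch = np.degrees(np.arcsin(np.clip(2.0 * (qw * qy - qz * qx), -1.0, 1.0)))
    return yaw, pitch

def summarise_trace(csv_path):
    # Per-user, per-stimulus summary of head orientation; file path and column
    # names ("timestamp", "qw", "qx", "qy", "qz") are hypothetical.
    df = pd.read_csv(csv_path)
    yaw, pitch = quaternion_to_yaw_pitch(df["qw"], df["qx"], df["qy"], df["qz"])
    # Yaw is an angle on a circle, so circular statistics are used for its mean and spread.
    z = np.exp(1j * np.radians(yaw))
    resultant = np.abs(np.mean(z))
    return {
        "mean_yaw_deg": float(np.degrees(np.angle(np.mean(z)))),
        "yaw_circ_std_deg": float(np.degrees(np.sqrt(-2.0 * np.log(resultant)))),
        "mean_pitch_deg": float(np.mean(pitch)),
        "pitch_std_deg": float(np.std(pitch)),
        "duration_s": float(df["timestamp"].iloc[-1] - df["timestamp"].iloc[0]),
    }

if __name__ == "__main__":
    print(summarise_trace("user01_stimulus03.csv"))  # hypothetical log file

From such per-trace summaries, the analysis could be extended to, for example, angular heatmaps of viewing directions or statistical comparisons of exploration behaviour across users and stimuli.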

Useful references:
[1] A. Singla, S. Fremerey, A. Raake, P. List, and B. Feiten, "AhG8: Measurement of User Exploration Behavior for Omnidirectional (360°) Videos with a Head Mounted Display," Macao, China, 2017.
[2] S. Egger, P. Reichl, and K. Schoenenberg, "Quality of experience and interactivity," in Quality of Experience, S. Möller and A. Raake, Eds. Springer, Cham, 2014, pp. 149-161.