XR Play Trilogy
15th May 2020

XR Play consists of a three-year trilogy of research experiments at the crossroads of the performing arts and computer science. Each part of the trilogy reimagines Samuel Beckett's ground-breaking theatrical text, Play (1963), for a different format of digital culture: 1) a webcast (Intermedial Play), 2) a virtual reality experience (Virtual Play) and 3) an augmented reality experience (Augmented Play).
===========================
1.0 Intermedial Play
Intermedial Play was a collaboration between Néill O’Dwyer (V-SENSE artist in residence) and Nicholas Johnson, assistant professor in Trinity College’s Department of Drama and secretary and co-director of the newly established Trinity Centre for Beckett Studies. Intermedial Play was the precursor, inspiration and catalyst of Virtual Play and Augmented Play.
===========================
2.0 Virtual Play
Virtual Play is a reinterpretation of Play, with a view to engaging a 21st-century viewership that is increasingly accessing content via virtual reality (VR) technologies. It is V-SENSE's inaugural creative arts/cultural project under the creative technologies remit. The project was conceived to demonstrate how VR content can be produced both cheaply and expertly, thereby challenging the notion that sophisticated VR content is exclusively the domain of wealthy institutes and production houses. The core technology enabling this novel type of creative production, i.e. VR/AR content creation based on 3D volumetric video techniques, was developed by V-SENSE researchers David Monaghan, Jan Ondrej, Konstantinos Amplianitis and Rafael Pagés, who subsequently spun out an innovative VR production company called Volograms.
Under the guidance of Néill O'Dwyer (Producer), this virtual reality response to Play attempts to push the limits of possibility in consumable video and film by harnessing the power of interactive digital technologies, in order to respond to Samuel Beckett's deep engagement with the stage technologies of his day.
A central goal of the project is to address an ongoing concern in the creative cultural sector: how to handle narrative progression in an interactive immersive environment. The premise is that placing the viewer (audience) at the centre of the storytelling process assimilates them more fully into the virtual world, thereby empowering them to explore, discover and decode the story rather than passively watching and listening. The gaming sector has harnessed this effectively using procedural graphics and animation, but film and video, which rely on audio-visual capture techniques, have struggled to engage with this problem. As such, this project investigates these new narrative possibilities for interactive, immersive environments.
In order to investigate this problem, V-SENSE enlisted the expertise of Samuel Beckett scholar Nicholas Johnson (Director). Nick brings to the project a wealth of knowledge of the complexities, technicalities and nuances of staging Beckett productions. The project is complementary to his ongoing work with the Samuel Beckett Laboratory and the forthcoming research project Intermedial Beckett, and will feed into research questions in contemporary Beckett Studies and the methodologies of interdisciplinary practice-as-research. Three professional actors – Colm Gleeson, Caitlin Scott and Maeve O'Mahony – round out the Drama team. All three are trusted collaborators of Johnson and have experience with Beckett texts in performance. A high degree of precision from the actors is crucial to the success of the project, because of the difficulty of post-producing video footage captured on multiple devices.
In terms of the mise en scène, the strategy consists of constructing a 3D reinterpretation of Beckett's scene and characters, which he describes as 'lost to age and aspect' (Beckett, 1963), using bespoke volumetric video techniques for capturing live action. The actors are recorded against a green screen, using a multiple-camera setup. Their foreground masks are extracted from the background using segmentation techniques. These masks are combined to create a dynamic, photo-realistic 3D reconstruction of every actor in the scene. These reconstructions are then imported into a game engine and combined with virtual set elements, in order to create the immersive VR experience. The game engine software is also used to implement the rules and conditions that define the user interaction and behaviour.
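To illustrate the first step of the capture pipeline described above – extracting a foreground mask from green-screen footage – here is a minimal Python/NumPy sketch. The thresholding heuristic and function name are illustrative assumptions for clarity; the actual V-SENSE pipeline uses more sophisticated segmentation techniques.

```python
import numpy as np

def green_screen_mask(image, g_thresh=1.3):
    """Return a boolean foreground mask for an RGB image of shape (H, W, 3).

    A pixel is classed as background when its green channel dominates
    both red and blue by the factor g_thresh (a simple chroma-key
    heuristic, not the production segmentation method).
    """
    img = image.astype(np.float32) + 1e-6  # avoid comparisons at exact zero
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    background = (g > g_thresh * r) & (g > g_thresh * b)
    return ~background

# Toy frame: left half is green backdrop, right half is a skin-tone actor.
frame = np.zeros((2, 4, 3), dtype=np.uint8)
frame[:, :2] = [30, 200, 40]    # green-screen pixels
frame[:, 2:] = [210, 160, 120]  # foreground (actor) pixels
mask = green_screen_mask(frame)
```

Per-actor masks like this, computed from each camera view, are what the reconstruction stage fuses into a 3D model.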
To enhance the immersive nature of the scene, V-SENSE also enlisted Enda Bates (Sound Designer), a lecturer on the Music Media Technologies (MMT) masters programme in the Department of Electrical and Electronic Engineering. Enda's work with the Spatial Audio Research Group in Trinity and the ongoing Trinity360 project concerns the use and production of spatial audio with six degrees of freedom (6DoF) for virtual reality, augmented reality and 360 video. Enda deploys Ambisonic audio and spatial audio SDKs for game engines in order to give the user a perception of depth, distance and audio directivity in the virtual world. The implementation of the audio for viewing volumetric video is the main focus of Enda's contribution to the project. He achieves this by synthesizing directivity patterns relating to the locations of the viewer and characters within the virtual set design.
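The core idea behind such position-dependent audio can be sketched in a few lines: the gain heard by the listener combines a distance rolloff with a directivity pattern oriented along the character's facing direction. The function below is a simplified illustration (the names, the inverse-distance model and the first-order cardioid-family pattern are assumptions for exposition), not the Ambisonic rendering actually used in the project.

```python
import math

def source_gain(listener_pos, source_pos, source_facing,
                ref_dist=1.0, directivity=0.5):
    """Gain heard by a listener from a directional sound source (illustrative).

    Combines an inverse-distance rolloff (clamped inside ref_dist) with a
    first-order directivity pattern: directivity=0 is omnidirectional,
    directivity=1 is a pure cardioid facing along source_facing.
    source_facing is assumed to be a unit vector.
    """
    dx = [l - s for l, s in zip(listener_pos, source_pos)]
    dist = math.sqrt(sum(d * d for d in dx))
    rolloff = ref_dist / max(dist, ref_dist)
    # Angle between the source's facing direction and the listener direction.
    cos_theta = sum(f * d for f, d in zip(source_facing, dx)) / max(dist, 1e-9)
    pattern = (1 - directivity) + directivity * cos_theta
    return rolloff * max(pattern, 0.0)

# A listener 2 m directly in front of the source hears it at half gain;
# directly behind, the cardioid component cancels it entirely.
front = source_gain((0, 0, 2), (0, 0, 0), (0, 0, 1))
behind = source_gain((0, 0, -2), (0, 0, 0), (0, 0, 1))
```

In practice, spatializer SDKs evaluate a model of this kind per source per audio frame, then encode the result binaurally for the HMD's headphones.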
The project is an important milestone for V-SENSE, because it is the inaugural artistic-cultural experiment under the creative technologies remit, as defined by Prof. Aljosa Smolic (PI) in his procurement of funding from Science Foundation Ireland (SFI). It represents a significant effort within the research group, by drawing together discrete areas of computer science research, and in the college as a whole, because it engenders interdisciplinary collaboration across the departments of Computer Science, Drama and Electrical and Electronic Engineering.
===========================
3.0 Augmented Play
Augmented Play is the third and final part of the three-year practice-based research trilogy. It has much in common with the earlier VR version: it uses the same volumetric video assets, and the mode of user interaction and narrative development is similar. However, the viewing paradigm is different; the assets are displayed using either the Microsoft HoloLens or the Magic Leap augmented reality (AR) head-mounted display (HMD).
AR is a technology that allows people and audiences to visually merge virtual, computer-rendered objects with real-world objects and scenes, using a mobile phone, a tablet or a head-mounted display. As opposed to virtual reality, it permits users to see the world around them; it does not close off the outside world by fully immersing the audience in a computerised environment. AR technology is therefore well suited to letting people interact with stories at specific locations, which makes it ideal for location-based, immersive, role-play and site-specific drama.
The user–narrative paradigm is also the same as in the virtual reality version: experiencers are invited to don the AR HMD, embody an interrogator and explore the narrative by confronting virtual reconstructions of Beckett's characters. The user triggers a character's speech by looking at them; the artwork thereby acknowledges the new condition of active audiences and recognises new opportunities for narrative, by affording audiences a central role in its unfurling.
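The gaze-triggering interaction described above reduces to a simple geometric test: is the character within a small angular cone around the headset's forward direction? The following sketch shows one way this could work; the function name, the angular threshold and the example positions are hypothetical, not taken from the production code.

```python
import math

def is_gazed_at(head_pos, gaze_dir, target_pos, fov_deg=10.0):
    """True when target_pos lies within fov_deg of the gaze ray (illustrative).

    head_pos and target_pos are 3D points; gaze_dir is the headset's
    forward vector (need not be normalised).
    """
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    dist = math.sqrt(sum(c * c for c in to_target)) or 1e-9
    gaze_len = math.sqrt(sum(c * c for c in gaze_dir)) or 1e-9
    cos_angle = sum(g * c for g, c in zip(gaze_dir, to_target)) / (gaze_len * dist)
    return cos_angle >= math.cos(math.radians(fov_deg))

# Hypothetical example: a character 3 m ahead of the interrogator is
# "activated" when the headset faces it, and not when it faces away.
facing = is_gazed_at((0.0, 1.6, 0.0), (0.0, 0.0, 1.0), (0.0, 1.6, 3.0))
averted = is_gazed_at((0.0, 1.6, 0.0), (1.0, 0.0, 0.0), (0.0, 1.6, 3.0))
```

In a game engine, a per-frame check like this would typically be combined with a dwell time, so that a character only begins speaking after the gaze has rested on them briefly.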
Considering the technology’s suitability for site-specific drama, we launched the work in the cavernous vaulted stone basement of the CHQ building, which is appropriate to the crypt-like, posthumous setting originally envisaged by Beckett. The project has been a resounding success and has received special commendations from the following leading local, national and professional sector media channels:
- Culture File on RTE Lyric FM
- The Irish Times
- Trinity College’s News and Events website
- The Foundry’s News Blog, “Insights Hub”
As a pioneering production of augmented reality drama using volumetric video (VV) technology, the project is a ground-breaking milestone for: V-SENSE; Volograms, a domestic SME leading in VR, AR and mixed reality (MR) technologies for the creative cultural industries; and the Trinity Centre for Beckett Studies at the Dept. of Drama (TCD), which was pivotal in the production, translation and dramaturgical direction of Beckett’s story.
===============
Credits:
Principal Investigator: Aljosa Smolic
Director: Nicholas Johnson
Producer & Scenographer: Néill O’Dwyer
Post-production: Jan Ondrej, Rafael Pagés, Konstantinos Amplianitis and David Monaghan
Sound Designer: Enda Bates
Actors: Colm Gleeson as M, Maeve O’Mahony as W1, and Caitlin Scott as W2.
The team wishes to thank the Samuel Beckett Estate for permitting the use of Beckett’s original text.
In addition to the primary funding source, Science Foundation Ireland (SFI), this project also received partial funding from the Trinity Visual and Performing Arts Fund, the Irish Research Council, and the Trinity Long Room Hub Interdisciplinary Seed Funding (2017–18).