Virtual Play, after Samuel Beckett
21st July 2017
Virtual Play is a reinterpretation of Samuel Beckett’s ground-breaking 1963 text, Play, with a view to engaging a 21st-century viewership that is increasingly accessing content via virtual reality technologies. It is the inaugural creative arts/cultural project by V-SENSE under its creative technologies remit. V-SENSE is a leading computer science research group at Trinity College Dublin championing, among other areas, the development of 3D reconstruction techniques and Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) technologies for the creative cultural industries. The project was conceived to demonstrate that VR content can be produced both cheaply and expertly, thereby challenging the notion that sophisticated VR content is exclusively the domain of wealthy institutes and production houses. The core technology enabling this novel type of creative production, i.e. VR/AR content creation based on 3D reconstruction techniques, has been developed by V-SENSE researchers David Monaghan, Jan Ondrej, Konstantinos Amplianitis and Rafael Pagés.
Under the guidance of resident V-SENSE digital media artist Néill O’Dwyer (Producer), this virtual reality response to Play attempts to push the limits of possibility in consumable video and film by harnessing the power of digital interactive technologies, responding in turn to Samuel Beckett’s deep engagement with the stage technologies of his day.
A central goal of the project is to address an ongoing concern in the creative cultural sector: how to handle narrative progression in an interactive immersive environment. The premise is that by placing the viewer (audience) at the centre of the storytelling process, they become more fully assimilated into the virtual world and are thereby empowered to explore, discover and decode the story, rather than passively watching and listening. The gaming sector has harnessed this effectively using procedural graphics and animation, but film and video have struggled to engage the problem in terms of audio-visual capture techniques. This project therefore investigates these new narrative possibilities for interactive, immersive environments.
To investigate this problem, V-SENSE enlisted the expertise of Samuel Beckett scholar Nicholas Johnson (Director), who is assistant professor in Trinity College’s Department of Drama and secretary and co-director of the newly established Trinity Centre for Beckett Studies. Nick brings to the project a wealth of knowledge of the complexities, technicalities and nuances of staging Beckett productions. The project is complementary to his ongoing work with the Samuel Beckett Laboratory and the forthcoming research project Intermedial Beckett, and will feed into research questions in contemporary Beckett Studies and the methodologies of interdisciplinary practice-as-research. Three professional actors – Colm Gleeson, Caitlin Scott and Maeve O’Mahony – round out the Drama team. All three are trusted collaborators of Nicholas Johnson and have experience with Beckett texts in performance. A high degree of precision from the actors is crucial to the success of the project, because of the difficulty of post-producing video footage captured on multiple devices.
In terms of the mise en scène, the strategy consists of constructing a 3D reinterpretation of Beckett’s scene and characters, whom he describes as ‘lost to age and aspect’ (Beckett, 1963), using bespoke 3D reconstruction techniques. The actors are recorded against a green screen, using a multiple-camera setup. Their foreground masks are extracted from the background using novel segmentation algorithms. These masks are combined to create a dynamic, photo-realistic 3D reconstruction of each actor in the scene. The reconstructions are then imported into a game engine and combined with virtual set elements to create the immersive VR/AR/MR experience. The game engine software is also used to implement the rules and conditions that define user interaction and behaviour.
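The segmentation algorithms used in the pipeline are bespoke, but the basic idea of extracting a foreground mask from green-screen footage can be illustrated with a minimal chroma-key sketch. The channel-difference rule and the threshold value below are illustrative assumptions, not the project’s actual method:

```python
import numpy as np

def chroma_key_mask(frame, green_thresh=40):
    """Return a boolean foreground mask for an RGB frame shot against a
    green screen: a pixel is treated as background when its green channel
    exceeds both red and blue by more than `green_thresh`."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    background = (g - np.maximum(r, b)) > green_thresh
    return ~background

# Tiny synthetic frame: one pure-green (screen) pixel, one skin-tone pixel.
frame = np.array([[[0, 255, 0], [200, 150, 120]]], dtype=np.uint8)
print(chroma_key_mask(frame).tolist())  # [[False, True]]
```

In practice such a per-pixel rule would be only a starting point; production segmentation also has to handle spill, soft edges and temporal consistency across the multiple camera views.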
To enrich the immersive nature of the scene, V-SENSE has also drawn in collaboration from Enda Bates (Sound Designer), a lecturer on the Music Media Technologies (MMT) master’s programme in the Department of Electrical and Electronic Engineering. Enda’s work with the Spatial Audio Research Group in Trinity and the ongoing Trinity360 project concerns the production and use of spatial audio with six degrees of freedom (6DoF) for Virtual Reality, Augmented Reality and 360 video. Enda deploys Ambisonic audio and spatial audio SDKs for game engines to give the user a perception of depth, distance and audio directivity in the virtual world. The main focus of Enda’s contribution to the research project is how audio can be implemented for volumetric video capture: can this be achieved by synthesizing different directivity patterns, or is it also necessary to set up a comparable volumetric audio capture? The audio additionally serves as a central triggering device for drawing the user’s attention, in order to progress the narrative at certain spatio-temporal junctures.
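To illustrate how a mono source is positioned in an Ambisonic sound field of the kind mentioned above, here is a textbook first-order Ambisonics (FOA) encoder in ACN channel order with SN3D normalisation. This is a standard formulation, not the project’s implementation, and the angle conventions are assumptions for the sketch:

```python
import math

def foa_encode(sample, azimuth, elevation):
    """Encode one mono sample into first-order Ambisonics channels
    (ACN order: W, Y, Z, X; SN3D normalisation). Angles are in radians;
    azimuth is measured anticlockwise from straight ahead."""
    w = sample                                            # omnidirectional
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left/right
    z = sample * math.sin(elevation)                      # up/down
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front/back
    return [w, y, z, x]

# A source directly ahead of the listener excites only W and X.
print(foa_encode(1.0, 0.0, 0.0))  # [1.0, 0.0, 0.0, 1.0]
```

A decoder (or a binaural renderer in a game-engine spatial audio SDK) then maps these four channels to loudspeakers or headphones, which is what gives the listener a sense of direction as they turn within the scene.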
The project represents an important milestone for the V-SENSE research project as a whole, because it is the inaugural artistic-cultural experiment under the creative technologies remit, as defined by Prof. Aljosa Smolic in his procurement of funding from Science Foundation Ireland (SFI). It represents a significant effort not only within the research group, by drawing together discrete research areas within computer science, but also in the college as a whole, because it engenders interdisciplinary collaboration across the departments of Computer Science, Drama and Electrical and Electronic Engineering.