Join our team – we are hiring! New Research Fellow in Creative Technologies!

The post-holder will research, develop, pilot and demonstrate a set of professional tools and techniques for making content ‘smarter’, so that it is fully adaptive in a broad, unprecedented manner: adaptive to context (facilitating re-use), to purpose (among or within industries), to the user (improving the viewing experience), and to the production environment (so that it is ‘future-proof’). The approach is based on research into computer animation; automated classification and tagging using deep learning and semantic labelling to describe and draw inferences; and the development of tools for automated asset transformation, smart animation, storage and retrieval. These new technologies and tools will show that vast cost reductions and efficiency gains are possible, facilitating the production of more content of higher quality and creativity, to the benefit of the competitiveness of the European creative industries.

All important information is here!

Seminar presentation by Professor Julián Cabrera Quesada, Universidad Politécnica de Madrid


Title:

Stochastic Optimal Control of HTTP Adaptive Streaming


Abstract:

HTTP Adaptive Streaming (HAS) is becoming a key technology for audiovisual broadcasting over IP networks. It has been adopted and developed by major vendors such as Microsoft, Apple and Adobe, and the creation of an MPEG standard (MPEG-DASH) has also contributed to its success for multimedia broadcasting. Important IP content providers such as Netflix, Amazon and HBO use this technology for their video-on-demand services, and traditional IPTV providers such as Movistar TV are also moving to it for their live broadcasting services.

One of the key elements in HAS technology is the player at the client side, which has to make decisions in order to provide the best possible video quality. These decisions have to consider the dynamic network conditions, the device features, and the user's profile and preferences. In this talk, the behaviour of the player will be described and formulated as a Markov decision problem, and solutions based on Stochastic Dynamic Programming and Reinforcement Learning will be presented.
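To give a flavour of this kind of formulation (this is a toy sketch, not the speaker's actual model: the states, rewards and transition probabilities below are all invented for illustration), bitrate selection can be cast as a finite Markov decision process over buffer occupancy and a coarsely quantised Markov channel state, and solved offline by value iteration, one form of stochastic dynamic programming:

```python
# Toy illustration only -- not the speaker's formulation. Bitrate selection
# in HTTP Adaptive Streaming cast as a finite Markov decision process over
# (buffer occupancy, channel state) and solved by value iteration.
# All numerical values are invented for the sketch.

BITRATES = [1.0, 2.5, 5.0]           # Mbps of the available representations
BUFFER_LEVELS = range(6)             # buffer occupancy in segments (0..5)
BANDWIDTH = [1.5, 3.0, 6.0]          # Mbps, coarse Markov channel model
BW_TRANS = [                         # channel transition matrix (rows sum to 1)
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.2, 0.7],
]
GAMMA = 0.9                          # discount factor

def step(buf, bw_idx, action):
    """Next buffer level and reward for downloading one segment."""
    rate = BITRATES[action]
    # Whole segments whose playback elapses while this one downloads.
    drained = int(rate / BANDWIDTH[bw_idx])
    next_buf = min(max(buf + 1 - drained, 0), max(BUFFER_LEVELS))
    # Reward: picture quality minus a large penalty when the buffer stalls.
    reward = rate - (10.0 if next_buf == 0 else 0.0)
    return next_buf, reward

def value_iteration(iters=200):
    """Compute the optimal state values by repeated Bellman backups."""
    states = [(b, w) for b in BUFFER_LEVELS for w in range(len(BANDWIDTH))]
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        new_V = {}
        for b, w in states:
            qs = []
            for a in range(len(BITRATES)):
                nb, r = step(b, w, a)
                # Expectation over the next channel state.
                qs.append(r + GAMMA * sum(p * V[(nb, w2)]
                                          for w2, p in enumerate(BW_TRANS[w])))
            new_V[(b, w)] = max(qs)
        V = new_V
    return V

def greedy_policy(V):
    """Bitrate index to request in each (buffer, channel) state."""
    pi = {}
    for b in BUFFER_LEVELS:
        for w in range(len(BANDWIDTH)):
            def q(a):
                nb, r = step(b, w, a)
                return r + GAMMA * sum(p * V[(nb, w2)]
                                       for w2, p in enumerate(BW_TRANS[w]))
            pi[(b, w)] = max(range(len(BITRATES)), key=q)
    return pi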

Short Bio:

V-SENSE is delighted to welcome Professor Julián Cabrera Quesada as Visiting Professor until July 2018. Professor Cabrera Quesada is Associate Professor at the Department of Signals, Systems and Radiocommunications of the Telecommunication School of the Universidad Politécnica de Madrid (UPM) and a researcher in the Image Processing Group (Grupo de Tratamiento de Imágenes). He lectures in Digital Image Processing, Transmission Systems, Digital Television, Video Coding, Audiovisual Communications, and Reinforcement Learning. He has participated in more than 25 research projects funded by European programmes, Spanish national programmes and private companies. His current research interests cover several topics related to audio-visual communications: advanced video coding for UHD, 3D and multiview scenarios; depth estimation and coding; subjective video quality assessment for multiview and VR360 video; and optimisation of adaptive streaming techniques. He is also working on the application of deep learning approaches to depth estimation and 3D reconstruction.

Event information:

12-1pm, Tuesday, 24th Oct 2017
Large Conference Room, O’Reilly Institute

Closing the Content Gap for AR and VR presented by Professor Smolic, SFI Research Professor of Creative Technologies at Trinity College

Technology Ireland Innovation Forum/Ibec with Trinity College presents Closing the Content Gap for AR and VR on Wednesday, 18th October 2017


Virtual Reality and Augmented Reality are already impacting global business. Historically associated with the entertainment industry, business is now embracing Virtual, Augmented and Mixed Reality to gain 360-degree insights into various aspects of their organisations. This, coupled with applications in everyday societal needs, is driving new opportunities and markets globally.

Professor Aljosa Smolic, the SFI Research Professor of Creative Technologies at Trinity College, is a recognised leader in his field of visual computing, with particular expertise in computer graphics, computer vision, and video signal processing. His work has had a significant practical impact on creative industries in the areas of 3D video technology, forming the basis for 3D Blu-ray and stereo 3D production technology.

So why attend:
Aljosa will talk about his research project, V-SENSE – Extending Visual Sensation through Image-Based Visual Computing, which is funded by SFI over 5 years. V-SENSE is a team of 20+ researchers in Visual Computing at the intersection of Computer Vision, Computer Graphics and Media Signal Processing. It is building a dynamic environment where enthusiastic young scientists with different backgrounds get together to shape the future in fundamental as well as applied research projects. Insights include but are not limited to:
Augmented, Virtual and Mixed Reality (AR/ VR/ MR)
Free View-point Video (FVV)
360-Video
Light-Field Technologies
Visual Effects and Animation

Date: 18 October 2017
Time: 08:00–10:30
Venue: Ibec, 84-86 Lower Baggot Street, Dublin 2

V-SENSE at the Intermedial Beckett Symposium, Saturday, 14 October 2017, 11am – 7pm

Samuel Beckett changed the theatre forever by using the new media of his time. Since his death in 1989, the analogue stage and screen technologies of the 20th century have given way to various forms of digital telepresence, and experiments in translating Beckett across media abound. In partnership with the Dublin Theatre Festival and the Trinity Long Room Hub, The Trinity Centre for Beckett Studies will curate a day of presentations, conversations, and lectures by leading experts and artists to discuss the impact of intermedial performance, contemporary art, and Beckett’s legacy.

Schedule:
11:00-1:00: Beckett and Theories of Intermediality
Anna McMullan — “Samuel Beckett: Intermedial Legacies”
David Houston Jones — “Samuel Beckett: Face, Installation, Embodiment”
Panel Discussants: Matthew Causey, Derval Tubridy, Catherine Laws
Chair: Nicholas Johnson

1:00-2:00: Lunch (provided)
Virtual Play installation in the Hoey Ideas Space

2-3:30: Beckett in Virtual Reality
Panel discussion with the members of the V-SENSE research project
Aljosa Smolic, Néill O’Dwyer, Enda Bates, Nicholas Johnson

3:30-4:00: Coffee (provided)

4:00-6:00: Beckett and Practices of Intermediality
Derval Tubridy — “Intermediality, Agency, Diversity”
Catherine Laws — “Beckett, Music, and the Intermedial”
Discussants: Ciaran Clarke, Angela Butler, Anna McMullan
Chair: Julie Bates

6:00 PM: Launch of the Trinity Centre for Beckett Studies
Jane Ohlmeyer, director of the Trinity Long Room Hub
Sam Slote, director of the Trinity Centre for Beckett Studies

The symposium is hosted by Dr Nicholas Johnson, Prof David Houston Jones, Dr Catherine Laws, Prof Anna McMullan, Dr Sam Slote, and Dr Derval Tubridy. Kindly supported by interdisciplinary seed funding from the Trinity Long Room Hub, and in partnership with the Dublin Theatre Festival, the School of Creative Arts, and V-SENSE (SFI-funded project held by Prof. Aljosa Smolic).

This event is free, but registration is required; you can register for this event here.

Campus Location: Trinity Long Room Hub
Accessibility: Yes
Room: Neill Lecture Theatre
Event Type: Alumni, Arts and Culture, Conferences, Lectures and Seminars, Public
Type of Event: One-time event
Audience: Undergrad, Postgrad, Alumni, Faculty & Staff, Public
Cost: Free (but registration is required)

 

Join us at Probe with Samuel Beckett’s Play in Virtual Reality!

Samuel Beckett’s Play in Virtual Reality @ Probe (Trinity Research Night).

V-SENSE present their inaugural creative-cultural experiment Virtual Play at Trinity Research Night (aka Probe), in the Maker Marquis at Front Square. Visitors can use a virtual reality headset to interact with Beckett’s story, re-imagined for digital culture. The project demos cutting-edge free-viewpoint video technology. Participation from the departments of Drama and Electrical and Electronic Engineering represents a significant interdepartmental collaborative effort.

Where: Trinity Research Night (aka Probe), in the Maker Marquis at Front Square.

When: 5pm – 8pm, Friday 29 September 2017.

Admission: Free, across the whole campus.

Congratulations to Yang Chen on receiving the Best Paper Award at the Irish Machine Vision and Image Processing Conference 2017.

We are delighted to announce that our paper “Fast and Accurate Optical Flow based Depth Map Estimation from Light Fields” by Yang Chen, Martin Alain and Aljosa Smolic received the Jonathan Campbell Best Paper Award at the Irish Machine Vision and Image Processing Conference 2017. Huge congratulations to our PhD student Yang and to all involved in this successful outcome! https://v-sense.scss.tcd.ie/?page_id=349


PhD student Yang presenting at the conference.