Demos

The following demonstrations will be presented on the third day of the conference (June 7th) as part of our interactive conference ‘bazaar’ in BBC’s Quay House. There will also be a series of lightning talks introducing the demos at 17:00-17:30 on Day 2 (June 6th) in the Quays Theatre.

ImAc Player: Enabling a Personalized Consumption of Accessible Immersive Contents

  • Mario Montagud Climent – Media & Internet Area, i2CAT Foundation, Barcelona, Spain; Department of Informatics, University of Valencia, Valencia, Spain
  • Isaac Fraile – i2CAT, Barcelona, Spain
  • Einar Meyerson – Fundació i2CAT, Barcelona, Spain
  • María Genís – i2CAT Foundation, Barcelona, Spain
  • Sergi Fernández – i2CAT, Barcelona, Spain

Abstract: Accessibility is a fundamental requirement for every (multimedia) service. Although immersive media services are on the rise, they still lack accessibility features. This paper presents a web-based player that enables the presentation of immersive 360° content augmented with a set of access services, such as subtitles, (spatial) audio description and sign language. The paper first provides an overview of the end-to-end broadcast platform in which the player is integrated. Then, the key components that make up the player, and its appearance, are briefly introduced. Finally, the accessibility, personalisation and interaction features implemented in the player are described. The player is being tested in a series of pilot actions involving users with accessibility needs, is being used as a proof of concept in different standardisation activities, and is envisioned to be integrated into the services provided by European broadcasters.
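
The abstract describes access services that viewers can switch on and personalise over 360° content. As a rough illustration of that idea only, the sketch below models per-user access-service settings; the type and function names are hypothetical and do not reflect the ImAc player's actual API.

```typescript
// Hypothetical sketch of per-user access-service settings for a 360° player.
// Names are illustrative, not the ImAc player's actual API.

type AccessService = "subtitles" | "audioDescription" | "signLanguage";

interface AccessPreferences {
  enabled: Set<AccessService>;
  subtitleSize: "small" | "medium" | "large";
  // "fixed" keeps subtitles in the viewport; "attached" anchors them to the
  // speaker's position in the 360° scene.
  subtitlePlacement: "fixed" | "attached";
  signLanguageCorner: "bottom-left" | "bottom-right";
}

function defaultPreferences(): AccessPreferences {
  return {
    enabled: new Set<AccessService>(["subtitles"]),
    subtitleSize: "medium",
    subtitlePlacement: "fixed",
    signLanguageCorner: "bottom-right",
  };
}

// Toggle a service on or off, e.g. from a menu interaction or voice command.
function toggleService(prefs: AccessPreferences, service: AccessService): void {
  if (prefs.enabled.has(service)) {
    prefs.enabled.delete(service);
  } else {
    prefs.enabled.add(service);
  }
}

const prefs = defaultPreferences();
toggleService(prefs, "signLanguage");
console.log([...prefs.enabled]); // ["subtitles", "signLanguage"]
```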

Abstract Painting Practice: Expanding in a Virtual World

  • Alison Goodyear – Faculty of Arts, Science and Technology, University of Northampton, Northampton, United Kingdom
  • Dr Mu Mu – The University of Northampton, Northampton, United Kingdom

Abstract: This paper sets out to describe, through a demo for the TVX Conference, how virtual reality (VR) painting software is beginning to open up as a new medium for visual artists working in the field of abstract painting. The demo achieves this by describing how an artist who usually makes abstract paintings with paint and canvas in a studio, that is, paintings existing as physical objects in the world, encounters and negotiates the process of making abstract paintings in VR using Tilt Brush software and head-mounted displays (HMDs). This paper also indicates potential future avenues for content creation in this emerging field and what this might mean not only for the artist and the viewer, but also for art institutions trying to provide effective methods of delivery for innovative content in order to develop and grow new audiences.

Deb8: A Tool for Collaborative Analysis of Video

  • Guilherme Carneiro – School of Computer Science, University of St Andrews, St Andrews, United Kingdom

Abstract: Public, parliamentary and television debates are commonplace in modern democracies. However, developing an understanding and communicating with others are often limited to passive viewing or, at best, textual discussion on social media. To address this, we present the design and implementation of Deb8, a tool that allows collaborative analysis of video-based TV debates. The tool provides a novel UI designed to enable and capture rich synchronous collaborative discussion of videos, based on argumentation graphs that link quotes from the video, opinions, questions, and external evidence. Deb8 supports the creation of rich idea structures based on argumentation theory, as well as collaborative tagging of the relevance, support and trustworthiness of the different elements. We report an evaluation of the tool design and a reflection on the challenges involved.
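
The abstract mentions argumentation graphs linking video quotes, opinions, questions, and external evidence, with collaborative tagging of relevance, support and trustworthiness. A minimal sketch of such a graph's data model is given below; the names and structure are hypothetical and do not describe Deb8's actual implementation.

```typescript
// Hypothetical data model for an argumentation graph over a debate video.
// Names are illustrative; they do not reflect Deb8's internals.

type NodeKind = "quote" | "opinion" | "question" | "evidence";

interface GraphNode {
  id: string;
  kind: NodeKind;
  author: string;
  text: string;
  // Present only for "quote" nodes: the video segment being referenced.
  videoTime?: { startSec: number; endSec: number };
}

interface GraphEdge {
  from: string; // id of the node making the point
  to: string;   // id of the node it responds to
  relation: "supports" | "opposes" | "questions";
  // Collaborative ratings attached by other participants.
  ratings: { relevance: number; trustworthiness: number }[];
}

interface ArgumentGraph {
  nodes: Map<string, GraphNode>;
  edges: GraphEdge[];
}

function addNode(graph: ArgumentGraph, node: GraphNode): void {
  graph.nodes.set(node.id, node);
}

// Example: an opinion supporting a quoted statement from the debate.
const graph: ArgumentGraph = { nodes: new Map(), edges: [] };
addNode(graph, {
  id: "q1", kind: "quote", author: "alice",
  text: "Quoted claim from the debate", videoTime: { startSec: 312, endSec: 330 },
});
addNode(graph, { id: "o1", kind: "opinion", author: "bob", text: "This matches the published figures." });
graph.edges.push({ from: "o1", to: "q1", relation: "supports", ratings: [{ relevance: 5, trustworthiness: 4 }] });
```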

Framework for Web Delivery of Immersive Audio Experiences Using Device Orchestration

  • Kristian Hentschel – BBC R&D, Salford, United Kingdom
  • Jon Francombe – BBC R&D, Salford, United Kingdom

Abstract: This demonstration introduces the use of orchestrated media devices and object-based broadcasting to create immersive spatial audio experiences. Mobile phones, tablets, and laptops are synchronised to a common media timeline and each contribute one or more individually delivered audio objects to the overall mix. A rule set for assigning objects to devices was developed through a trial production: a 13-minute audio drama called The Vostok-K Incident. The framework uses the same timing model as HbbTV 2.0 media synchronisation, so future work could augment linear television broadcasts or create novel interactive audio-visual experiences for multiple users. The demonstration will allow delegates to connect their mobile phones to the system; a unique mix is created based on the number and selected locations of connected devices.
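
The abstract describes audio objects being assigned to connected devices via a rule set, with the mix depending on how many devices join and where listeners place them. The toy sketch below illustrates one possible assignment rule; the structures and rule are assumptions for illustration and do not reproduce the production rule set used for The Vostok-K Incident.

```typescript
// Toy sketch of rule-based assignment of audio objects to orchestrated devices.
// Structures and rules are illustrative, not the BBC R&D framework's actual API.

interface Device {
  id: string;
  location: "left" | "right" | "centre"; // listener-selected position
  joinedAtSec: number;                   // when it synchronised to the timeline
}

interface AudioObject {
  id: string;
  role: "narration" | "effect" | "music";
  preferredLocation?: "left" | "right" | "centre";
}

// Assign each object to a device: match the preferred location when possible,
// otherwise spread objects across devices in join order.
function assign(objects: AudioObject[], devices: Device[]): Map<string, string> {
  const byJoin = [...devices].sort((a, b) => a.joinedAtSec - b.joinedAtSec);
  const result = new Map<string, string>();
  objects.forEach((obj, i) => {
    const match = byJoin.find(d => d.location === obj.preferredLocation);
    const target = match ?? byJoin[i % byJoin.length];
    if (target) result.set(obj.id, target.id);
  });
  return result;
}

const mix = assign(
  [{ id: "voice", role: "narration", preferredLocation: "centre" },
   { id: "radio", role: "effect", preferredLocation: "left" }],
  [{ id: "phoneA", location: "centre", joinedAtSec: 3 },
   { id: "phoneB", location: "left", joinedAtSec: 10 }],
);
console.log(mix); // Map { "voice" => "phoneA", "radio" => "phoneB" }
```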

Snapscreen Clip Share: Utilizing Computer Vision to Bridge TV and Social Media

  • Mr Thomas Willomitzer – Snapscreen, Vienna, Austria
  • Tyler Tracy – Snapscreen, Vienna, Austria
  • Markus Rumler – Snapscreen, Vienna, Austria

Abstract: This demonstration showcases Snapscreen Clip Share: a second-screen technology for seamless identification and social sharing of live or recorded TV content. With Clip Share, app users take a snapshot of their viewing screen to generate a broadcast-quality clip of the current program instantly on their mobile device; they can then rewind through the retrieved segment, trim the beginning and end of their clip, add a personal message to kick off discussion, and share the clip through a range of messaging apps and social media platforms. Where existing clip solutions allow broadcasters and rights-holders to produce clips from broadcast content, Clip Share facilitates fast and easy clipping for app users in order to drive content distribution and recirculation by viewers themselves. Leveraging computer vision to streamline clip creation and sharing provides an intuitive bridge between TV content and social media interactions.
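
The abstract refers to computer vision used to identify the programme from a snapshot of the viewing screen. One common family of techniques for this kind of matching is perceptual hashing of frames; the sketch below illustrates that general idea only and is an assumption, not Snapscreen's actual recognition pipeline.

```typescript
// Illustrative average-hash matching of a snapshot against candidate broadcast frames.
// A generic technique sketch, not Snapscreen's actual algorithm.

// A frame is assumed to be pre-scaled to 8x8 greyscale pixels (values 0-255).
type TinyFrame = number[]; // length 64

function averageHash(frame: TinyFrame): boolean[] {
  const mean = frame.reduce((sum, v) => sum + v, 0) / frame.length;
  return frame.map(v => v > mean);
}

function hammingDistance(a: boolean[], b: boolean[]): number {
  return a.reduce((dist, bit, i) => dist + (bit !== b[i] ? 1 : 0), 0);
}

// Return the index of the candidate frame whose hash is closest to the snapshot's.
function bestMatch(snapshot: TinyFrame, candidates: TinyFrame[]): number {
  const target = averageHash(snapshot);
  let bestIndex = 0;
  let bestDist = Infinity;
  candidates.forEach((candidate, i) => {
    const dist = hammingDistance(target, averageHash(candidate));
    if (dist < bestDist) { bestDist = dist; bestIndex = i; }
  });
  return bestIndex;
}
```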

Visualizing Gaze Presence for 360° Camera

  • David A. Shamma – FXPAL, Palo Alto, California, United States
  • Tony Dunnigan – FXPAL, Palo Alto, California, United States
  • Yulius Tjahjadi – FXPAL, Palo Alto, California, United States
  • John J Doherty – FXPAL, Palo Alto, California, United States

Abstract: Advances in 360° cameras have led to a growing number of 360° livestreams. In the case of video conferencing, 360° cameras provide almost unrestricted visibility into a conference room for a remote viewer without the need for an articulating camera. However, local participants are left wondering whether anyone is connected and where remote participants might be looking. To address this, we fabricated a prototype device that shows the gaze and presence of remote 360° viewers using a ring of LEDs that match the remote viewports. We discuss the long-term use of one of the prototypes in a lecture hall and present future directions for visualizing gaze presence in 360° video streams.
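
The abstract describes a ring of LEDs around the camera that mirrors where remote viewers' viewports are pointing. The small sketch below maps viewport yaw to an LED index; the parameters (24 LEDs, yaw in degrees) are assumptions for illustration, not the FXPAL prototype's specification.

```typescript
// Illustrative mapping of remote viewers' viewport centres to LEDs on a ring.
// LED count and angle convention are assumptions, not the prototype's firmware.

const LED_COUNT = 24;

// Map a viewport yaw in degrees (0° = camera front) to the nearest LED index.
function ledForYaw(yawDegrees: number): number {
  const normalized = ((yawDegrees % 360) + 360) % 360;
  return Math.round(normalized / (360 / LED_COUNT)) % LED_COUNT;
}

// Light one LED per connected viewer; unlit LEDs mean no one is looking that way.
function ledStates(viewerYaws: number[]): boolean[] {
  const states = new Array<boolean>(LED_COUNT).fill(false);
  for (const yaw of viewerYaws) states[ledForYaw(yaw)] = true;
  return states;
}

console.log(ledStates([10, 182])); // LEDs near the front and the back of the ring are lit
```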

Situated Immersion: The Living Room of the Future

  • Adrian Gradinar – Imagination Lancaster, Lancaster University, Lancaster, Lancashire, United Kingdom
  • Joseph Lindley – Lancaster University, Lancaster, Lancashire, United Kingdom
  • Paul Coulton – LICA, Lancaster University, Lancaster, United Kingdom
  • Mr Ian Forrester – BBC R&D, Manchester, United Kingdom
  • Phil Stenton – BBC Research & Development, Salford, United Kingdom

Abstract: This paper presents the Living Room of the Future, which explores new forms of immersive experience that utilise Object-Based Media to deliver media that is personalised, adaptable, dynamic, and responsive. It builds upon work on Perceptive Media, Internet of Things Storytelling, and Experiential Futures and, in contrast to approaches that simply conflate immersion with increased visual fidelity, proposes subtle and nuanced ways to immerse audiences in a situated context. The room-sized prototype demonstrates this approach to immersion and includes connected devices that provide contextual data to personalise the media, as well as physical elements that enhance the immersive experience.
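
Object-Based Media, as referenced in the abstract, treats a programme as a set of media objects that can be selected and adapted per audience and context. The sketch below shows context-driven object selection in the most minimal form; the object names and context fields are hypothetical and do not describe the actual Living Room of the Future implementation.

```typescript
// Illustrative selection of object-based media elements from contextual data.
// Object names and context fields are hypothetical, not the actual prototype's.

interface RoomContext {
  ambientLightLux: number;              // e.g. from a connected light sensor
  occupants: number;                    // e.g. from presence sensing
  timeOfDay: "day" | "evening";
}

interface MediaObject {
  id: string;
  // Only include this object in the experience when the predicate holds.
  when: (ctx: RoomContext) => boolean;
}

const objects: MediaObject[] = [
  { id: "storm-lighting-cue", when: ctx => ctx.ambientLightLux < 50 },
  { id: "group-narration",    when: ctx => ctx.occupants > 1 },
  { id: "solo-narration",     when: ctx => ctx.occupants <= 1 },
  { id: "evening-soundbed",   when: ctx => ctx.timeOfDay === "evening" },
];

function selectObjects(ctx: RoomContext): string[] {
  return objects.filter(obj => obj.when(ctx)).map(obj => obj.id);
}

console.log(selectObjects({ ambientLightLux: 30, occupants: 2, timeOfDay: "evening" }));
// ["storm-lighting-cue", "group-narration", "evening-soundbed"]
```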