WORK-IN-PROGRESS
The Work-in-Progress (WiP) session solicits recent viewpoints, new discoveries, and early-stage design and development work in disciplines aligned with TVX’s areas of interest. It provides a unique opportunity for exchanging brave new ideas, receiving feedback, and fostering collaborations. This year, we also introduce a Project-in-Progress special track targeting contributions from ongoing major research initiatives, such as European Commission-funded or other similar-scale projects, to enable cross-project discussions.
We received a total of 17 high-quality submissions (14 on the standard track and 3 on the Project-in-Progress track) from Europe, America and Asia. The authors of three submissions participated in the Mentoring Programme to receive feedback and guidance from experienced researchers. Based on reviews from Associate Chairs and other experts in the field, 10 submissions were accepted.
WiP papers will be presented as short pitch presentations and physical posters at the conference, and will be included in the conference adjunct proceedings indexed in the ACM Digital Library. We will also announce the winner of the WiP Best Paper Award soon.
Since the authors are preparing their camera-ready versions, the titles and abstracts below are still subject to change.
List of accepted papers:
- Subtitles in 360-degree Video
- User Research and Design for Live TV UX in China
- HEd: A Flexible HbbTV WYSIWYG Visual Authoring Tool
- Investigating the Effect of Relative Time Delay on Companion Screen Experiences
- A Middleware to Enable Immersive Multi-Device Online TV Experience
- Beyond The Timeline: A Data-Driven Interface For Interactive Documentary
- Exploring Online Video Databases by Visual Navigation
- Mobile Devices and Professional Equipment Synergies for Sport Summary Production
- Enhancing Use of Social Media in TV Broadcasting
- Multi-Screen Director: a New Role in the TV Production Workflow?
Paper Details
- Subtitles in 360-degree Video
Andy Brown – Research & Development, BBC, Salford, United Kingdom
Jayson Turner – BBC Research and Development, BBC, Manchester, United Kingdom
Jake Patterson – Research & Development, BBC, Salford, United Kingdom
Anastasia Schmitz – Research & Development, BBC, Salford, United Kingdom
Mike Armstrong – BBC Research and Development, BBC, Manchester, United Kingdom
Maxine Glancy – BBC Research & Development, BBC, Manchester, United Kingdom
Abstract: Currently, there are no agreed-upon user experience guidelines for subtitling (closed captions) in immersive 360-degree video experiences. It is not clear how subtitles might be acceptably displayed in this context, specifically how to balance comprehension, the freedom to look around the scene, and immersion. This work-in-progress describes four subtitle behaviours that we have designed and implemented in order to perform user testing. We describe our rationale for each behaviour and discuss our initial hypotheses for a full empirical investigation.
- User Research and Design for Live TV UX in China
Rui Zhang – UX Lab, Samsung Electronics (China) R&D Center, Nanjing, Jiangsu, China
Ye Deng – UX Lab, Samsung Electronics (China) R&D Center, Nanjing, Jiangsu, China
Lei Shi – UX Lab, Samsung Electronics (China) R&D Center, Nanjing, Jiangsu, China
Abstract: In the current Chinese TV market, live TV programs are buffeted by internet content from smart TVs, over-the-top (OTT) boxes, and other screens such as mobile phones and tablets. The user experience of live TV watching does not satisfy user needs. We defined typical Chinese TV users (the elderly and housewives) and then conducted qualitative and quantitative studies. In the qualitative study, telephone interviews with 30 users were conducted to understand the general behavior of Chinese users when watching live TV. We designed a questionnaire based on the interview results and then conducted a survey with 100 typical Chinese TV users to understand their habits, needs and pain points associated with the live TV watching experience. Based on our research, we identified three live TV UX directions: 1. integrate live show functions with internet functions; 2. base smart services on users’ habits and needs; 3. information visualization. We explored some design hypotheses for future live TV viewing scenarios based on these identified research directions.
- HEd: A Flexible HbbTV WYSIWYG Visual Authoring Tool
Carlos Navarrete Puentes – Systems and Computer Engineering, Universidad de los Andes, Bogotá, Bogotá, Colombia
Jose Tiberio Hernandez – Systems and Computer Engineering, Universidad de los Andes, Bogotá, Bogotá, Colombia
Abstract: In recent years, no significant progress has been made towards making interactive TV a relevant tool for average household members. Little interactive content is on offer; hence, most viewers are familiar only with HD broadcasting. Due to its complexity, technical terminology, high costs, and the fact that applications are usually designed from scratch, interactive TV is not commonly used by viewers. This paper is based on the HbbTV (Hybrid Broadcast Broadband TV) specification, which has been adopted in various countries around the globe. The official statistics on HbbTV’s website show approximately 300 deployed services; since 2009, this amounts to an average of only around 3.12 services per month worldwide. This paper presents HEd (HbbTV Editor), a flexible visual authoring tool that aims to reduce the development time, complexity and cost of creating applications. In addition, HEd aims to enable the development of improved interactive applications related to the broadcast signal. Such content may include culture and education, as shown in the present case study: an application for an African-Colombian culture-related children’s TV show, in which viewers can interact by playing the cultural musical instruments or watching some of the traditional music groups.
- Investigating the Effect of Relative Time Delay on Companion Screen Experiences
Wei Liang Kenny Chua – UCL Interaction Centre, University College London, London, United Kingdom
Jacob Rigby – UCL Interaction Centre, University College London, London, United Kingdom
Duncan Brumby – UCL Interaction Centre, University College London, London, United Kingdom
Vinoba Vinayagamoorthy – British Broadcasting Corporation, London, United Kingdom
Abstract: Mobile devices are increasingly used while watching television, leading to the development of companion apps that complement programmes. A concern for these applications is the extent to which device and television content need to be temporally aligned. In this study, 18 participants watched a nature programme while being shown companion content on a tablet. The temporal synchronisation of content between the devices was varied. Participants completed questionnaires measuring immersion and affect and were tested on companion content recall. While there were no statistically significant effects on these measures, qualitative interviews with participants after viewing consistently revealed that longer (10 s) delays in content synchronisation were frustrating. This suggests that poor content synchronisation can produce a negative companion experience for viewers and should be avoided.
- A Middleware to Enable Immersive Multi-Device Online TV Experience
Hussein Ajam – Faculty of Arts, Science and Technology, The University of Northampton, Northampton, Northamptonshire, United Kingdom
Rajiv Ramdhany – Research and Development, BBC, London, United Kingdom
Mu Mu – The University of Northampton, Northampton, United Kingdom
Abstract: Recent years have witnessed a boom in smart-device technologies that are transforming the entertainment industry, especially traditional TV viewing experiences. In an effort to improve user engagement, many TV broadcasters are now investigating next-generation content production and presentation using emerging technologies. In this paper, we introduce ongoing work to enable immersive and interactive multi-device online TV experiences. Our project incorporates three essential developments: content authoring, device discovery, and cross-device media orchestration.
- Beyond The Timeline: A Data-Driven Interface For Interactive Documentary
Mirka Duijn – HKU University of the Arts, Utrecht, Netherlands
Hartmut Koenitz – HKU University of the Arts, Utrecht, Netherlands
Abstract: In this paper, we present work on the data-driven interface for The Industry, an interactive documentary in development about the Dutch illicit drug industry. The motivation for the work is to provide a more complete overview of a highly complex matter using a form of interactive digital narrative (IDN). As with many complex issues, news reports on illegal drugs in the Netherlands are mostly fragmented and reactive, which makes it difficult for audiences to gain a good understanding of the topic. Our approach starts by obtaining big data sets from the police and the media; on this foundation, a narrative interface will be designed. This paper reports on the iterative design approach, the interface metaphors, the lessons learned, and the current state of affairs. Our intent with this paper is to fuel a discussion on narrative representations of complexity.
- Exploring Online Video Databases by Visual Navigation
Wolfgang Hürst – Utrecht University, Utrecht, Netherlands
Bruno dos Santos Carvalhal – Utrecht University, Utrecht, Netherlands
Abstract: We present an interface design for the interactive exploration of large movie databases based on a concept we call visual navigation. Our approach aims to combine the major advantages of existing systems, which are commonly either simple but limited in functionality, or powerful but complex and less engaging. To verify the potential of our idea, we performed a pilot study, which indicates the validity of our approach, highlights its advantages, and pinpoints areas for improvement and future work.
- Mobile Devices and Professional Equipment Synergies for Sport Summary Production
Sujeet Mate – Nokia Technologies, Tampere, Finland
Igor Curcio – Nokia Technologies, Tampere, Finland
Ranjeeth Shetty – Tampere University of Technology, Tampere, Finland
Francesco Cricri – Nokia Technologies, Tampere, Finland
Abstract: We present a novel approach for sport video summary production that leverages the best aspects of mobile devices and professional equipment. The proposed recording set-up and workflow, consisting of both types of devices, has two main advantages compared to conventional techniques. Firstly, it reduces the cost of content production by reducing the equipment and crew required for content capture. Secondly, it reduces the time needed for content production by leveraging automation. Subsequently, a tunable summary production approach is presented for creating a multi-camera representation of a salient event; incorporating cinematic rules creates an aesthetically pleasing viewing experience. Interactive production of the summary enables professional users as well as second-screen device (mobile, tablet, etc.) users to create a summary in which highly ranked salient events are included based on their subjective viewing value. Furthermore, automation provides a framework for the easy inclusion of crowdsourced content. The proposed hybrid production method is illustrated here using basketball as an example.
- Enhancing Use of Social Media in TV Broadcasting
Sebastian Arndt – NTNU Norwegian University of Science and Technology, Trondheim, Norway
Veli-Pekka Räty – Department of Electronic Systems, NTNU, Norwegian University of Science and Technology, Trondheim, Norway
Taco Nieuwenhuis – never.no, Oslo, Norway
Christian Keimel – IRT, Munich, Germany
Francisco Ibanez – Brainstorm Multimedia, Valencia, Spain
Andrew Perkis – NTNU Norwegian University of Science and Technology, Trondheim, Norway
Abstract: Traditional linear TV is decreasing in popularity, and the broadcast industry has identified the need to change how it communicates with its audience as a way to counteract this. Younger generations in particular use social media around the clock and would like to continue doing so while watching TV. The VisualMedia project takes on this challenge by enriching the TV experience with elements and information from social media in real time during live broadcasts. It enables broadcasters to select and distribute posts and stats from various social media sources in a fast and reliable way. With the help of VisualMedia, posts and stats can be shown within live programmes using enhanced graphical representations, or on the second screen, with minimal latency. The goal of the project is to deliver a framework that combines all necessary steps, from retrieving posts to delivering them into a live TV show. This gives broadcasters the opportunity to react quickly to their audiences, and makes it easier for audiences to interact with their broadcaster.
- Multi-Screen Director: a New Role in the TV Production Workflow?
Britta Meixner – Centrum Wiskunde & Informatica (CWI), Amsterdam, Netherlands
Maxine Glancy – BBC Research & Development, BBC, Manchester, United Kingdom
Matt Rogers – BBC Research & Development, BBC, Manchester, United Kingdom
Caroline Ward – BBC Research & Development, BBC, Manchester, United Kingdom
Thomas Röggla – Distributed Interactive Systems, Centrum Wiskunde & Informatica, Amsterdam, Noord-Holland, Netherlands
Pablo Cesar – Centrum Wiskunde & Informatica, Amsterdam, The Netherlands
Abstract: Multi-screen applications have been a research topic for the last 10 years. Recent technical advances make the authoring and broadcasting of interactive multi-platform experiences possible. However, most efforts have been dedicated to delivery and transmission technology (e.g., HbbTV 2.0), not to the production process. The hypothesis of this article is that broadcast television requires radical changes in the production workflow in order to allow for the efficient production of interactive multi-platform experiences. This paper explores such changes, investigating workflows and roles and identifying key requirements for supporting them. The final objective is to create a new set of tools, extending current processes, that allows broadcasters to curate new types of experiences. Our methodology was to conduct a set of interviews with television producers and directors, which allowed us to identify two major (sub-)workflows: one for pre-recorded and one for live experiences. We could then assign roles to the different stages of the workflows and derive a number of requirements for the next generation of production tools.