BBC Research & Development

Posted by Stephen Jolly

Our team here at BBC R&D has spent the last couple of years working on ways to take advantage of the increasing number of smartphones, tablets and laptops in the homes of BBC audience members to enhance our TV and radio programmes and the way people interact with them. I've blogged previously about this area of work, including the Universal Control API, and the ways in which it could replace remote controls and enable what my colleague has called "orchestrated media" experiences, which consist of media presented on multiple devices that are synchronised to one another.

In that earlier blog post, I mention that technologies exist that are already being used to synchronise media on mobile devices to television programmes, such as audio fingerprinting and watermarking, and delivering synchronisation information via the Internet. The advantage of all these solutions is that they require no modifications to the set-top box or television. A common disadvantage is that content on other devices can only follow what happens on the television, and not vice versa. In the longer term, we believe that a technology like Universal Control offers very significant advantages in this regard, but we recently took advantage of an opportunity to work with colleagues from across the BBC to investigate some of these existing methods of synchronisation, to see what kinds of "dual screen" experience might be possible today.

Our most significant contribution to the work has been an API that permits the developers of dual-screen applications to ignore the details of specific synchronisation technologies. It provides a standard interface, behind which any number of information sources may be working (individually or together) to provide the application with information about which programme the user is watching (if any), and what events are occurring in it that might trigger synchronised behaviour. This approach helps the BBC avoid getting locked into the technology of a specific supplier, and helps "future-proof" applications: as new synchronisation technologies become available, little or no extra effort should be required for existing applications to make use of them. (Of course, one of the sources of information could be a Universal Control server on the set-top box...)

A high-level overview of the sync API. A library implementing the API can be embedded into BBC apps and websites to provide them with information about the programme a user is watching on their television and trigger events at predefined points within that programme.
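To make the shape of that abstraction concrete, here is a minimal sketch in TypeScript. All of the names below (SyncSource, SyncReport, SyncClient and so on) are invented for illustration; this is not the actual API, just the general pattern it describes.

```typescript
/** An event at a known point in a programme, e.g. the start of a quiz question. */
interface ProgrammeEvent {
  id: string;        // e.g. "question-3-start"
  offsetMs: number;  // position of the event within the programme
}

/** What any synchronisation source can tell the application. */
interface SyncReport {
  programmeId: string | null;  // null if the source cannot identify a programme
  positionMs: number;          // estimated playback position on the television
  confidence: number;          // 0..1: how much to trust this estimate
}

/** Any technology (watermarking, XMPP, Universal Control...) sits behind this. */
interface SyncSource {
  start(onReport: (report: SyncReport) => void): void;
  stop(): void;
}

/** The application registers sources and reacts to their reports; it never
 *  needs to know which underlying technology produced a given report. */
class SyncClient {
  private sources: SyncSource[] = [];

  constructor(private onReport: (report: SyncReport) => void) {}

  addSource(source: SyncSource): void {
    this.sources.push(source);
    source.start(this.onReport);
  }

  stopAll(): void {
    for (const source of this.sources) {
      source.stop();
    }
  }
}
```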

This autumn, we've been testing our API design as part of a closed BBC pilot accompanying the National Lottery show Secret Fortune. Up to two hundred viewers have been taking part, playing along with the quiz on their smartphone or tablet devices to see whether they can do better than the people playing the game in the studio.

Although we have learned many things from the pilot about the practicalities of this kind of dual-screen experience and how it can improve the programmes it accompanies, from our R&D perspective the most important lessons have concerned the performance of the synchronisation technologies and the usefulness of our API. For this pilot, we chose to test two technologies: an audio watermarking system from a commercial supplier, and an XMPP-based system for delivering synchronisation information via the Internet that we developed ourselves.

The Secret Fortune dual-screen application in action.
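Continuing the hypothetical sketch from above, the two pilot technologies would each sit behind the same interface, so the play-along application does not change when one is swapped for the other. The class bodies here are stubs; the real integrations depend on the supplier's watermark detector and on the XMPP library in use.

```typescript
// Stub implementations for illustration only.
class WatermarkSource implements SyncSource {
  start(onReport: (report: SyncReport) => void): void {
    // In a real implementation: ask the supplier's watermark detector to
    // listen via the microphone, and call onReport each time a watermark
    // (carrying a programme identifier and position) is decoded.
  }
  stop(): void {
    // Release the microphone and shut down the detector.
  }
}

class XmppSource implements SyncSource {
  start(onReport: (report: SyncReport) => void): void {
    // In a real implementation: subscribe to a broadcast-time event feed
    // over XMPP, and call onReport for each message from the central server.
  }
  stop(): void {
    // Disconnect from the XMPP server.
  }
}

declare function handleReport(report: SyncReport): void; // app-specific

const client = new SyncClient(handleReport);
client.addSource(new WatermarkSource());
client.addSource(new XmppSource());
```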

Even from this purely technical perspective, we learned a number of useful things from the pilot. For example, audio watermarks take a finite amount of time to detect, which prevented us from implementing any watermark-driven synchronised behaviour at the very beginning of a programme (or programme segment). Also, while it is obvious that signalling events in programmes from a central server on the Internet can only work for people watching the programme as it is broadcast, it is perhaps less obvious that people watching on different broadcast platforms (e.g. Freeview, Freesat or analogue TV) see a given part of the programme at slightly different times, which reduces the synchronisation accuracy accordingly.
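One plausible mitigation for the platform-delay problem, sketched below, is a per-platform offset applied to centrally signalled event times. The platform names are the ones mentioned above, but the delay figures are invented for illustration; real offsets would need to be measured.

```typescript
// Invented figures: a real deployment would measure these per platform.
const platformDelayMs: Record<string, number> = {
  freeview: 0,     // arbitrary baseline
  freesat: 800,    // hypothetical extra latency on the satellite path
  analogue: -400,  // hypothetical: the analogue signal arriving slightly earlier
};

/** Shift a centrally signalled event time to match this viewer's platform. */
function adjustedTriggerTime(broadcastTimeMs: number, platform: string): number {
  return broadcastTimeMs + (platformDelayMs[platform] ?? 0);
}
```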

From the perspective of our own work, we have been very happy with the performance of our prototype API implementation. The team developing the play-along smartphone and tablet app found it extremely straightforward to integrate, and the ability to turn synchronisation sources on and off by changing a web-based configuration file has been critical to testing those different sources without requiring members of the pilot to install new versions each week.
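That configuration mechanism might look something like the sketch below, which continues the hypothetical example: the app fetches a small file from the web and only enables the sources it lists. The URL and field names are invented.

```typescript
interface SyncConfig {
  enabledSources: string[];  // e.g. ["watermark"], ["xmpp"], or both
}

async function configureSources(client: SyncClient): Promise<void> {
  // Hypothetical URL: changing this file on the server changes behaviour
  // for every participant without shipping a new version of the app.
  const response = await fetch("https://example.org/pilot/sync-config.json");
  const config: SyncConfig = await response.json();

  if (config.enabledSources.includes("watermark")) {
    client.addSource(new WatermarkSource());
  }
  if (config.enabledSources.includes("xmpp")) {
    client.addSource(new XmppSource());
  }
}
```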

In addition, by using the abstraction inherent to the API to decouple the times at which "events" (such as the start of a question) are triggered from the times at which the audio watermarks are detected or the XMPP messages are received, we have been able to improve the robustness of the experience (by allowing an event to be triggered even if certain watermarks or messages are missed). At the same time, this gave us the opportunity to fine-tune the timing of events right up to the point of broadcast.
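One way to picture this decoupling, again in terms of the hypothetical sketch: watermark detections and XMPP messages only update an estimate of the programme clock, and events are scheduled against that clock. A missed watermark or message then degrades the position estimate slightly rather than causing a missed event.

```typescript
/** Estimates the programme position, extrapolating between source reports. */
class ProgrammeClock {
  private basePositionMs = 0;
  private baseWallClockMs = Date.now();

  /** Called whenever any synchronisation source reports a position. */
  update(report: SyncReport): void {
    this.basePositionMs = report.positionMs;
    this.baseWallClockMs = Date.now();
  }

  /** Current estimated position within the programme. */
  now(): number {
    return this.basePositionMs + (Date.now() - this.baseWallClockMs);
  }
}

/** Fire each event once, when the estimated clock passes its offset. */
function scheduleEvents(clock: ProgrammeClock,
                        events: ProgrammeEvent[],
                        onEvent: (e: ProgrammeEvent) => void): void {
  const fired = new Set<string>();
  setInterval(() => {
    for (const e of events) {
      if (!fired.has(e.id) && clock.now() >= e.offsetMs) {
        fired.add(e.id);
        onEvent(e);  // fires even if the watermark nearest this event was missed
      }
    }
  }, 100);  // poll ten times a second
}
```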

Our existing API is inherently asymmetric: it only delivers event information in one direction, from the television to the mobile device. Over the coming months we will be identifying technologies, protocols or new APIs that could bring more 'symmetric' experiences to the user - look out for more blog posts on this subject. We will also be working on documenting our work and the knowledge we've gained from the pilot - just in case the BBC decides to take this idea further...
