BBC

Research & Development

What we're doing

Around 10% of television viewers in the UK use subtitles on a daily basis, and many more now use them when watching clips and programmes online. Subtitles help many people enjoy television in situations where the sound cannot be turned on, as well as providing an access service for people with hearing difficulties or language issues.

For the past four years we have been looking into ways in which the quality and quantity of subtitling can be improved for our audiences. Our work benefits from being part of a public service broadcaster: we have access to the resources of the BBC, including its audience research and programme archives, and with help from production teams we can also create bespoke content for our tests.

Audience surveys have helped us understand how people use subtitles. 90% of people who watch television with subtitles do so with the sound turned on; they use subtitles in combination with sound and lip reading to follow the programme. The surveys also help us design our user research to model how people use subtitles at home. We carry out research in a purpose-built lab that replicates a living room environment. We recruit representative groups of subtitle users to take part in our research, and using opinion scores and structured interviews we can build up a detailed understanding of the experience of using subtitles.
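Opinion scores from a study like this are typically summarised as a mean opinion score per clip or condition. As a minimal sketch (the function name and the 1-5 rating scale are assumptions for illustration, not the lab's actual tooling):

```python
from statistics import mean, stdev

def mean_opinion_score(ratings):
    """Aggregate 1-5 opinion scores from a panel of subtitle users.

    Returns the mean opinion score (MOS) and the sample standard
    deviation, the usual summary statistics for a rating study.
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    mos = mean(ratings)
    spread = stdev(ratings) if len(ratings) > 1 else 0.0
    return round(mos, 2), round(spread, 2)

# Example: nine participants rate one subtitled clip on a 1-5 scale.
scores = [4, 5, 3, 4, 4, 5, 3, 4, 4]
print(mean_opinion_score(scores))  # -> (4.0, 0.71)
```

The standard deviation matters as much as the mean here: structured interviews explain *why* a clip scored badly, but the spread shows whether the panel agreed at all.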

Subtitle availability

The most important issue for our audience is the availability of subtitles, and our focus now is on ways in which we can provide subtitles for the many thousands of video clips on the BBC's web pages. We have been developing ways of matching video clips on our web pages to the original broadcast programme in our archive, and then locating the matching subtitles for the web video. Our first prototype focused on the News web pages, where it was able to find matches for around 40% of the video clips. We have applied for a patent for our technique, which was also written up as a paper at the NAB 2015 conference. The disadvantage of this approach was that it required human intervention to verify, and if necessary edit, the subtitles it produced, because the subtitles being recycled had been produced live.
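The core of any clip-to-programme matching step is locating where in the broadcast a short clip came from. The patented technique itself isn't described here, so this is only an illustrative sketch: it assumes each piece of video has been reduced to a per-frame fingerprint (modelled as a list of hashes) and slides the clip's fingerprint along the programme's to find the best-scoring offset.

```python
def locate_clip(clip_fp, programme_fp, threshold=0.9):
    """Slide the clip's fingerprint along the programme's and return
    the frame offset of the best match, or None if no window clears
    the match threshold. Fingerprints are per-frame hash sequences.
    """
    best_offset, best_score = None, 0.0
    n = len(clip_fp)
    for offset in range(len(programme_fp) - n + 1):
        window = programme_fp[offset:offset + n]
        score = sum(a == b for a, b in zip(clip_fp, window)) / n
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset if best_score >= threshold else None

programme = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
clip = [5, 9, 2, 6]
print(locate_clip(clip, programme))  # -> 4
```

A real system would use robust audio or video fingerprints rather than exact hashes, but the shape of the search, and the need for a confidence threshold before trusting a match, is the same.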

We have now developed an approach that is entirely automatic, capable of finding and verifying subtitles for video clips that were sourced from pre-recorded programmes. The work has produced a successful proof-of-concept implementation, which has been tested against a set of over 7,000 video clips from the BBC Bitesize web site. The system returned matching subtitles for around 47% of the clips. Further tests with other BBC brands indicate that around half of the video clips on the BBC web site could be subtitled in this way. The key issue here is to enable the production of subtitles for web clips without requiring human intervention, enabling tens, if not hundreds of thousands of video clips to be subtitled at a marginal cost. This work has now been written up as a paper, which was presented at IBC2016.
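Once a clip has been matched to a point in the archive programme, the remaining step is mechanical: select the subtitle cues that fall within the clip's time range and re-base their timestamps so the clip starts at zero. A minimal sketch of that step (the tuple representation of cues is an assumption; real subtitles would be in a timed-text format such as EBU-TT or WebVTT):

```python
def subtitles_for_clip(cues, clip_start, clip_end):
    """Given programme subtitle cues as (start, end, text) tuples in
    seconds, keep those that fall inside the matched clip's time range
    and re-base their timestamps so the clip starts at zero.
    """
    out = []
    for start, end, text in cues:
        if start >= clip_start and end <= clip_end:
            out.append((start - clip_start, end - clip_start, text))
    return out

programme_cues = [
    (0.0, 2.0, "Hello and welcome."),
    (10.0, 12.5, "Here is the news."),
    (13.0, 15.0, "More later."),
]
print(subtitles_for_clip(programme_cues, 10.0, 16.0))
```

Because the source programmes here are pre-recorded, the cues are already accurate and in sync, which is what makes the fully automatic pipeline viable where the live-subtitle version was not.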

Subtitle quality

We published two papers on subtitle quality.

The first was based on user research we carried out in March and April 2015. The tests used specially shot news stories, read at a series of different word rates, along with a series of off-air clips. The results showed that subtitle users want the subtitles to match the word rate of the speech, even when the rate of the speech far exceeds current subtitling guidelines. Indeed, subtitle users' ratings of the speed of the news clips closely matched the ratings given by hearing viewers watching without subtitles. This paper was amongst the top eight in the conference and was subsequently published as a journal paper, and James wrote an accompanying blog post. This work has overturned previous claims from some academics and supports the working practices of much of the industry.
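The word rate at the centre of this study is straightforward to measure from timed subtitle cues: total words divided by the time span they cover. A minimal sketch (the cue representation is assumed, as above):

```python
def words_per_minute(cues):
    """Estimate subtitle word rate: total words divided by the span of
    the cue timings. Cues are (start_s, end_s, text) tuples.
    """
    total_words = sum(len(text.split()) for _, _, text in cues)
    duration_s = cues[-1][1] - cues[0][0]
    return 60.0 * total_words / duration_s

cues = [
    (0.0, 3.0, "The quick brown fox jumps"),
    (3.0, 6.0, "over the lazy dog today"),
]
print(words_per_minute(cues))  # 10 words over 6 s -> 100.0 wpm
```

Traditional guidelines cap subtitles well below fast speech rates; the finding here is that viewers rate verbatim subtitles at the speaker's natural rate as appropriate, which is why matching the measured rate matters.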

The second paper gave an overview of our subtitles research over the past two years, bringing together work which had been published at academic conferences with developments in our understanding of the viewers' experience of subtitles.

Previous work

At IBC 2015 we also demonstrated our work on Responsive Subtitles, which showed the potential for subtitles to be formatted into blocks in response to the device capabilities and user input. This work was first presented in a paper at the Web for All conference in May 2015.
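The core idea of responsive subtitles is that the text, rather than fixed broadcast-shaped line breaks, is the source of truth, and the blocks are recomputed for the display. This is a much-simplified sketch of that idea, not the system described in the paper, using only line width as the responsive variable:

```python
import textwrap

def reflow_subtitles(text, max_chars):
    """Re-block subtitle text for a given device width: the same text
    is re-wrapped into lines of at most max_chars characters, rather
    than carrying fixed broadcast line breaks to every screen.
    """
    return textwrap.wrap(text, width=max_chars)

caption = "The committee will publish its findings early next year"
print(reflow_subtitles(caption, 20))   # narrow, phone-style layout
print(reflow_subtitles(caption, 40))   # wider, tablet-style layout
```

A full implementation would also respect sensible break points (not splitting within a clause), font metrics, and user preferences such as text size, which is where the interaction with device capabilities comes in.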

In December 2014 we carried out a programme of user research in collaboration with a researcher who was on placement with BBC R&D from the University of Dundee at the time. This research looked at how subtitles could be presented with a video clip on a web page, adjustment of subtitle size, and related follow-up work. This work is in the process of being written up as a series of papers: the first was presented in June 2015, and the second in October 2015, along with a short paper presented in the same session.

When we first started this work, we began by looking into ways in which we might use language models for individual programme topics to improve the performance of speech-to-text engines and to detect errors in existing subtitles. We had some early success modelling weather forecast subtitles, which suggests there may be some value in this approach, but it appears that other topics would be less successful. See White Paper WHP 256 for more details.
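A topic-specific language model works for a domain like weather forecasts because the vocabulary is small and predictable, so improbable words stand out as likely errors. As a toy illustration of the idea (a unigram model with an assumed probability threshold, far simpler than anything in the white paper):

```python
from collections import Counter

def build_topic_model(corpus_lines):
    """A toy unigram 'topic model': word frequencies estimated from
    subtitles of a single topic, e.g. weather forecasts."""
    counts = Counter(w for line in corpus_lines for w in line.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def flag_unlikely_words(model, subtitle, min_prob=0.01):
    """Flag words the topic model considers improbable - candidate
    recognition or transcription errors, in this sketch."""
    return [w for w in subtitle.lower().split() if model.get(w, 0.0) < min_prob]

weather = [
    "rain spreading from the west tonight",
    "sunny spells and scattered showers",
    "rain clearing to sunny spells later",
]
model = build_topic_model(weather)
print(flag_unlikely_words(model, "rain and sunny smells later"))  # -> ['smells']
```

The same approach degrades on open-ended topics (drama, general news) where almost any word is plausible, which is consistent with the mixed results reported above.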

Then, at the request of our Technology, Distribution and Archives, Solution Design team, we carried out a ground-breaking study into the relative impact of subtitle delay and subtitle accuracy. This work required the development of new test methodologies based on industry standards for measuring audio quality. A user study was carried out in December 2012 with a broad sample of people who regularly use subtitles when watching television. The results were presented at IBC2013 in September and are available as White Paper WHP 259. Following on from this work, the BBC and its subtitling partners have been making significant improvements to live subtitles on news bulletins by using the presenters' scripts to create the subtitles. This can result in news bulletins with word-for-word subtitles presented without delay and without errors. Further work by Trevor Ware and industry partners has led to the development of a way of retiming live subtitles by making use of the video coding delay. This work has been published as a white paper.
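The retiming idea exploits the fact that the encoder already delays the pictures by a second or more, so live subtitle cues, which lag the speech, can be pulled back by that amount before transmission. A minimal sketch of the timestamp arithmetic (the cue representation and function name are illustrative, not the published method's actual interface):

```python
def retime_live_subtitles(cues, coding_delay_s):
    """Shift live subtitle cues earlier by the video coding delay.

    Because the encoder delays the pictures, subtitles produced live
    (and therefore late) can be pulled back toward the moment of
    speech without needing the subtitles any sooner.
    """
    return [(max(0.0, start - coding_delay_s),
             max(0.0, end - coding_delay_s), text)
            for start, end, text in cues]

live_cues = [(5.0, 7.0, "Good evening."),
             (8.5, 11.0, "Our top story tonight...")]
print(retime_live_subtitles(live_cues, 3.0))
```

The clamp to zero is just defensive; in practice the available correction is bounded by the coding delay, so subtitles that lag by more than the delay can only be partially realigned.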

BBC Audiences have conducted surveys for us to provide background data on how many people use subtitles, how they use them, and what issues they face. More recently we have started to examine the iPlayer statistics on subtitle use, which have the potential to give us insight into the use of subtitles on a programme-by-programme basis. This data is in the process of being verified and should be made public in the coming months.

We have also built tools to allow us to track long-term trends in the aspects of subtitling that we can measure, such as position and reading rate, as originally outlined in White Paper WHP 255.

 

This project is part of the UX work stream.
