BBC Research & Development

Posted by Matthew Brooks and Tristan Ferne

A lot of our work in 2016 focused on structured metadata and objects for stories, and on how we might represent these on the web in new ways. Projects included structuring news stories and developing new formats, a production use for drama data, a website for catching up with your favourite drama series, and a cook-along learning experience.

The year started with a presentation in Helsinki, talking about object-based and structured stories.

We continued to develop our drama catch-up website. It was noticed by our marketing department, and we worked with them to deliver a version for a wider audience. This is a pretty simple but useful case of object-based media (OBM), more suitable for catching up and exploring.

Meanwhile, we were developing our structured news prototypes. Most of the team got involved as we developed the prototype into a pilot and integrated it with existing BBC systems. We're now wrapping this up by running a short but highly-instrumented trial with a closed group of target users.

As well as news, we were also applying our ideas to existing BBC dramas. We worked with Radio & Music to hand over our drama data work, which is now in use by the production team.

We got busy creating a multi-screen, multi-home, interactive and immersive home theatre experience. As well as finalising the architecture, we built video chat that brings homes together during intervals and a layout engine that can present content across multiple screens. We also presented a paper on the work.
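
To give a flavour of what a layout engine like this has to decide, here is a minimal sketch assuming a hypothetical data model of screens and content components (none of these types or names come from the project itself): each component is assigned to the most suitable screen that can actually display it.

```typescript
// Hypothetical sketch of assigning content components to screens,
// not the project's actual layout engine.

interface Screen {
  id: string;
  kind: "tv" | "tablet" | "phone";
  widthPx: number;
}

interface Component {
  id: string;
  preferredKinds: Array<Screen["kind"]>; // best screen types, in order
  minWidthPx: number;                    // smallest screen the component works on
}

// Assign each component to the first preferred screen that is wide enough,
// falling back to any screen that fits.
function layout(components: Component[], screens: Screen[]): Map<string, string> {
  const placement = new Map<string, string>(); // componentId -> screenId
  for (const c of components) {
    const candidates = [
      ...c.preferredKinds.flatMap((k) => screens.filter((s) => s.kind === k)),
      ...screens,
    ];
    const target = candidates.find((s) => s.widthPx >= c.minWidthPx);
    if (target) placement.set(c.id, target.id);
  }
  return placement;
}
```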

We also made a big noise at IBC 2016, where we presented three object-based media projects on our stand.

The BBC R&D stand at IBC 2016.

The first is a film which changes based on the person who is watching it. Rather than drawing on sensor data to profile the environment, it focuses on the user themselves. It uses data from a phone application to build a profile of the user and their preferences, via their music collection and some personality questions. That profile is then used to decide which assets are used, in which order, what real-time effects are applied, and ultimately when. Cinematic effects twist the story one way or another.
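
As a rough illustration of the approach, the sketch below maps a simple user profile, built from music tastes and personality answers, to edit decisions: which variant of each scene to use and which grade to apply. All of the types, genre checks and thresholds here are hypothetical, not the film's actual logic.

```typescript
// Hypothetical sketch: mapping a user profile to edit decisions.
// None of these types, genres or thresholds come from the real system.

interface UserProfile {
  favouriteGenres: string[]; // derived from the user's music collection
  openness: number;          // 0..1, from personality questions
  extraversion: number;      // 0..1, from personality questions
}

interface SceneVariant {
  sceneId: string;
  mood: "calm" | "tense";
  grade: "warm" | "cold";
}

interface EditDecision {
  sceneId: string;
  variant: SceneVariant;
}

// Pick one variant per scene based on the viewer's profile.
function buildEditDecisionList(
  profile: UserProfile,
  variantsByScene: Map<string, SceneVariant[]>
): EditDecision[] {
  const prefersTense =
    profile.openness > 0.6 || profile.favouriteGenres.includes("metal");
  const prefersWarm = profile.extraversion > 0.5;

  const decisions: EditDecision[] = [];
  for (const [sceneId, variants] of variantsByScene) {
    // Choose the variant whose mood and grade best match the profile,
    // falling back to the first available variant.
    const chosen =
      variants.find(
        (v) =>
          v.mood === (prefersTense ? "tense" : "calm") &&
          v.grade === (prefersWarm ? "warm" : "cold")
      ) ?? variants[0];
    decisions.push({ sceneId, variant: chosen });
  }
  return decisions;
}
```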

Squeezebox addresses the production of captioned montages of news stories. It allows users to specify how many news stories they'd like, caption and order those stories, and add ident and transition graphics. Squeezebox then enables users to adjust the duration of the story using a simple slider control. The purpose-built algorithm establishes new in and out points for each shot and, in some cases, drops shots entirely.

An illustration of the editing concept when using the Squeezebox tool.
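
To make the retiming idea concrete, here is a minimal sketch of the kind of logic such a tool might use, assuming each shot carries a preferred duration, a minimum duration and a priority (an illustrative approximation, not the actual Squeezebox algorithm): the target duration from the slider decides how far each shot is trimmed towards its minimum, and whether low-priority shots get dropped.

```typescript
// Illustrative approximation only, not the real Squeezebox algorithm.

interface Shot {
  id: string;
  preferred: number; // preferred duration in seconds
  minimum: number;   // shortest acceptable duration in seconds
  priority: number;  // higher means more important to keep
}

interface TimedShot {
  id: string;
  duration: number;  // new duration after retiming, in seconds
}

// Fit a montage into a target duration by trimming shots towards their
// minimum length and, if that still is not enough, dropping the
// lowest-priority shots entirely.
function fitToDuration(shots: Shot[], target: number): TimedShot[] {
  const byPriority = [...shots].sort((a, b) => a.priority - b.priority);
  const dropped = new Set<string>();

  for (;;) {
    const kept = shots.filter((s) => !dropped.has(s.id)); // keeps story order
    if (kept.length === 0) return [];

    const preferredTotal = kept.reduce((sum, s) => sum + s.preferred, 0);
    const minimumTotal = kept.reduce((sum, s) => sum + s.minimum, 0);

    if (minimumTotal <= target) {
      // Trim every shot proportionally between its minimum and preferred
      // duration so the whole montage lands on the target.
      const slack = preferredTotal - minimumTotal;
      const usable = Math.min(Math.max(target - minimumTotal, 0), slack);
      const ratio = slack > 0 ? usable / slack : 0;
      return kept.map((s) => ({
        id: s.id,
        duration: s.minimum + (s.preferred - s.minimum) * ratio,
      }));
    }

    // Even at minimum length the montage is too long: drop the next
    // lowest-priority shot still in the edit and try again.
    const next = byPriority.find((s) => !dropped.has(s.id));
    if (!next) return [];
    dropped.add(next.id);
  }
}
```
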
CAKE is a new object-based media experiment that customises recipes based on your familiarity with ingredients and methods, your tastes or dietary preferences, and how many people you're inviting round for dinner. It lets you create new dishes at your own pace: novices can level up and experts can cut to the chase, supported by an evolving dialogue between audience and presenter.

The Cook Along Kitchen Experience
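
As a loose sketch of how an object-based recipe might adapt itself (a hypothetical data model, not CAKE's real one): quantities scale with the number of guests, ingredients you'd rather avoid are filtered out, and each step carries both a detailed explanation for novices and a terse version for confident cooks.

```typescript
// Hypothetical data model for an adaptive recipe, not CAKE's actual one.

type SkillLevel = "novice" | "confident" | "expert";

interface Ingredient {
  name: string;
  quantity: number; // amount for the base number of servings
  unit: string;
}

interface Step {
  terse: string;    // quick instruction for experienced cooks
  detailed: string; // fuller explanation with technique tips
  optionalForExperts?: boolean;
}

interface Recipe {
  baseServings: number;
  ingredients: Ingredient[];
  steps: Step[];
}

interface Viewer {
  servings: number;
  skill: SkillLevel;
  avoid: string[]; // ingredients to leave out, e.g. for dietary preferences
}

// Produce a personalised version of the recipe for this viewer.
function personalise(recipe: Recipe, viewer: Viewer) {
  const scale = viewer.servings / recipe.baseServings;

  const ingredients = recipe.ingredients
    .filter((i) => !viewer.avoid.includes(i.name))
    .map((i) => ({ ...i, quantity: i.quantity * scale }));

  const steps = recipe.steps
    // Experts can skip steps that only exist to explain basic technique.
    .filter((s) => !(viewer.skill === "expert" && s.optionalForExperts))
    // Novices get the detailed wording, everyone else the terse one.
    .map((s) => (viewer.skill === "novice" ? s.detailed : s.terse));

  return { ingredients, steps };
}
```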

The three projects together formed a strong object-based message, with some real impact: CAKE picked up coverage of its own, and our paper was one of the top 8 technical papers at the conference.

Towards the end of the year, we spent a lot of time debating and designing an object-based media toolkit. We'll use the toolkit for all of our object-based media projects, and we want to share it too, with both creators and coders, to create a community of practice. Exciting stuff, which we'll be talking more about in 2017.
