Advances in real-time rendering and delivery technologies pioneered in the games and visual effects industries offer broadcasters the opportunity to deliver completely personal experiences.
Project from 2019 - present
Why does it matter?
Advances in the technologies used in visual effects, games and virtual worlds give audiences experiences that are personal, responsive, interactive and immersive.
Audiences now enjoy media on an ever-increasing range of devices. Every year these devices gain new capabilities, and many can now play object-based, virtual or augmented reality experiences with ease.
The Internet is also becoming the primary means for distributing the content that people experience. Audiences expect their media to be delivered wherever they are, tailored to the moment they are in and the device they are using.
In broadcasting, we are used to delivering one fixed, linear experience to many people at once. Yet there is an argument that giving more freedom and flexibility to media would allow for more personal experiences. BBC Research & Development have been pursuing this as a research priority: object-based media. The hard question is how we do this at scale, for very large audiences.
Can we transform broadcasting to take advantage of new technologies, new media forms and growing audience expectations?
What is the Challenge?
Technology is transforming the craft and tools used to make media and the devices we use to consume it. The experiences people have are no longer just audio-video files delivered to a playback appliance. Delivering even "simple" linear media uses combinations of several media objects, execution logic and intelligence to stream and adapt to a wide variety of devices, capabilities and user preferences. We anticipate that, before long, all media experiences will be created and delivered more like packages of interactive software than traditional file-based broadcast media.
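To make the idea concrete, here is a rough sketch of what such a package of media objects plus logic could look like. The field names and rules below are illustrative assumptions for this article, not an existing BBC format.

```typescript
// Illustrative only: one way to describe a "package" of media objects plus the
// logic that decides which of them a given device or user should receive.

interface MediaObject {
  id: string;
  type: "video" | "audio" | "subtitle" | "3d-scene" | "interactive";
  url: string;
}

interface ExperiencePackage {
  objects: MediaObject[];
  // Simple declarative rules selecting objects for a given context.
  rules: Array<{ when: { device?: string; preference?: string }; use: string[] }>;
}

const example: ExperiencePackage = {
  objects: [
    { id: "main-av", type: "video", url: "https://example.org/main.mpd" },
    { id: "audio-described", type: "audio", url: "https://example.org/ad.mp4" },
    { id: "explorable-scene", type: "3d-scene", url: "https://example.org/scene.glb" },
  ],
  rules: [
    { when: { preference: "audio-description" }, use: ["main-av", "audio-described"] },
    { when: { device: "headset" }, use: ["explorable-scene"] },
  ],
};
```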
The BBC has previously focused on audio and video distribution services over the Internet. New technologies now offer the opportunity for these services to evolve and cater to new audience behaviours.
Imagine a BBC Sounds or iPlayer of the future. To deliver multimedia experiences such as these, services would still need to play audio and video as we do now, yet also:
- in even higher quality and fidelity,
- play other content types including object-based experiences,
- support a growing range of devices,
- tailor and personalise the content appropriately to the device and the user.
All delivered from a personalised playback interface and ideally created as one software codebase.
How might we achieve this? Is it still possible to broadcast to millions, yet tailor that experience to be unique to each individual? Could we move from a one-to-many, broadcast style of media to a more personal, one-to-one style? Can we deliver many content types and formats to any device, new or old? Is this achievable in a single ecosystem that scales without increasing cost? Can we embrace the same trend towards streaming interactive content that cloud gaming services are pursuing? Are there new approaches beyond interactive video?
We are taking these questions and more into account in our work.
How Does it Work?
We are taking inspiration from media that is already software-based - immersive and interactive entertainment such as video games - as well as modern approaches to software distribution over the Internet. The approaches used to code and deliver these experiences are constantly developing and they provide some of the solutions to the challenges of at-scale responsive and personal media.
For example, consider cloud game streaming. To reach more customers with high-quality game experiences, providers such as Google are looking to leverage high-speed internet and the video streaming backbone that already delivers video at scale. The aim is to remove the requirement for a powerful console and deliver to people on any device with a web browser. They do this by running the game on a cloud server and outputting it as video, which is streamed to the much less powerful device. The device in turn sends data back to the game (e.g. from a gamepad). Similar technology could be used to deliver interactive scenes through a future version of a service like iPlayer and allow viewers on a phone or smart TV to explore BBC content in more depth.
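A minimal sketch of the client side of such a session is below: the remotely rendered scene arrives as an ordinary video stream, and user input goes back over a data channel. The connection set-up (signalling) is omitted, and the input message shape is an assumption for illustration, not a real BBC or cloud gaming API.

```typescript
// Browser-side sketch: play cloud-rendered frames like any video, send input back.

const video = document.createElement("video");
video.autoplay = true;
document.body.appendChild(video);

// 1. Receive the remotely rendered scene as a normal video stream (WebRTC here;
//    establishing the connection via a signalling server is omitted).
const peer = new RTCPeerConnection();
peer.ontrack = (event) => {
  video.srcObject = event.streams[0];
};

// 2. Send user input back to the cloud renderer over a lightweight data channel.
const input = peer.createDataChannel("input");
window.addEventListener("pointerdown", (e) => {
  if (input.readyState === "open") {
    input.send(JSON.stringify({ type: "tap", x: e.clientX, y: e.clientY, t: Date.now() }));
  }
});
```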
Remote game streaming, in itself, is not enough. We are looking into services that adapt to the computing capability of the user's device and supplement it with computing power from remote servers. Newer techniques mean rendering and compositing work could be split more sensibly between device and cloud: relying on high-bandwidth video streaming alone for interactivity is not a future-fit solution.
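One way a client might make that local-versus-remote decision is sketched below. The capability probe and the thresholds are illustrative assumptions only; a real service would weigh many more signals.

```typescript
// Rough sketch: decide where to render based on a crude device capability score.

type RenderMode = "local" | "remote" | "hybrid";

function probeCapability(): number {
  // Very rough proxy: logical CPU cores plus reported device memory in GB
  // (deviceMemory is a non-standard browser property, hence the cast).
  const cores = navigator.hardwareConcurrency ?? 2;
  const memoryGb = (navigator as any).deviceMemory ?? 2;
  return cores + memoryGb;
}

function chooseRenderMode(bandwidthMbps: number): RenderMode {
  const score = probeCapability();
  if (score >= 12) return "local";           // capable device: run the experience on-device
  if (bandwidthMbps >= 20) return "remote";  // constrained device, good network: stream video
  return "hybrid";                           // otherwise split work between device and cloud
}
```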
On top of this, we also want to reduce the amount of software that has to be written to target all these devices and adapt to users and context responsively, so we are looking at new approaches that let us "write once and run everywhere".
What are we Doing?
We are already researching and developing the means for the BBC to author, distribute and play back these new software experiences. We plan to use what we learn to propose new media standards for how to make, distribute, play and archive them. We are embracing existing workflows and tools to develop a solution that fits with existing production practices.
Our team has been working on a series of technical tests and demonstrators to explore what a new broadcasting system for these experiences must support. We want the new system to be low-cost as well as simple to develop and grow. It must be climate-friendly and sustainable, offer scalable distribution of new content types to millions of devices and people, and be easy to extend.
We have developed a "Single Service Player" (sketched in outline after this list) that:
- switches seamlessly between video and interactive experiences written in game engines (like virtual reality) or our very own StoryKit, used to make BBC Click's recent interactive episode;
- switches between remotely streamed and locally rendered content according to the capability of the device;
- optimises performance for the target device using "write-once run-anywhere" techniques from video game development that we have adopted;
- and plays all your favourite BBC content too.
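In outline, the switching behaviour could look something like this. The content types and the three playback stubs are hypothetical stand-ins used only to illustrate the dispatch, not the player's real API.

```typescript
// Hypothetical dispatch at the heart of a single, multi-format player.

type ContentItem =
  | { kind: "video"; manifestUrl: string }      // conventional adaptive streaming
  | { kind: "interactive"; bundleUrl: string }  // e.g. a StoryKit or game-engine bundle
  | { kind: "remote"; sessionUrl: string };     // cloud-rendered, streamed back as video

// Stand-ins for real playback modules.
async function playVideo(url: string) { console.log("adaptive streaming from", url); }
async function runLocalExperience(url: string) { console.log("running bundle on-device", url); }
async function attachRemoteSession(url: string) { console.log("joining remote render session", url); }

// The caller asks for an item; whether it ends up decoded, executed locally or
// streamed from a remote renderer is hidden behind one interface.
async function play(item: ContentItem): Promise<void> {
  switch (item.kind) {
    case "video": return playVideo(item.manifestUrl);
    case "interactive": return runLocalExperience(item.bundleUrl);
    case "remote": return attachRemoteSession(item.sessionUrl);
  }
}
```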
A version of the player is now in use by other BBC R&D teams to test it on a range of new content and service experiments.
- BBC R&D - Creating a Personalised and Interactive BBC Click with StoryKit
- BBC Taster - Try Click's 1000th Interactive Episode
- BBC R&D - Object-Based Media Toolkit
- BBC R&D - 5G Trials - Streaming AR and VR Experiences on Mobile
It was used to evaluate the benefits of game streaming in a public trial as part of our "Smart Tourism" project in 2018. The project recreated the Roman Baths in a game engine for an augmented reality experience. Using a 5G mobile phone network testbed in Bath, we ran the rendering jobs in the cloud and streamed the ancient artefacts to devices. Users were able to navigate through the simulation on mobile phones, sending data back to the cloud to remotely draw a 3D historical recreation of the scene based on the phone's location and orientation.
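The data flowing up from the phone in a trial like this is small: essentially a stream of pose updates that the cloud renderer uses to draw the next frames. A sketch of such a message, with an assumed (not actual) shape, is below.

```typescript
// Illustrative pose update a phone might send to a cloud renderer over a WebSocket.

interface PoseUpdate {
  lat: number;       // device location
  lon: number;
  heading: number;   // compass bearing in degrees
  pitch: number;     // device tilt in degrees
  timestamp: number; // ms since epoch, for ordering and latency measurement
}

function sendPose(socket: WebSocket, pose: PoseUpdate): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(pose)); // the cloud renders the 3D reconstruction for this view
  }
}
```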
We are conducting further tests to determine the limit of acceptable quality loss in remote game streaming. By integrating remote game streaming into our multi-format player, we expect experiences to play on both high-powered and low-end devices without sacrificing much of their richness. This meets our aim of allowing all our audiences to access our services on whichever device they have, at the best quality possible.
Our next goal is to develop examples of experiences that adapt to the capabilities of devices by making smart decisions about where to run the code, locally or remotely, and to ensure this approach scales.
- BBC R&D - Object-Based iPlayer - Our Remote Experience Streaming Tests
- BBC R&D - Where Next For Interactive Stories?
- BBC R&D - Storytelling of the Future
- BBC R&D - StoryFormer: Building the Next Generation of Storytelling
- BBC News - Click 1,000: How the pick-your-own-path episode was made
- BBC R&D - How we Made the Make-Along