BBC

Research & Development

Posted by Lauren Ward, Jack Reynolds and Matthew Paradis

This week marks the release of a very special episode of Casualty. The episode focuses on Jade, a series regular, who is deaf and wears hearing aids. Through scenes from Jade's audio perspective, we introduce viewers to her unique way of experiencing the world – termed 'Jade's World'. The story draws directly from the lived experiences of the writers (Charlie Swinbourne and Sophie Woolley), the director (John Maidens) and members of the Casualty cast. It also explores how the way we hear the world is in constant flux, day to day, moment to moment, based on how we feel or what life throws at us.

Before this episode was even written, the Casualty team invited BBC Research & Development to collaborate. The aim was to find ways we could support the storytelling through our audio expertise and the technologies we have developed. And importantly, to help make the episode as accessible as possible.

Hearing loss is not simple. It doesn't just make the sounds an individual hears quieter; it can change their tone, add noise, blur speech sounds together and make it hard to differentiate one sound from another. Hearing aids, while useful devices, also alter the way an individual perceives the world (and not always for the better). To create the most realistic version of Jade's world possible, we needed to bring together two things: what we understand about hearing loss as researchers, and individuals' lived experiences of hearing loss in the real world.

To achieve this, we used a variety of techniques to create 'Jade's World' and convey it to the audience, including binaural recordings, hearing aid simulators, actual hearing aids and sound design. We have been collaborating with Casualty on and off set, during all stages of the episode's production, to introduce them to these new recording techniques, consult on sound design and produce additional content for the programme. We have then backed this up with a suite of accessibility features, both standard, like subtitling and audio description, and new techniques still in development, like personalised, accessible audio.

Binaural Audio

The binaural 'head' used for recording, placed on top of a camera while filming a scene from Casualty on set.

We used several binaural techniques, including our binaural head, which has a microphone fitted in each of its artificial ears. Microphones in a TV studio usually only pick up sounds from a particular direction, whereas the microphones in the head are omnidirectional, similar to human ears. The solid head sitting between the two microphones causes a difference in the level of the sound heard at each ear, and in the time it takes to reach each ear. This is one of the ways humans work out what direction a sound is coming from - if a sound comes from the left, it will be louder in the left ear and arrive there slightly earlier.
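
To give a feel for how those two cues work together, here is a minimal sketch (in Python, and nothing like the production chain used on the episode) that places a mono sound to one side by delaying and attenuating the signal for the far ear. The sample rate, head radius and level difference are assumptions chosen purely for illustration.

```python
# A minimal sketch of the two main binaural cues described above: interaural
# time difference (ITD) and interaural level difference (ILD). A real dummy
# head also filters the sound spectrally (HRTFs); here we only delay and
# attenuate the far ear. All constants are illustrative assumptions.
import numpy as np

SAMPLE_RATE = 48_000  # Hz, assumed

def pan_binaural(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Return an (N, 2) stereo array approximating a source at azimuth_deg
    (negative = left, positive = right) using simple ITD/ILD cues."""
    azimuth = np.radians(azimuth_deg)
    # Woodworth-style ITD for a roughly 8.75 cm head radius.
    itd_seconds = 0.0875 / 343.0 * abs(azimuth + np.sin(azimuth))
    delay_samples = int(round(itd_seconds * SAMPLE_RATE))
    # Crude ILD: the far ear gets up to ~6 dB quieter as the source moves off-centre.
    far_gain = 10 ** (-6.0 * abs(np.sin(azimuth)) / 20)

    near = mono
    far = np.concatenate([np.zeros(delay_samples), mono])[: len(mono)] * far_gain
    # A sound from the left arrives earlier and louder at the left ear.
    return np.stack([near, far] if azimuth_deg < 0 else [far, near], axis=1)

# Example: a 1 kHz tone placed 60 degrees to the listener's left.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
stereo = pan_binaural(np.sin(2 * np.pi * 1000 * t), azimuth_deg=-60)
```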

Another layer of realism was added by using artificial 3D reverberation software to produce a sense of distance. One of the cues we use to judge how far away a sound source is, is the amount of reverberation - the secondary reflections of the sound from surrounding surfaces - compared to the direct sound that arrives at our ears first. More reverberation is generally associated with a sound source being further away.
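
As a rough illustration of that direct-to-reverberant relationship (and not the 3D reverberation software we actually used), the sketch below keeps the reverberant level roughly fixed while the direct level falls away with distance. The impulse response is just decaying noise, and every constant is an assumption.

```python
# Toy illustration of distance cueing via the direct-to-reverberant ratio:
# the direct sound falls off with distance while the diffuse reverberation
# stays roughly constant, so farther sources sound "wetter".
import numpy as np

SAMPLE_RATE = 48_000  # Hz, assumed

def synthetic_reverb_tail(seconds: float = 1.2, rt60: float = 0.8) -> np.ndarray:
    """Exponentially decaying noise as a stand-in for a room impulse response."""
    t = np.arange(int(seconds * SAMPLE_RATE)) / SAMPLE_RATE
    return np.random.randn(len(t)) * 10 ** (-3.0 * t / rt60)  # -60 dB at rt60

def place_at_distance(dry: np.ndarray, distance_m: float,
                      reference_m: float = 1.0) -> np.ndarray:
    """Mix dry and reverberant copies so that doubling the distance halves the
    direct level (inverse-distance law) but leaves the reverb level alone."""
    tail = synthetic_reverb_tail()
    wet = np.convolve(dry, tail)[: len(dry)]
    wet /= np.max(np.abs(wet)) + 1e-12          # normalise the reverb level
    direct_gain = reference_m / max(distance_m, reference_m)
    return direct_gain * dry + 0.2 * wet        # 0.2: arbitrary room "wetness"

# Example: the same clap rendered nearby and far away.
clap = np.random.randn(2000) * np.hanning(2000)
near = place_at_distance(clap, distance_m=1.0)
far = place_at_distance(clap, distance_m=8.0)
```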

We also created a short binaural piece for social media, which had a further stage of processing applied to the binaural mix using a hearing loss simulation toolkit. The toolkit has functions for simulating hearing loss and impairment, and can also simulate modern hearing aids, applying various processes to the input to increase the audibility of external sounds.

Hearing Aids

Our binaural head wearing a digital hearing aid.

To add to the realism of Jade's World, we fitted our binaural head with hearing aids similar to Jade's own to capture the effects the devices had on all the dialogue Jade hears or speaks.

These hearing aids were provided and programmed to the settings that Jade would likely have (hearing aids are uniquely tailored to their wearer's hearing profile). Digital hearing aids have improved a lot over the last decade, but they still sound different to how unaided ears capture sound.

Jade's hearing loss is more severe at the higher frequencies. This requires her hearing aids to boost the volume of higher-pitched sounds more than lower-pitched sounds. In this episode, you will hear higher-pitched sounds as far more processed than lower-pitched ones, illustrating this boosting effect.
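
As a simplified sketch of that kind of frequency-dependent amplification (a real hearing aid uses many bands plus compression, and the crossover frequency and gain figures here are assumptions), the following splits a signal at 2 kHz and applies more gain to the upper band.

```python
# Simplified frequency-dependent amplification: boost high frequencies more
# than low ones, as a hearing aid fitted for a high-frequency loss would.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 48_000  # Hz, assumed

def boost_highs(signal: np.ndarray, low_gain_db: float = 3.0,
                high_gain_db: float = 18.0,
                crossover_hz: float = 2000.0) -> np.ndarray:
    """Apply more gain above the crossover frequency than below it."""
    low_sos = butter(4, crossover_hz, btype="lowpass", fs=SAMPLE_RATE, output="sos")
    high_sos = butter(4, crossover_hz, btype="highpass", fs=SAMPLE_RATE, output="sos")
    low = sosfilt(low_sos, signal) * 10 ** (low_gain_db / 20)
    high = sosfilt(high_sos, signal) * 10 ** (high_gain_db / 20)
    return low + high

# Example: a second of noise gets noticeably brighter after the "aided" boost.
noise = np.random.randn(SAMPLE_RATE)
aided = boost_highs(noise)
```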

We began recording using the hearing aids in early April, which, because of the Coronavirus lockdown, meant we were all working from home. This posed a challenge - instead of recording in our nice, quiet listening room at the BBC in Salford we had to make do with a haphazard listening hallway in my flat. Cue dismantling my couch cushions, assembling every blanket and duvet we owned and covering the walls with these soft furnishings to make the space soak up as much of the sound and its reflections as possible. Think of it as a scientific pillow fort.

Lauren's work-from-home setup for recording dialogue through the binaural head and hearing aids.

Unaided hearing

In the episode, we get to hear the world from Jade's perspective both when she is wearing her hearing aids and when she has them out.

To simulate her unaided hearing, we used a hearing-loss simulator. These are often developed as research tools to explore the perceptual and physiological effects of hearing loss. While it is impossible to capture every unique aspect of a person's hearing, they can effectively represent some of the key aspects of hearing loss.

A still from the episode, shot of Jade's hearing aid in a glass jar.

The hearing-loss simulator we used was developed through a collaboration with BBC R&D. It not only replicates the reduction in the volume of sounds but also other effects of hearing loss. Two such effects are loudness recruitment (a reduced dynamic range) and spectral smearing (where different tones blur together and become hard to tell apart).
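
To make those two terms a little more concrete, here is a toy illustration of both effects (in Python, and not the research simulator used on the episode). Loudness recruitment is modelled as an expansion of the signal envelope, and spectral smearing as a blur across the frequencies of each short-time spectrum; the frame sizes, exponents and blur widths are arbitrary assumptions.

```python
# Toy versions of two hearing loss effects: loudness recruitment (quiet sounds
# drop away while loud sounds stay loud, shrinking the usable dynamic range)
# and spectral smearing (neighbouring frequencies run together).
import numpy as np
from scipy.signal import stft, istft, hilbert

SAMPLE_RATE = 16_000  # Hz, assumed

def recruit(signal: np.ndarray, exponent: float = 2.0) -> np.ndarray:
    """Expand the envelope: soft passages become much softer relative to loud ones."""
    envelope = np.abs(hilbert(signal)) + 1e-12
    expanded = (envelope / envelope.max()) ** exponent * envelope.max()
    return signal * (expanded / envelope)

def smear(signal: np.ndarray, blur_bins: int = 8) -> np.ndarray:
    """Blur the magnitude spectrum of each frame while keeping the phase."""
    _, _, spec = stft(signal, fs=SAMPLE_RATE, nperseg=512)
    kernel = np.hanning(2 * blur_bins + 1)
    kernel /= kernel.sum()
    blurred = np.apply_along_axis(
        lambda m: np.convolve(m, kernel, mode="same"), 0, np.abs(spec))
    _, out = istft(blurred * np.exp(1j * np.angle(spec)), fs=SAMPLE_RATE, nperseg=512)
    return out[: len(signal)]

# Example: a quiet-then-loud tone burst, passed through both stages.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t) * np.where(t < 0.5, 0.05, 0.8)
simulated = smear(recruit(tone))
```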

Sound design

We know a lot about hearing loss, but there's even more that we don't yet understand. To convey Jade's tinnitus and hyperacusis, we had to get creative with the sound and visual effects.

Hyperacusis is heightened sensitivity to sound. People with hyperacusis often find everyday sounds, like telephones or slamming doors, uncomfortably loud and painful. Hyperacusis can occur with or without hearing loss. Jade's hyperacusis is often triggered by heavy hospital doors slamming, among other sounds. This is conveyed through emphasised sound and blurred visual effects (see the image below), to express how she is being affected.

Still from the episode of Jade experiencing hyperacusis, emphasised by a blurred visual effect.

Tinnitus is a condition where you hear ringing or buzzing in your ears without an outside source. What this sounds like, and how loud it is, differs from person to person. Stress and tiredness also play a significant role in how severe an individual's tinnitus is at any one moment. During this episode, you'll notice that Jade's tinnitus comes and goes, becoming more prominent as she becomes more stressed and emotional.

Accessibility

BBC R&D has been researching ways to exploit new technology to improve the accessibility of programmes. The approach used here, personalised, accessible audio, was initially developed as part of an earlier collaboration.

This approach allows the user to adjust the audio balance with a slider at the bottom of the media player. Selecting 'TV Mix' plays the same mix as heard on TV, while choosing 'Accessible Mix' enhances the dialogue and makes some of the other sounds quieter. You can adjust between these two extremes to get the right balance of dialogue, important sounds and atmosphere for you.
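
The sketch below shows the basic idea behind that control (the real player works with object-based audio delivered alongside the programme; the stems, gain figures and function name here are purely illustrative): the dialogue and the rest of the mix are kept as separate stems, and the slider position blends between the broadcast balance and a dialogue-forward one.

```python
# Illustrative sketch of a personalised audio balance control: crossfade
# between the broadcast mix and a dialogue-forward mix using separate stems.
import numpy as np

def render_mix(dialogue: np.ndarray, background: np.ndarray, slider: float) -> np.ndarray:
    """slider = 0.0 -> 'TV Mix' (broadcast balance),
       slider = 1.0 -> 'Accessible Mix' (dialogue raised, background reduced);
       values in between blend the two. The gain figures are illustrative."""
    slider = float(np.clip(slider, 0.0, 1.0))
    dialogue_gain = 1.0 + 0.5 * slider      # up to roughly +3.5 dB on dialogue
    background_gain = 1.0 - 0.75 * slider   # down to roughly -12 dB on the rest
    return dialogue_gain * dialogue + background_gain * background

# Example: halfway between the two mixes.
dialogue = np.random.randn(48_000) * 0.1
background = np.random.randn(48_000) * 0.1
halfway = render_mix(dialogue, background, slider=0.5)
```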

BBC Standard Media Player with the personalised, accessible audio slider control set to 'Accessible Mix'.

Casualty was involved in the first public trial of this technology last year. 6,228 people tried it, and 83.6% of them told us that it made a difference. Specifically, 73.7% said it made the content more enjoyable or easier to understand.

Working on this episode of Casualty has allowed us to learn more about how to create this kind of content and how to integrate it into current production workflows. We'll also learn more about how people like to use the control while watching content, allowing us to refine how it functions.

Acknowledgements:
Dr. Lauren Ward, Dr. Matthew Paradis and Jack Reynolds (BBC R&D Engineers)
Dafydd Llewelyn (Casualty Producer)
Loretta Preece (Casualty Series Producer)
John Maidens (Director, Casualty Episode 36)
And the Casualty Post-Production team: Lou Prendergast, Laura Russon, Ravi Gurnam and Olivia Waltho


BBC R&D - Casualty, Loud and Clear - Our Accessible and Enhanced Audio Trial

BBC Taster - Casualty: A&E Audio

BBC R&D - 5 live Football Experiment: What We Learned


Immersive Audio Training and Skills from the BBC Academy including:

Sound Bites - An Immersive Masterclass

Sounds Amazing - audio gurus share tips

This post is part of the Immersive and Interactive Content section
