White Paper WHP 395: Accessible object-based audio using hierarchical narrative importance metadata
Lauren Ward, Ben Shirley, Jon Francombe
Designing a prototype metadata-driven tool to help listeners with hearing impairments personalise audio mixes for programmes.
White paper published Jan 2021

White Paper WHP 391: The Impact of New Forms of Media on Production Tools and Practices
Lauren Ward, Maxine Glancy, Sally Bowman, Michael Armstrong
Surveying production teams' perceptions of personalisable media and its integration into existing workflows.
White paper published Sep 2020

White Paper WHP 390: Object-Based Media: An Overview Of The User Experience
Maxine Glancy, Lauren Ward, Nick Hanson, Andy Brown, Michael Armstrong
Examining the range of user experiences across several BBC R&D object-based media productions, drawn from audience responses.
White paper published Sep 2020

White Paper WHP 385: Investigating user interface preferences for controlling background-foreground balance on connected TVs
Lawrence Pardoe, Lauren Ward, Hannah Clawson, Aimee Moulson, Chris Pike
Exploring interface types and controls for object-based audio experiences on connected TVs.
White paper published Aug 2020

Creating Jade's World
Using binaural recordings, hearing aid simulators, actual hearing aids, and sound design to create a special episode of Casualty.
Blog post on 10 Jul 2020

PhD Thesis: Improving broadcast accessibility for hard of hearing individuals: using object-based audio personalisation and narrative importance
Lauren Ward
Examining the use of object-based audio and narrative importance in creating personalisable audio mixes for those with hearing impairments.
Research paper published Jun 2020

DAFx 2019: Modelling Experts' Decisions On Assigning Narrative Importances Of Objects In A Radio Drama Mix
Emmanouil Theofanis Chourdakis, Lauren Ward, Matthew Paradis, Joshua D. Reiss (Queen Mary, University of London)
Automating the creation of customisable object-based audio mixes in which users can attenuate parts of the mix with a simple complexity parameter.
Research paper published Sep 2019

Interspeech 2019: R2SPIN: Re-Recording the Revised Speech Perception in Noise Test
Lauren Ward, Catherine Robinson, Matthew Paradis, Katherine M. Tucker, Ben Shirley
Updating the R2SPIN test and outlining a new methodology for re-recording legacy material to ensure its future usability.
Research paper published Sep 2019