Hearing impairment affects around one in six people in the UK. When it comes to watching television, that can make speech sound jumbled, while competing sounds can make listening far from comfortable.
BBC One’s long-running hospital drama Casualty will, on 11 July, feature a character who has a hearing impairment. In addition to a thoughtful and informative storyline, the episode will allow viewers with hearing difficulties to watch the programme on an internet-connected device with the benefit of object-based media.
But just what is object-based media?
“It is a system that provides viewers with personalised audio content via the internet,” explains Lauren Ward. She has been working with the BBC’s Research and Development department while pursuing her PhD at Salford University. “With such a large proportion of the population experiencing hearing problems, I believe there’s a responsibility for everyone in the broadcast chain to try to make content as accessible as possible.”
She continues, “As part of my PhD work, I have been able to study how new broadcast technology can be used to create a better experience for a broad range of people. In fact, object-based media is a method of creating an audio mix that also helps people who may not have hearing difficulties to appreciate audio content in situations where there is high background noise; for example, on public transport or with children running around. So it can benefit a whole range of viewers.”
Put briefly, the process involves ranking the audio elements of each scene in the programme in order of importance and ensuring the vital portions are easy to hear. In that way, the viewer can interact with what is being shown on screen. Dialogue is, of course, at the top of the list, but in a dramatic setting, such as the hospital in Casualty, elements such as a heart monitor bleeping (or not) would also be a high priority. Everyday background noise would be near the bottom of the list. Each of these elements is an ‘object’ in object-based media.
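The idea of importance-ranked objects can be sketched in a few lines of code. This is a minimal illustration, not the BBC's actual system: the object names, importance values, and the simple attenuation formula are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class AudioObject:
    name: str
    importance: float  # 1.0 = essential (dialogue), 0.0 = pure background

# Hypothetical objects for a hospital scene, ranked by importance
scene = [
    AudioObject("dialogue", 1.0),
    AudioObject("heart_monitor", 0.8),
    AudioObject("corridor_chatter", 0.2),
]

def personalised_gains(objects, accessibility):
    """Map an accessibility setting to a per-object gain.

    accessibility 0.0 = full broadcast mix, 1.0 = essentials only.
    Low-importance objects are attenuated as the setting rises,
    while high-importance objects stay close to full level.
    """
    return {
        obj.name: min(1.0, obj.importance + (1.0 - accessibility))
        for obj in objects
    }

print(personalised_gains(scene, 0.0))  # all objects at full level
print(personalised_gains(scene, 1.0))  # background reduced to its importance
```

With this scheme, a viewer who needs maximum clarity still hears the dialogue and the heart monitor near full level, while the corridor chatter drops away.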
“Of course, immersive sound has been around for some time and is starting to be used more widely in sporting broadcasts to make the viewer feel part of the crowd,” says Ward. “But what we are doing here is going beyond just separating the commentary from the crowd – which is simply a binary application – and providing a much higher level of personalisation and complexity in situations where there is a dramatic soundscape.”
With all of that in mind, what does the workflow for such a production look like? According to Ward, it has less to do with technique and more with good practice. “It relies on each part of the sound being recorded as cleanly as possible, without any spill between tracks. We talked about the ‘objects’ earlier; during post-production these objects are tagged with metadata. This metadata describes different aspects of each object, allowing how it is reproduced at the point of service to be varied, and it is what enables the viewer to personalise the content.”
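The tagging step Ward describes amounts to attaching structured metadata to each object so the player can vary its reproduction. A rough sketch of what such metadata might look like follows; the field names and values are purely illustrative assumptions, not the BBC's actual schema.

```python
import json

# Hypothetical metadata tags attached to each audio object in
# post-production. Field names are illustrative, not a real standard.
objects = [
    {"id": "dia_01", "category": "dialogue",
     "importance": 1.0, "default_level_db": 0.0},
    {"id": "fx_monitor", "category": "narrative_fx",
     "importance": 0.8, "default_level_db": -6.0},
    {"id": "amb_ward", "category": "ambience",
     "importance": 0.2, "default_level_db": -18.0},
]

# The metadata travels alongside the audio so the player at the
# "point of service" can vary how each object is reproduced.
payload = json.dumps({"scene": "ep36_resus", "objects": objects}, indent=2)
print(payload)
```

In practice, object-based audio metadata is standardised (for example, the Audio Definition Model used in broadcast production), but the principle is the same: the description of each object is carried to the receiver rather than baked into a single fixed mix.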
Although the various tracks are used in both the broadcast mix and the object-based version, they are applied in different ways and at different levels.
Ward says that creating such clean tracks might be something with which the industry will have to grapple as object-based productions, be that for accessible content or immersive sound, become more widespread.
This is the second such programme – the first was also from the Casualty series – with which Ward has been involved, and earlier lessons were learned which made this upcoming episode easier to complete.
“Last time, it took a great deal of time to unpick the mix that had been created for the regular programme and then put it all back together using object-based media techniques of defining importance. In fact, it took more than ten hours of dubbing using Pro Tools. This time, drawing on the experience of the first episode, we were able to complete the whole dubbing process much faster.”
Ward believes that, going forward, producers will need to bear in mind the needs of viewers with hearing difficulties in order to improve their experience. That may require training on how to map out the audio needs and how to integrate them into the normal workflow.
“The production tools likely won’t change dramatically,” Ward says, “but what we will have to change is how we plan and think about post production.”
As already mentioned, this episode of Casualty features an individual with hearing loss. “Around a third of the episode will feature the story of this character. I was able to work very closely with the production team, and we recorded some of the dialogue as it would have been heard by this individual to illustrate the effects which hearing loss and hearing aids have on how a person hears the world. Some of this audio was created by making recordings using a binaural dummy head [pictured], which was wearing hearing aids programmed to settings similar to those the character would have. The aim was to create a soundscape which reflected the aspects of that character’s hearing as realistically as possible.”
For viewers watching on an internet-connected device, a slider is provided which enables them to adjust the balance of the audio to suit their own particular needs. At one end of the slider is the full broadcast mix; moving along it progressively reduces or increases the various elements.
“At the moment, in the UK, we can only really achieve this via the internet. The time will come when this will be available via a conventional television receiver, but that is not here yet. Some countries are ahead of the game and already broadcasting in object-based compatible format. We’re not there yet in the UK but it will eventually come.”
Working with Lauren Ward have been:
- Dr. Matthew Paradis and Jack Reynolds (BBC R&D Engineers)
- Dafydd Llewelyn (Casualty Producer)
- Loretta Preece (Casualty Series Producer)
- John Maidens (Director, Casualty Episode 36)
- And the Casualty Post-Production team: Lou Prendergast, Laura Russon, Ravi Gurnam and Olivia Waltho