Synthetic humans, 5G motion capture, and data distribution among IBC Accelerator Programme 2023 projects

Projects involving synthetic humans for the metaverse, 5G-enabled immersive experiences, sign language accessibility, and metadata-powered content targeting are among the finalists in this year’s IBC Accelerator Media Innovation Programme.

Proof of concept demonstrations of all the projects will be presented live on the Innovation Stage at IBC 2023, taking place at the RAI Amsterdam from 15th-18th September.

The 2023 projects that will be showcased include:

  • The Authenticated Data Standard aims to define a standardised data distribution package to ensure that, as programming is distributed and works its way through the entertainment landscape, content owners can maintain the original quality, messaging and intent. The project will allow them to verify and authenticate their content data and imagery and publish it to third-party sources. It will also build event-driven integrations that allow metadata to pass between systems.
  • Synthetic Humans for the Metaverse employs a range of leading-edge technologies and archive materials to generate photorealistic avatars that can be integrated into a virtual production alongside real guests. To animate the avatars in real time, the project team will employ body motion and cloned voice or original audio, while drawing on artificial intelligence, machine learning, and augmented and virtual reality. The project will also align with a second aim: to build a foundation for broadcasters to create ‘virtual translators’, using avatars and sign language for accessibility services and other functions.
  • Real-Time XR Sport Edge takes 5G XR to the edge to build on innovations in live motion capture and high-speed content delivery. The project aims to broadcast extended reality (XR) sports, including Mixed Martial Arts (MMA) and augmented reality (AR) techno-sports, in an immersive environment with high-end photorealistic graphics, virtual advertising, spatial audio and social interactive audio. The experience will be delivered to 3D worlds and metaverse audiences in real time, at the lowest possible latency, via virtual reality (VR) headsets, computers, mobile devices and over-the-top (OTT) platforms.
  • Connect and Produce Anywhere intends to build a distributed edge and cloud computing system to remotely produce a live sports event. By deploying 5G for connectivity and utilising software, the project aims to make the most efficient use of resources in bandwidth-constrained locations. The aim is to detach software from hardware and deploy a distributed computing architecture between ground and cloud, exploring the benefits, challenges and sustainability potential of such an approach.
  • Responsive Narrative Factory plans to deliver the right narrative for any consumer in real time via a metadata-powered content fast-track. It will demonstrate a new component-based approach to quickly and cost-effectively creating multiple versions of content from a single master, enabling precision targeting of programmes to different demographics, regions or groups, which can be monetised for premium Free Ad-Supported Streaming TV (FAST) advertisers.
  • 5G Motion Capture for Live Performance and Animation explores using 5G to build ultra-low latency networks to support the creation of new immersive audience experiences for those present at a live performance or engaging remotely at another venue. Video ‘illusions’ will be piped from ‘anywhere’ to ‘everywhere’ over appropriate backhaul, using terrestrial or celestial Public Internet. The project also seeks to leverage 5G technology to bring joyful, interactive animated characters to children in hospital wards regardless of location.
  • Gallery Agnostic Live Media Production aims to bring all media productions into the modern day via device-agnostic, gallery-agnostic and hybrid ways of working, proving control of both existing on-prem and cloud devices. Making shows should be gallery and device agnostic, so that the industry can adapt to current budgets, technical possibilities and a variety of circumstances, such as venue or location.
  • The Real-Time Interactive Streaming Personalises Live Experiences project proposes to demonstrate that additional revenues and returns on content and rights investments can be achieved by personalising viewer engagement with live interactive sports streaming, big or small. Among other objectives for a proof of concept, it will explore and create a next-generation sports viewing experience with real-time interactivity, as well as new revenue streams for content providers that have acquired expensive premium rights for entertainment, sports and other live events.

“This year’s Accelerator Programme introduces some exciting new initiatives while also featuring projects building on the tremendous innovations made in previous years,” said IBC Innovation Lead Mark Smith.

“It is great to see the array of dynamic industry pioneers coming together this year to explore the potential for transformative innovation in areas such as connectivity anywhere for remote productions with cloud and edge workflows, synthetic human avatars, immersive sport, metadata-driven content targeting, and live, 5G-enabled motion capture performance art and animation. Each year the programme brings new challenges, and we are seeing a host of household media and entertainment players and technology leaders, as well as younger start-ups stepping up to collaborate and drive forward an array of media R&D solutions.”