
Case study: Enhancing content production with next-generation workflows

How 4K/UltraHD, HDR, and mixed reality workflows are being employed across the production industry, from live sport to virtual production

Technological advancements are transforming the production of live event, broadcast, and cinematic content, enhancing the viewing experience for global audiences. As the content boom endures, a range of real-time, mixed reality, and next-gen technologies are essential for simplifying workflows and powering the creation of the most engaging UltraHD and HDR experiences. How are production teams across industries pushing the envelope?

RES creates visual spectacles via projection mapping

Based out of London, Realtime Environment Systems (RES) specialises in delivering stunning video and projection mapping for museum, concert, and live event projects. In translating client visions to realities, RES founders Dave Green, Trey Harrison, and Mark Calvert found themselves using a similar set of tools for video playback, and saw an opportunity to commercialise their work. They launched HIVE in March 2022 to develop cost-efficient, portable media control solutions for driving projection and LED displays in AV environments. 

HIVE offers four different media players, including PLAYER_3 and PLAYER_4, which support 4K and 8K playback and include an AJA KONA 4 and KONA 5 I/O card, respectively, in addition to an NVIDIA GPU. Systems integrators and venue teams simply connect the media player box to their display technology, and log into HIVE’s web-based application. An intuitive interface then allows users to upload video files for playout directly from a phone, tablet, or computer, and set up and/or edit the playlist or timecode cue list directly from the app.
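The playout workflow described above hinges on timecode cue lists that map points on a timeline to media files. As a conceptual illustration only (this is not HIVE's actual schema or API, and the field names and frame rate are assumptions), a cue list and the timecode arithmetic behind it might look like:

```python
# Hypothetical cue-list structure; field names are illustrative, not HIVE's schema
cue_list = [
    {"timecode": "00:01:30:00", "clip": "main_show_8k.mov", "layer": 1},
    {"timecode": "00:00:00:00", "clip": "intro_4k.mov", "layer": 1},
]

def timecode_to_frames(tc, fps=25):
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Order cues by playout position so the player can step through them
cues = sorted(cue_list, key=lambda c: timecode_to_frames(c["timecode"]))
```

The frame-count conversion assumes a fixed 25 fps non-drop-frame timeline; a production system would also handle other rates and drop-frame timecode.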

Green says: “As the visual bar for video installations continues to grow, keeping media control workflows and costs in check has become in many ways unmanageable, which is where HIVE is providing respite.”

Quidich innovates game-changing camera solutions for live sports broadcasts

A leading Indian broadcast technology provider, Quidich specialises in innovative camera systems for live sporting events, film, and television. Quidich’s founders recognised an opportunity to engage casual cricket fans by integrating more dynamic camera solutions that add new perspectives for viewers. Quidich was the first company in the world to develop a live AR tracking solution on a wireless moving camera, with the Spatio system for drones. The company further innovated with the BuggyQam remote-controlled vehicle, offering fans unique, low-angle shots during cricket matches.

Quidich uses a wide range of AJA Video Systems equipment, including the FS-HDR real-time universal converter and frame synchroniser, to further enhance the audience viewing experience. An essential colour correction and HDR/SDR transform solution, FS-HDR allows Quidich to match the BuggyQam camera output with all system cameras for consistency during live broadcasts. BuggyQam houses a single-sensor camera, whereas other cameras in the production chain use three CCDs, with a separate sensor for filtered red, green, and blue light. FS-HDR enables the team to take the raw formats from these different camera profiles and colour match them to the output cameras.
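Matching one camera's colour response to another is, at its simplest, a matter of fitting a transform between how the two cameras record the same reference colours. As a minimal conceptual sketch only (not FS-HDR's internals; the sample values below are invented), a 3x3 matrix fitted by least squares illustrates the idea:

```python
import numpy as np

# Hypothetical RGB readings of the same colour chart from two cameras
# (values are illustrative, not real measurements)
single_sensor = np.array([
    [0.82, 0.10, 0.12],   # red patch
    [0.11, 0.74, 0.15],   # green patch
    [0.09, 0.12, 0.70],   # blue patch
    [0.50, 0.50, 0.50],   # grey patch
])
reference = np.array([
    [0.80, 0.12, 0.10],
    [0.10, 0.78, 0.12],
    [0.08, 0.10, 0.75],
    [0.50, 0.50, 0.50],
])

# Fit a 3x3 matrix M so that single_sensor @ M approximates reference
M, *_ = np.linalg.lstsq(single_sensor, reference, rcond=None)

# Applying M to every pixel brings the camera close to the reference look
matched = single_sensor @ M
```

Real-world matching adds non-linear pieces (transfer functions, 3D LUTs, HDR/SDR transforms) that a single matrix cannot capture, which is the class of work a hardware unit like FS-HDR performs in real time.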

Quidich chief operating officer Neil Gokhale shares: “FS-HDR is a well-designed, versatile solution, which we’ve been able to take advantage of for real-time color transforms, ingesting multiple feeds in multi-channel mode, and many other use cases.”

Dimension Studio drives real-time virtual production workflows

With cutting-edge technologist Jim Geduldick at the helm of operations, Dimension Studio North America is redefining traditional production pipelines with the latest real-time technologies and virtual production workflows.

On set, pairing a real-time game engine with AJA’s KONA 5 high-performance PCIe video I/O card allows Geduldick’s team to mesh live elements and CG assets with virtual environments in real-time on an LED volume. “The essence of real-time workflows is that we’re not waiting for rendering like we used to,” explains Geduldick. “Instead, we’re pushing as much down the pipe through AJA hardware and other solutions to make everything as quick and interactive upfront as possible. AJA’s KONA 5 capture cards are our workhorses, as we use them to send high-bandwidth, raytraced assets with ease. This ultimately allows us to make on-the-fly creative changes during production, saving our clients time and money because we’re capturing final effects in-camera, rather than waiting until post.” 

Geduldick continues, “For continuity and budgetary considerations, shooting on an LED volume offers the ability to capture the same time of day with the same colour hue at any given time, allowing actors and crew members to focus more on the performance. If you need to do a pickup or if the screenwriter or director adds new lines, you can load the environment, move the set pieces back into place, and shoot the scene. No permits or challenging reshoots are required, freeing up teams to focus more on the creative aspects of production, rather than the logistics.”

He adds: “The next evolution of virtual production tools will help innovate the way narratives are driven and help us tell stories in more impactful ways.”