
How Freelens TV puts science centre-stage for its latest game show

Take Off is a game show where science and innovation meet competition. With 12 contestants pitted against one another in a series of challenges, culminating in a grand finale, we go behind the scenes on the format's production workflow.

With a mission to ignite a passion for science among Luxembourg’s youth, the groundbreaking reality TV programme is the first of its kind, showing that science is an integral part of our daily lives. Supported by the André Losch Foundation and the Luxembourg National Research Fund (FNR), the show is broadcast on RTL and streamed via YouTube.

Freelens TV, a specialist in live broadcasting and storytelling, is the creative and technical force behind the series. “Our 25 years of expertise in television isn’t just about producing content; it’s about crafting experiences. We leverage our deep understanding of production workflows to collaborate with major television networks worldwide,” begins Yann Figuet.

Producing a reality TV show without knowing what will happen or where posed a unique challenge for Figuet and the team at FreeLens. The solution lay in a multicamera workflow with Blackmagic Design at its core for acquisition, control and delivery.

“We implemented a technical solution comprising 19 cameras, with SMPTE fibre camera chains and a vision workflow based around Blackmagic’s ATEM Constellation 8K live production switcher and ATEM 2 M/E Advanced Panel,” reveals Figuet.

For picture acquisition, FreeLens TV employed a mix of URSA Mini Pro 4.6K G2 and Blackmagic Micro Studio Camera 4K cameras augmented by additional PTZ cameras.

“Broadcast in HD, we paired the URSA Minis with a mix of B4 lenses and rigged each unit with a fibre converter, providing a camera return feed for the operator as well as camera control, talkback and tally. The Micro Studio Camera’s size made it ideal for tight spaces, offering a discreet viewpoint,” recounts Figuet.

He praises the URSA Mini Pro 4.6K G2 for its versatility. “Whether it’s a talk show, sports event or fashion show, it offers a cinematic shallow depth of field and a superior image quality, making it an ideal choice for both web and broadcast delivery platforms.”

Remote colour shading on the production was enabled through multiple ATEM Camera Control panels, ensuring all camera feeds could be matched live to reduce the turnaround time in post production.

Mixed Reality 

A central part of the studio set was a virtual production space, blending assets created in Unreal Engine, which enriched the show with supplementary videos that broke down and explained scientific concepts. 

“Our goal was to both educate and engage,” explains Figuet. “The host, set against a virtual backdrop, demystifies complex scientific ideas with visual assets that we created in Unreal Engine. This immersive approach not only helped to deepen the viewers’ understanding but also makes the learning more fun.”

The virtual set leveraged Unreal Engine, alongside a green screen setup with two main components for capturing live action, according to Figuet. “We had a motion controlled robotic arm equipped with an URSA Mini and a PTZ using the FreeD sync box and app for tracking.”

He adds, “Information from the tracked arm, including data from the URSA Mini and lens, as well as data from the PTZ were sent to two Pixotope engines which then integrated the live shots of both cameras, wide and close up, into our virtual environment.”
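FreeD, referenced above, is a simple fixed-length tracking protocol carried over UDP. As a rough sketch of the kind of data the Pixotope engines receive, here is a parser for the commonly documented FreeD D1 message layout; the field offsets, scale factors and checksum rule below are assumptions based on published FreeD descriptions, not details confirmed by this production.

```python
def s24(b: bytes) -> int:
    """Decode a signed 24-bit big-endian integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    # A FreeD D1 message is 29 bytes: type (0xD1), camera ID,
    # pan/tilt/roll, X/Y/Z position, zoom, focus, spare, checksum.
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # The checksum byte is chosen so all 29 bytes sum to 0x40 mod 256.
    if sum(pkt) & 0xFF != 0x40:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg": s24(pkt[2:5]) / 32768.0,   # angles in 1/32768 degree steps
        "tilt_deg": s24(pkt[5:8]) / 32768.0,
        "roll_deg": s24(pkt[8:11]) / 32768.0,
        "x_mm": s24(pkt[11:14]) / 64.0,       # positions in 1/64 mm steps
        "y_mm": s24(pkt[14:17]) / 64.0,
        "z_mm": s24(pkt[17:20]) / 64.0,
        "zoom": s24(pkt[20:23]),              # raw encoder values; the
        "focus": s24(pkt[23:26]),             # mapping is lens specific
    }
```

In practice one such packet would arrive per frame per tracked camera and be forwarded, together with the lens data, to the rendering engine.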

“Depending on the complexity of the virtual content, we would either feed it directly into the vision mix workflow or, for deeper scientific explanations, record the camera’s tracking data and the footage against a green screen, with the rest being added in post production for greater flexibility,” says Figuet.

Streamlined production and delivery 

Managing media was a significant challenge. The production workflow employed multiple HyperDeck Studio 4K Pro broadcast decks to record each of the 19 camera ISOs, the ATEM’s programme output and additional sources, including tablets, computers and infrared cameras. 

“We’d shoot an episode in a day, with a typical day resulting in around four terabytes of data, and every single source we acquired had to be backed up simultaneously, effectively doubling the amount of data,” explains Figuet. 

“To distribute the different streams, a pair of Smart Videohub 40×40 12G video routers were used for signal management. One managed inputs and outputs on our ATEM Constellation 8K, with an additional router distributing signals from the 43 broadcast decks,” he notes.  

To help FreeLens follow each episode, a live programme was vision mixed, which was then re-edited in post production. “Utilising the ATEM Constellation’s four built-in multiviewers allowed us to configure different multiview outputs to support both live direction of the show and remote colour shading without any additional hardware,” says Figuet.

Contestants also used tablets to enter results live during the game. Not only were these sent to the control gallery, they were also fed through to repeater screens around the studio itself, ensuring both the jury and the contestants could consult the results live.

The game show’s workflow also incorporated GPI (General Purpose Interface) triggers, integrating the contestants’ buzzers with sound and lighting controls, all managed using a Smart Videohub 20×20 and the atemOSC app.
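atemOSC bridges OSC messages to ATEM switcher commands over the network. As a hedged sketch of how a buzzer event might be turned into a switcher trigger, the snippet below hand-encodes a minimal OSC 1.0 message using only the standard library; the port number and the `/atem/program/...` address pattern are illustrative assumptions, since atemOSC’s address map varies by version and is not documented in this article.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: int) -> bytes:
    # Minimal OSC 1.0 encoder: address pattern, type tag ",i",
    # then one big-endian int32 argument.
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

def send_buzzer_cue(contestant: int, host: str = "127.0.0.1", port: int = 3333) -> None:
    # atemOSC listens for OSC over UDP; host, port and address are
    # placeholder assumptions, not the production's real configuration.
    msg = osc_message(f"/atem/program/{contestant}", 1)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port))
```

A GPI closure from a buzzer would call `send_buzzer_cue` with the matching input number, cutting that camera to programme while the same event drives the sound and lighting cues.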

Post production, including the multicam edit and colour correction, was delivered in DaVinci Resolve Studio. “With all source material, including video and multitrack audio, timecode synced using the ATEM Constellation’s internal timecode generator, each show could be reconstructed as a multicam edit timeline and finished, with the live cut providing a reference,” concludes Figuet.
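Because every ISO recording and the multitrack audio share the switcher’s timecode, lining clips up for the multicam conform reduces to frame arithmetic. A minimal sketch, assuming a 25 fps broadcast frame rate (typical for European HD delivery, though not stated in the article):

```python
def tc_to_frames(tc: str, fps: int = 25) -> int:
    # Convert "HH:MM:SS:FF" timecode to an absolute frame count.
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset(clip_start: str, timeline_start: str, fps: int = 25) -> int:
    # Frames to slide a clip so its timecode aligns with the
    # start of the multicam timeline.
    return tc_to_frames(clip_start, fps) - tc_to_frames(timeline_start, fps)
```

An NLE performs the same calculation internally when it auto-syncs angles by timecode, which is why a common generator feeding every recorder makes the reconstruction deterministic.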