After making its name in the live events space creating immersive, 3D shows for the likes of U2 (that explains the glasses), disguise is poised to disrupt the broadcast tech industry with a new extended reality workflow known as xR.
Instead of a traditional green screen environment, xR uses camera tracking to capture positional data and renders content in real time onto LED screens, building a virtual environment in which a presenter can be totally immersed.
“That whole thing of weather forecasters not knowing where cities are goes away,” says Peter Kirkup, technical solutions manager at disguise. “The workflow here is a 3D simulation engine, so we’ve got a 3D model of the LED screen set up. And then we understand the tracking data that’s coming in, so we update our model to give it more realistic information to understand where the camera is, lens intrinsics and everything about that camera. And then in the process of doing that, we get enough information to render the scene from the perspective of the camera.”
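The core of the workflow Kirkup describes is using the tracked camera pose and lens intrinsics to render the virtual scene from the real camera's point of view. As a rough illustration only (not disguise's actual implementation), the sketch below shows the standard pinhole-camera maths involved: an intrinsics matrix built from an assumed focal length and principal point, and a tracked pose (rotation and translation) used to project a world point into pixel coordinates. All function names and values are hypothetical.

```python
import numpy as np

def intrinsics_matrix(focal_px, cx, cy):
    """Pinhole intrinsics matrix K from a focal length (in pixels)
    and principal point -- the 'lens intrinsics' the tracker reports."""
    return np.array([[focal_px, 0.0,      cx],
                     [0.0,      focal_px, cy],
                     [0.0,      0.0,      1.0]])

def project_point(K, R, t, world_point):
    """Project a 3D world point into pixel coordinates using the
    tracked camera pose (rotation R, translation t)."""
    cam = R @ world_point + t   # world -> camera coordinates
    uvw = K @ cam               # camera -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]     # perspective divide

# Hypothetical example: identity rotation, camera 5 m back from a point
# sitting on its optical axis, 1080p principal point at (960, 540).
K = intrinsics_matrix(focal_px=1000.0, cx=960.0, cy=540.0)
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
px = project_point(K, R, t, np.array([0.0, 0.0, 0.0]))
# A point on the optical axis projects to the principal point (960, 540)
```

In a production system the render engine would use this per-frame pose and intrinsics data to drive a virtual camera, so the content on the LED wall always matches the physical camera's perspective.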
The workflow is agnostic to both the tracking system and the content system (“Here we’re using stYpe but it could be Ncam or Motus,” notes Kirkup), meaning it can be deployed by any studio with an LED screen and a tracking system.
“What we’re doing is packaging together the workflow as part of our software solution,” Kirkup explains. “And then we’re working with partners to deliver this, so we’ve got solution providers who are xR solution providers, they can deliver the end-to-end, including the LED screen, the processing, and all of the other bits and pieces that are needed to pull it all together. So although we’re the ones kind of pushing the software side of things, ultimately there would likely be a partner who’s decided to deliver this as a complete workflow.”
Read more in the latest edition of TVBEurope.