Global creative studio Final Pixel has successfully streamed live facial and body motion capture into Unreal Engine.
The content, played back through Disguise, used cluster rendering to drive a bespoke, high-quality 3D character built in a traditional CG pipeline with “an extremely high level of detail”, the company said.
The team achieved real-time interactions between the characters in-camera, with no latency noticeable to the viewer.
According to Final Pixel, potential uses of this approach include:
- Virtual production pipelines for film, TV and advertising that allow live interactions between digital and human characters, all filmed in real time and in-camera.
- Live-action mo-cap with creatures and characters that can be replaced by full-scale CG in post, capturing more natural actor reactions and engagement versus working against a green screen.
- Higher-fidelity augmented reality plates, particularly for live broadcasts and when using the enhanced stage management provided by Disguise.
Michael McKenna, CEO of Final Pixel, said: “As a company specialising in virtual production for film, TV and advertising, we are excited by the opportunities working in real-time game engines can provide for the creative process when everything can be captured in-camera while shooting live-action.
“The next evolution of this technology is to look at the elements which are still considered too heavy or complex to move out of the post-production workflow. We are excited to share our findings with the rest of the industry to help us collectively move the use of virtual production forward.”