
The impact real-time technology is having on VFX

Martin Izzard, head of media and entertainment at Red Lorry Yellow Lorry, takes a look at the rise of virtual production

The last few weeks have seen both the RealTime Conference and FMX take place, albeit virtually. The first is a technology-focused event covering a number of different industries (VFX included); the latter is a VFX-specific show whose panels and sessions are increasingly dominated by talk of deploying real-time technology. That is, engine technology that can render digital assets, whether environments, characters or animation, in… well, real time.

Talk to any VFX studio or supervisor about the biggest cinematic or episodic projects of the last few years and you’ll find real-time technology in use. It’s also increasingly being deployed on smaller productions, and it has long been the basis for mixed reality experiences, which arguably set the scene for some of the techniques now becoming standard in the real-time-enabled VFX toolkit.

I spoke to experts at technology vendors, VFX studios and service providers about the impact real time is having on today’s VFX industry. Like any emerging technology, it’s in a constant state of change and development, but one thing real time has quickly become synonymous with is virtual production, or VP.

The rise of virtual production

Ever since Jon Favreau and the ILM StageCraft team tackled the latest trip into a galaxy far, far away with vast LED walls and digital environments that respond to the camera in real time, filmmakers everywhere have wanted to use this so-called virtual production on their next project.

But Ben Lumsden from Epic Games, the company behind the Unreal Engine, says that “it’s important to make a distinction between in-camera visual effects, and other forms of virtual production.” Thanks to its background, he says “Unreal Engine has been set up to do all sorts of forms of virtual production for about eight years, but it just keeps getting better and better at accommodating film, TV, and animation workflows.” 

As everyone is now realising, VP can offer much more than this particular technique. Pascal Achermann is the CEO and technical director at VRFX Realtime Studio, a Swiss studio that uses the Unity engine to create mixed reality content and deliver virtual production services. His definition of VP is broad: “any time you are using real-time to support your animation, film or commercial production or storytelling, you are producing virtually.”

That means the impact real-time technology has, and the benefit it provides to your project, will differ depending on how you deploy it. Ed Thomas, head of real-time and virtual production at volumetric video capture studio Dimension, says that VP allows “filmmakers to create living, breathing worlds at much earlier stages in development than has previously been possible.”

He adds that it means “each filmmaking discipline has access to key decision-making moments much earlier in the process and in a more collaborative environment [which] democratises the creation of this content.”

The prospect that this innovation benefits not just filmmakers but everyone around them shows how widely real-time technology can be deployed. That might be a production designer who can test colours on the wall of a set without picking up a paintbrush, or a gaffer who can see different lighting configurations at the touch of a button. It also means actors can better visualise a world or a digital character they otherwise wouldn’t have seen until the premiere.

The Bourne Stuntacular

In recent years, VFX studios’ worlds have expanded far beyond film and TV screens to include attractions and immersive experiences where real-time technology is widely used. For example, Cinesite recently won a VES award for its work on The Bourne Stuntacular live show at Universal Studios Hollywood, which takes place in front of a 130-foot-long LED screen. Unreal Engine was used to render 3D environments throughout production, providing real-time feedback and the flexibility to adjust timings and layouts on-site. Without a real-time engine in the mix, Cinesite’s Salvador Zalvidea said, the project would have been a painful and frustrating experience.

Benefits and challenges

Real-time technology allows creatives to experiment with ideas and make decisions early on that would otherwise take weeks to visualise with traditional VFX rendering. On The Bourne Stuntacular, for example, using real time for previs meant the team could put on a headset and review their work as soon as it was completed in Unreal.

[Image: Behind the scenes on The Bourne Stuntacular]

But there are also benefits for the people watching the pennies. NVIZ’s Janek Lender says using real-time scenes for pre-production workflows allows for a more efficient flow of assets throughout the VFX process, which ultimately helps the bottom line. “Assets can be improved from previs, and upgraded to more developed assets for postvis, all within the same scenes, saving time and money.”

VRFX’s Achermann agrees, saying that the ability to render in real time is amazing, but the ability to take real-time simulations from a set, record them and send them further down the pipeline is equally beneficial. It saves resources in the post-production work that still needs to be completed.

Of course, the wider roll-out of real-time technology doesn’t come without its challenges. Thomas and Lumsden both believe that education and training are important factors, both for filmmakers and for people who have long worked in traditional VFX workflows. Lender adds that the key challenges for the roll-out of real-time technology are largely associated with the workflows of traditional VFX. “The way of working and the mindset of traditional VFX pipelines has been challenged by new real-time workflows,” he says, something that “the industry is currently wrestling with.”

Another important factor is the talent pool and the diversity of the people behind the pixels. The need for expertise in real-time technology and improved access to training courses mean there are more routes into VFX than ever before. Cinesite’s Zalvidea, who’s also a mentor for AccessVFX, said that encouraging new talent is key to the industry’s future. Through the AccessVFX mentorship programme, VFX generalists and real-time specialists can encourage upcoming artists not only to learn the traditional processes but also to understand the future of VFX.

Looking forward

What’s to come is always the question at events like the RealTime Conference and FMX. Epic’s Lumsden sees a time when the creation of film and TV sequences will be gamified to really engage filmmakers’ creative instincts. He says real-time technology “will have a profound impact on people’s creativity and the way they tell stories.”

For example, “if you want to shoot a car chase, you can actually build a drivable car and the director or cinematographer can play it with a game controller to create an action sequence. Then they can shoot with a virtual camera on top of the scene they’ve created.” 

It’s clear that while real time has become key for many projects, its full potential has yet to be reached. Thomas puts it well when he says that, as part of the VFX industry, real-time technology will “no longer be something that’s new and emerging, but instead will become just another tool that we use to tell our stories. It will be even more embedded into the mainstream VFX pipelines. In fact, it will replace large areas of that process with these new tools and methodologies.”