Facing the future of 4D scanning

Clear Angle Studios is leading a quiet revolution in the art of 4D scanning. CEO Dominic Ridley tells TVBEurope about the tech behind the company’s remarkable progress

Last August, Clear Angle Studios announced it had achieved a breakthrough in 4D performance capture. Working in collaboration with performance-driven facial animators DI4D and texture enhancement specialists, TexturingXYZ, the company showcased a video featuring a photo-realistic digital representation of an actor’s performance that was virtually indistinguishable from actual footage of the human, marking a technological innovation that has far-reaching implications for the media and entertainment (M&E) industry.

Discussing both the technology behind the development and its significance to the M&E industry, Dominic Ridley, co-founder and CEO of Clear Angle Studios, begins with a brief background. The company is a global operation. Headquartered in London, Clear Angle Studios has offices in Atlanta and Vancouver, along with Budapest and other locations worldwide. This autumn, a full-service studio was opened in Culver City, Los Angeles, offering a permanent scanning facility to meet burgeoning demand. “We cover the film industry in terms of scanning,” says Ridley. “We’ll scan props, vehicles, sets, environments and characters. We’ll get in a helicopter and capture entire mountain ranges, but at the same time, we’ll scan this pen for you,” he explains, holding up the item in question.

Then, alluding to the growing convergence that has been noted in the industry, in which shifting audience trends and technological innovation have seen media companies widen their IP portfolios, Ridley says, “We’re in the scanning and digitising business; traditionally we cover film production more than anything else, but we’re looking to get more into gaming, as the divide narrows between photo realism in games and film.”

Turning to the subject of 4D scanning, he clarifies a common misconception. “It’s often called 4D mocap but it isn’t motion capture. It’s scanning,” he says, gently correcting the misnomer. Clear Angle Studios, first and foremost, is in the business of data capture. Explaining further, he adds, “4D scanning is essentially motion scanning. When you take a video and turn it into per-frame 3D scans, so you’re getting animation at the same time from all angles, that’s essentially what 4D scanning is: taking a static scan and making it an animated scan.”

In simple terms, the fourth D might refer to the movement aspect of the data, the animation itself. 

“It’s the movement, yeah, but as soon as you get into movement you end up with 24 times or 30 times, or 48 times as much data as you would from a 3D scan,” says Ridley, identifying what has until now been a significant barrier to the process – the immense requirement for computing power. “Traditionally, 4D scans were relatively low res, to try and make up for the fact that there was so much more data. It’s very processor heavy, because you’ve got to process 24 3D frames per second over a period of however many seconds or minutes that you’ve got. Usually, you’d have gone lower res. Instead of doing one high res scan, you’d do lots of smaller low res scans and that would drive your animation. Essentially, you’d drive the high res texture from the neutral pose across all the other poses.”
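Ridley’s multipliers are easy to sanity-check with a back-of-envelope calculation. The sketch below (in Python, using purely illustrative scan sizes that are assumptions, not Clear Angle figures) shows how quickly a capture at 24 frames per second outgrows a single static scan:

```python
# Back-of-envelope illustration of why 4D scanning is so data-heavy.
# Vertex counts and byte sizes here are assumptions for illustration only.

def capture_size_gb(frames: int, verts_per_frame: int, bytes_per_vert: int = 12) -> float:
    """Raw geometry size: one 32-bit float (4 bytes) per x/y/z coordinate."""
    return frames * verts_per_frame * bytes_per_vert / 1e9

# A single high-res static 3D scan (hypothetical 10 million vertices).
static_scan = capture_size_gb(frames=1, verts_per_frame=10_000_000)

# One minute of performance at 24 frames per second, same resolution per frame.
fps, seconds = 24, 60
performance = capture_size_gb(frames=fps * seconds, verts_per_frame=10_000_000)

print(f"Static 3D scan: {static_scan:.2f} GB of geometry")
print(f"One minute at {fps} fps: {performance:.1f} GB ({fps * seconds}x the data)")
```

The linear blow-up is exactly Ridley’s point: every extra frame is a full 3D scan, which is why earlier pipelines dropped the per-frame resolution instead.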

Referring to the August presentation, Ridley continues, “[The video that] we presented in collaboration with DI4D and TexturingXYZ does contain a certain portion of it. It’s how we enable the high res to happen for every single frame. That’s how it’s unique, quite different. It was a lot of work – a lot – but it gave us a stepping stone into the next possible, the next high res way of capturing and displaying data.”

Division of labour

Work on the project was divided between the three specialists, with each company providing its own expertise. Firstly, the subject, Clear Angle Studios’ finance director, Michael Pedersen, was scanned into the system. “We recorded him performing FACS shapes with our synchronised video camera system so that we could capture all the facial expression transitions from neutral to each of the FACS shapes at 24 frames per second,” says Ridley. “We processed that data to produce a per-frame sequence of OBJ files, which we delivered to DI4D. At this stage the OBJ data has a different random mesh topology per frame. DI4D wrapped the mesh topology from the base mesh accurately onto one frame of the OBJ sequence data. They then used their state-of-the-art mesh tracking solution to track this mesh topology accurately onto every other frame in the sequence. The result is a 4D sequence comprising a highly consistent mesh topology, which deforms as the face transitions from neutral to each of the FACS shapes. The reason that’s key is because at the end of the process, every frame of data has an identical UV layout.”
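The key property Ridley describes, identical mesh topology and UV layout on every frame, can be expressed as a simple invariant: face connectivity and texture coordinates never change across the sequence, only vertex positions deform. The sketch below is hypothetical (the `Mesh` structure and check are illustrative, not DI4D’s actual pipeline):

```python
# Illustrative sketch: what distinguishes raw per-frame scans ("random mesh
# topology per frame") from a tracked 4D sequence with consistent topology.
from dataclasses import dataclass

@dataclass
class Mesh:
    vertices: list[tuple[float, float, float]]  # 3D positions (deform per frame)
    faces: list[tuple[int, int, int]]           # triangle connectivity
    uvs: list[tuple[float, float]]              # per-vertex texture coordinates

def is_consistent_sequence(seq: list[Mesh]) -> bool:
    """A tracked 4D sequence keeps faces and UVs identical on every frame,
    so every frame shares one UV layout; raw per-frame scans fail this."""
    first = seq[0]
    return all(m.faces == first.faces and m.uvs == first.uvs for m in seq)
```

With this invariant satisfied, a single texture can map onto any frame of the performance, which is why the identical UV layout is, as Ridley says, key.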

The data was then returned to Clear Angle Studios, where it was textured and sent on to TexturingXYZ, the third project partner. This final stage entailed analysis of the texture, normal maps and displacement maps, using TexturingXYZ’s bespoke solutions, Hyperskin 2.0 and HyperlookX, which enabled the data to be enhanced. “They have a really cool platform,” says Ridley. “It’s incredible tech and gives the data that little bit of extra cherry on top.”

Capturing the data requires the use of Dorothy, Clear Angle Studios’ proprietary system, which has developed through several iterations since its introduction in 2017. “Dorothy is now at a point where we can very accurately control the lighting and the cameras,” says Ridley. “We use high resolution machine vision cameras which are accurate to whatever millisecond range we want, and they perfectly sync every frame with our lights.”

A deep understanding of the hardware is a Clear Angle Studios speciality, coupled with a knowledge of how the data will be used in post production and a relentless drive for perfection in its captured data. “We focus predominantly on making sure we get the best, high resolution blank canvas data so that visual effects vendors can add the creativity, lighting, whatever it may be that they require to do for the shots that they have,” says Ridley. The aim, he adds, is to provide the best-captured, most photo-realistic, true-to-life data possible.

Alongside the machine vision cameras, very high resolution Sony DSLRs are used for stills capture. “They’ve got fantastic dynamic range,” he says, “and we strike a balance between using all the resolution that we can on the face and depth of field, lighting limitations and all those kinds of things.” The team is very aware of the need to maintain safe light levels for the subject being scanned. “We’ve got a lot of light in there, and we can only shine so much light directly into someone’s eyes before it becomes uncomfortable or unsafe.” The system is designed to adapt lighting levels to the camera’s actions, adjusting them up or down to match exposure, since simply turning the lights off results in an unwanted flickering effect. Finding the balance takes skill. “It’s a cliché but it’s more art than science. You’ve got to make sure that whoever’s sitting in there is comfortable, and you’ve got to work around that first and foremost.”

Given the prevalence of the technology elsewhere in the industry, it is perhaps surprising to learn that artificial intelligence is not used during the scanning stage of the process, nor are there plans to introduce it in the future. “We don’t use AI in any of our processes,” says Ridley. “We just focus on recreating the real, to the highest quality possible.”

Merging the boundaries

Since the company was founded in 2013, Clear Angle Studios has worked with some of the biggest names in the industry and its enviable back catalogue includes productions such as The Fast and the Furious franchise, Game of Thrones, and How to Train Your Dragon. Recently, the team worked on Wicked and Gladiator 2. For Ridley, a standout moment came in 2013. “Cinderella was shot here and it was a great turning point,” he says. “They finally realised that our product, in terms of full body photogrammetry, was a better solution than what was there before.”

Returning to the subject of media industry convergence, Clear Angle Studios was able to diversify into the gaming market during the much-mentioned quiet period for film and TV production. However, the team has witnessed a definite upturn in production projects, with the next few years looking to be a busy time. While remaining tight-lipped on exactly what they have been working on, Ridley says many of the studios have “a pretty full slate of productions that they’re looking to go and shoot in the next year or two.” He also hints at “something big” for Christmas 2027, but professional integrity sadly forbids him from divulging anything further.

Looking over the horizon, Ridley considers what Clear Angle Studios would like to achieve, should the technology advance sufficiently. “Teleportation,” he laughs, before lowering his sights somewhat and describing the possibility of creating immersive experiences in live sports coverage, referencing a recent Apple Vision Pro Formula 1 demonstration. “I think we’re all hungry for more content and different viewer experiences. For example, in Formula 1, we’re used to seeing information on the screen, but to actually see things in photo-realism from the track, almost like a hologram, where you can see how the cars are spaced out on track from a driver’s perspective, that’s so much more immersive.”

He also sees a bright future for Gaussian Splatting and radiance fields: “There’s some interesting tech that’s being explored. It doesn’t fit into a traditional visual effects pipeline at the moment, but it’s certainly something that can be leveraged in the world of 3D and filmmaking and I think we’ll see more of it in the future,” he concludes.

With the advances the Clear Angle Studios team is making, it probably won’t be long before such immersive coverage is commonplace – and don’t be too surprised if they manage teleportation along the way.