3D Post Analysis – Is the post business prepared for a 3D explosion? Carolyn Giardina reports.
Stereo-capable TVs, Blu-ray players and even handheld devices have dominated 3D-related headlines since January. Yet the promise of 3D in the home brings new and varied hurdles for the post community: live programming, a range of technical specs, and the need to manage heaps of data with tools and workflows that are still evolving.
“Most of the technical problems are solvable, we just didn’t know that we needed to solve them this quickly,” says Buzz Hays, senior VP at Sony 3D Technology Center, a newly launched R&D and training centre based on the Sony Pictures studio lot in Southern California. “Training is also very important. We need to get going quickly because the 3D genre is exploding in front of us and people need to understand it so they can make good product.”
Inconsistent workflows and tools in this still new process, along with still undetermined delivery standards, contribute to some of the uncertainty. “So far there are no two workflows that are the same,” admits Devin Sterling, executive producer at Ascent Media Group’s grading facility Company 3 in LA. “Everyone is doing it a little differently each time.”
Whatever the approach, producers be warned: 3D post processes could add time, and therefore expense, to budgets. “You are going to spend more money in 3D post, whether it is 2D to 3D conversion or if you shot it 3D,” says Sterling. “You are going to spend more time in editorial and colour correction. It is not twice the amount of time. Right now safely I’d say it is time and a half.”
Depending on whom you talk to, the post technology wish list includes more advanced capabilities in tools from editorial systems to conforming platforms.
“A lot of the gaps in post production have been filled in the last 6-9 months,” says Peter Armstrong, manager of DI and mastering for Deluxe Toronto, which is building a 3D pipeline. “When it comes to live action, there is the challenge of getting the left eyes and right eyes balanced — colour balanced. They do a lot of that correction on set. In post we see the little anomalies, and we worry about getting that last 5-10%.”
There are also challenges such as how to handle lens flares, which occur during photography; or ‘windowing,’ during colour correction, where an automatic left eye to right eye translation doesn’t work. In the case of lens flares, Hays explains: “A lot of that has to be tweaked in post. There are a few inherent problems with beam splitter rigs that are not easy to fix. A lot of DI tools are equipped to accommodate things like lens flares. You can’t eliminate them but you can certainly minimise them. Artists are involved; it is still manual work.”
3D post starts and ends with the need to manage and move a daunting amount of data, at least twice as much as would be required for 2D, notes Marco Bario, VP, theatrical post production at Technicolor. “It is going to tax us as we do more concurrent 3D projects,” he explains. “We are used to doing a lot of concurrent 2D projects for film and broadcast. As we start to get concurrent 3D projects, we are going to have to increase our storage. So is everyone else. That will be a financial burden; there will be more data to render.”
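The doubling Bario describes is easy to see with back-of-the-envelope numbers. The sketch below is purely illustrative and not drawn from the article: it assumes uncompressed 2K DPX frames (2048x1556, 10-bit RGB packed into 32-bit words) and a 100-minute feature at 24fps, then compares mono and stereo raw storage.

```python
# Illustrative only: rough storage arithmetic for a stereo feature.
# Assumptions (not from the article): 2K DPX frames, 2048x1556 pixels,
# 10-bit RGB packed one pixel per 32-bit word, 100 minutes at 24 fps.

FRAME_BYTES = 2048 * 1556 * 4          # one eye, one frame (~12.7 MB)
FRAMES = 100 * 60 * 24                 # 100-minute feature at 24 fps

mono_tb = FRAME_BYTES * FRAMES / 1e12  # decimal terabytes, one eye
stereo_tb = 2 * mono_tb                # left eye + right eye

print(f"2D:     {mono_tb:.2f} TB")     # roughly 1.84 TB
print(f"Stereo: {stereo_tb:.2f} TB")   # roughly 3.67 TB
```

Even before renders, intermediates and deliverables, a single stereo feature sits near four terabytes of raw picture, which is why concurrent 3D projects strain facility storage.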
Bruno Munger, Digital Vision’s product manager for grading and finishing, suggests that a standard method of naming and managing metadata and stereo files is sorely needed. “It would really simplify production and post if we had standards that would encapsulate both eyes into one file.”
New metadata standard
“It is very challenging on the metadata side to come up with a workflow that everyone understands,” Munger explains, saying that each facility has its own process in place. “There haven’t been any standards even just to identify files.
“Now you have two sets of media that need to be carried throughout the post production pipeline. Everything that we have been doing over the years with files — things like Quicktime or DPX — operates under the assumption that one piece of media equals one delivery. So there are no metadata containers that exist as a standard that encapsulates the left eye and right eye into one piece of media.”
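In the absence of the standard container Munger describes, facilities improvise by carrying two parallel file sequences and pairing them by naming convention. The sketch below is a hypothetical illustration of that ad hoc approach, not any facility's actual pipeline; the file-name pattern, eye suffixes and `StereoFrame` record are all invented for the example.

```python
# Hypothetical sketch: with no standard stereo container, one common
# improvisation is parallel DPX sequences whose names carry an eye token,
# e.g. "shot010_0001_L.dpx" / "shot010_0001_R.dpx". All names here are
# illustrative assumptions, not a real facility convention.

from dataclasses import dataclass

@dataclass
class StereoFrame:
    """One logical frame bundling both eyes, stand-in for a real container."""
    frame: int
    left: str    # path to left-eye DPX
    right: str   # path to right-eye DPX

def pair_eyes(left_files, right_files):
    """Match left/right sequences by frame number (assumes name_####_E.dpx style)."""
    def frame_no(name):
        # "shot010_0001_L.dpx" -> 1
        return int(name.split("_")[1])
    rights = {frame_no(f): f for f in right_files}
    return [StereoFrame(frame_no(f), f, rights[frame_no(f)]) for f in left_files]

pairs = pair_eyes(["shot010_0001_L.dpx", "shot010_0002_L.dpx"],
                  ["shot010_0001_R.dpx", "shot010_0002_R.dpx"])
print(pairs[0].right)  # -> shot010_0001_R.dpx
```

A standard container would make this pairing logic unnecessary: one piece of media would carry both eyes plus shared metadata, which is exactly the gap Munger identifies.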
Armstrong meanwhile sees an industry-wide need for CRT-quality colour reference monitors that are 3D capable. “There are acceptable LCD 3D monitors for viewing. For mastering and colour grading, not so much, yet. We can do mastering and grading with DLP projection, but I think there is going to be a need for a colour reference monitor that is 3D capable — that would be ideal for television.
“That is a pretty tall order,” he added. “It can’t be a CRT. That is going to be a tough one for the manufacturers. Probably every manufacturer is working on it. They are trying very hard to offer an LCD that is as good as a CRT. They are all pretty close, but most of our engineering departments are still having a hard time swallowing (the notion of) an LCD monitor that is as good as a CRT.”
The industry is also grappling with some basic questions. Among them: is a separate master needed for movies heading to the small screen, given that screen size and viewing distance differ significantly from those of theatre auditoriums?
Jim Houston, VP, technology and engineering at Colorworks, Sony Pictures’ newly launched post facility that offers a 3D mastering suite for feature and TV, suggests that this approach should be viewed as a creative option. “Any 3D movie that is transferred correctly will look fine on a TV,” he said. “But there is the possibility that you will want to enhance the parallax for the home viewer with a smaller display.”
Meanwhile, 2D-to-3D conversion has been evolving as a post production process. Conversion services are offered by specialty businesses such as In-Three and, increasingly, by post houses. JVC has developed a box that automates the process, which is expected to ship after NAB.
Such processes have already been applied to feature work such as Tim Burton's Alice in Wonderland, and as the TV market gets started, 3D content will be urgently needed. If the economics make sense, many expect that conversion will become another 3D post option.
“I think 2D-to-3D conversion is more than acceptable,” Sterling suggests. “There are fewer restrictions in a 2D shoot. (Some) filmmakers don’t want to shoot 3D. They are looking to conversion.”
Hays believes that there is room in the mix for additional technologies that would allow for 3D in post, though not as a typical conversion process. “This would be a methodology kind of like high-end visual effects where you are gathering information on the set and fine-tuning 3D as a post process,” he relates. “I think that is where things need to head. TV production moves at a faster rate than features. You don’t want to slow down a shoot for 3D. It makes sense to eliminate bottlenecks. We are seeing this being developed. Various folks are tackling that issue.”