Does Cameron-Pace’s 3D vision add up?
4 May 2011
The combined 2D and 3D sports production envisioned by rig innovator Vince Pace and director James Cameron has been queried by a key member of the 3D production team behind the FIFA World Cup 2014, writes Adrian Pennington.
At NAB Pace and Cameron suggested that with an approach to outside broadcast that shared the same camera positions, a 2D feed could be derived from a 3D production within a couple of years. They also suggested that new automated technologies would remove the need for separate and costly convergence operators.
“It reads well as a headline but if their [Cameron-Pace’s] intention is to say that 3D production is as easy as 2D then I don’t understand it,” says CAN Communicate’s Duncan Humphreys, part of the technical team advising HBS on the 3D broadcast of the World Cup in Brazil. “I can envisage a joint 2D and 3D technical production, but different sports require a different editorial approach. One size does not fit all, and for most sports a separate 3D cut is necessary.”
Few would argue against the point that bringing 3D costs down requires greater integration of technology and crew between 2D and 3D, but Pace and Cameron seemed to advocate using largely the same editorial feed, from camera positions selected for their relevance to 2D storytelling.
“A separate crew and finding separate positions is just not going to work, especially when you start targeting positions that are in higher demand, like a World Series or a Super Bowl,” Pace said.
Pace’s Shadow system (employed at the recent US Masters golf) piggybacks onto 2D cameras because, he says, most of the time the point of interest is the same: “Whatever 2D is looking at, 3D loves as well. The camera angles are very complementary to one another.”
However, Humphreys believes that running 2D and 3D as a simultaneous editorial may hold for feature films but not for live sport: “With its slower pace I can see golf working that way, but motor sport definitely needs to be covered and cut differently. Football won’t work either because the 2D HD cut is faster, with more close-ups and more jumps between shots. As a rule the camera positions are different in 3D. You want lower positions with more, slower pans across the screen.”
Cameron and Pace’s position is broadly shared by their chief competitor, Steve Schklair, CEO of 3Ality Digital, who believes that for 3DTV costs to come down, a way must be found of accommodating both 2D and 3D technology and direction. “Everyone talks about the creative differences between 2D and 3D as being a barrier to simultaneous productions, but I’ve never seen a 2D cut of anything shot in 3D that didn’t work,” Schklair says. “The audience has been trained by the broadcaster over a decade to accept more cameras and angles. So it’s about training the audience back the other way. With some adjustment the 3D would be perfectly viewable as 2D. Yes, there would be some compromise on the 2D and on the 3D editorial, but TV is a business of compromise. It’s great to be a purist but the economic reality means it won’t work. You can be a purist right up till it’s not a business.”
While the Cameron-Pace Group refines its new smart camera system which it claims will solve the 2D-3D conundrum and possibly deliver a reframed and re-adjusted 3D feed, 3Ality’s software for automated convergence and lens alignment is already being tested at Sky Sports.
Both parties aim to remove the need for separate convergence operators, but Humphreys challenges this assumption. “If you turn up with ten 3Ality or Pace rigs plus software, what happens then?” Humphreys asks. “Normally you’d have a trained technician who may also act as a convergence puller to make sure the software is actually set up correctly and will run smoothly, or to trouble-shoot problems.
“I can see software automating certain positions, but I think positions on the touchline or closer to the action need a trained eye to pull convergence, because too many things could go awry if the production is software reliant.” He agrees that the number of convergence operators can already be reduced by as much as half.
“In South Africa we shot the World Cup using eight camera rigs and eight operators but it’s likely when we come to film the next World Cup that we’ll take half the number of operators. I would have all the top, wider cameras automated – that is easily done. But I believe the touchline cameras still have to be converged.”
At NAB Cameron also attacked stereographers and other ‘3D experts’ for getting in the way of creative decision making. The signal they send out, he said, ‘is that 3D is a problem and that I as a producer can’t do my job unless I talk to someone like you.’
Humphreys, however, is adamant that stereographers remain crucial: “You don’t take on a major CGI project without getting advice from major CGI houses. You don’t take on a major feature without employing a cameraman who can deliver a look and feel suitable for the project. Employing a stereographer is no different.
“There certainly are a number of charlatans out there,” he adds. “In particular those who jump out of post production and claim to be 3D experts with next to no camera experience. But for the most part stereographers are there to provide invaluable advice as to what shots will work and what shots won’t, leaving the creative choice up to the director.”
The 2010 World Cup was shot using Element Technica rigs augmented with Sony processing software. Since Sony is one of six main FIFA ‘partners’, paying US$305m from 2007 to 2014 for the privilege, its technology, rather than that of 3Ality or Pace, can be expected to play a significant part in the 3D production in Brazil.