The production team responsible for the 3D production of the World Cup have taken stock of the technical and editorial lessons learned during the intense production schedule which saw 25 matches shot in 3D in four weeks, writes Adrian Pennington.
According to Duncan Humphreys, Creative Director of CAN Communicate and technical consultant to production team HBS, “everything went just about as well as we’d hoped.”
Sony’s Head of Sports Business Mark Grinyer says, “We proved our technology was robust enough to deliver high-quality 3D in a real sports schedule, which we always saw as a challenge. From day one we didn’t have a workflow process. That got more and more refined, so now we have a workflow which we will continue to refine and make more automatic.”
Before arriving in South Africa, during tests in France, the production team had to solve one major technical glitch. The fault, which took a month to identify and fix, turned out to be an issue with the Sony HDC1500’s power supply.
“The cameras were working just fine until we went into live trials, when the system was zooming strangely and at one point failed altogether,” says Humphreys. “We realised that since we were powering the Element Technica rigs -- including all zooming and alignment, and communications traffic -- there wasn’t enough power in the cameras to be able to do what we needed.
“The reason we found it hard to isolate was that the problem coincided with the time we switched control of the rig from the camera to within the truck. That control was a huge benefit to the production since it meant we didn’t have to have two operators at every camera position, and all convergence could be performed in a small area where people could easily talk to each other.”
The problem was resolved by attaching an external power supply to the rigs. “In stadia or other venues with power sources there’s no problem but in certain locations it could be -- so we will look further into it,” says Grinyer.
On arrival in South Africa, two concerns occupied the team’s minds: ensuring that the two crew units (AMP Visual TV in Johannesburg and Telegenic in Durban) would produce the same 3D editorial approach rather than two separate productions, and that set-up time between venues could be drastically reduced.
“There was a bit of a settling in period but after the first few games both units were producing a similar 3D presentation,” says Humphreys. “The biggest concern was the turnaround for crews. The Johannesburg team in particular would shoot in Soccer City then pack up and drive to Ellis Park ready for the next match.
“Sometimes that meant a de-rig at 11pm for a one o’clock kick-off the following day. This intensity of rigging and de-rigging over a protracted schedule had never been done before. For the first couple of weeks the guys were under immense pressure, but eventually we got it down to under five hours and it went really smoothly.”
Heat and cold both played their part in the functioning of the equipment. “We began the Brazil v Korea match on a beautifully warm day, then the temperature plummeted to –5°C, so all the moving parts contracted. The camera was lined up for shooting the pitch, but because of the temperature change you’d frame up on the crowd and there were slight vertical misalignments.”
Sony’s main preoccupation was firstly to ensure its 3D Processor worked and secondly to learn how to build on the technology for future iterations.
“One of the things we are sensitive to is that when companies buy into the hardware power of the MPE-200 processor they need to look at return on investment around 3D,” explains Grinyer. “We’re looking at what we can do around the hardware platform (based on the Cell engine which drives the PS3) and 3D conversion is one of those ideas. For outside broadcasters the flexibility of a production tool is also key. We want to automate as many 3D processes as possible with this platform.”
Aside from 2D-to-3D conversion, future versions of the MPE-200 -- or rather its upgraded software -- are likely to include finessed colour correction, QC for 3D on ingest into edit suites or before transmission, and improved configuration tools between the lenses, rigs and processor box.
JVC’s IF-2D3D1 video image processor was used to convert 2D images shot from helicopters, cranes, Spidercams and some pitchside Steadicams for the production. “All of these I think will be 3D camera positions by the time of the next World Cup, under the direction of the 2D director, since the shots are pretty much the same for both formats,” suggests Humphreys.
The MPE-200 also tended to misread floodlight information as 3D errors, with each lens apparently reading the light in a different way. This will likely be fixed with a luminance key in later versions.
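A luminance key of the kind described would exclude very bright pixels, such as floodlights, from the alignment analysis so they cannot be mistaken for 3D errors. As a rough illustration of the principle only (a hypothetical sketch, not Sony’s implementation, assuming frames as normalised RGB arrays):

```python
import numpy as np

def luminance_key_mask(frame_rgb, threshold=0.92):
    """Return a boolean mask that is True for pixels dim enough
    to include in disparity/alignment analysis.

    frame_rgb: array of shape (H, W, 3) with values in [0, 1].
    threshold: luma level above which pixels (e.g. floodlights)
               are keyed out. Both names are illustrative.
    """
    # Rec. 709 luma weights, the standard coefficients for HD video
    luma = (0.2126 * frame_rgb[..., 0]
            + 0.7152 * frame_rgb[..., 1]
            + 0.0722 * frame_rgb[..., 2])
    # Keep only pixels below the threshold; bright highlights are excluded
    return luma < threshold
```

In a stereo pipeline, such a mask would be applied to both eyes’ images before comparing them, so that glare read differently by each lens never enters the error measurement.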
By all accounts the system itself worked almost without a hitch during the 25-match 3D run. “I always believed that eventually 3D correction would be done digitally rather than mechanically, and what Sony has achieved in such a short period of time is unbelievable,” says Humphreys.
“In fact it’s a game-changer -- producing quality live 3D with standard broadcast lenses and cameras. It’s become a bit of a maxim that what will kill off 3D is bad 3D and this is right, but what will also kill off 3D is expensive 3D because broadcasters are not going to pay the kind of premiums that are being required at the very top end. Sony’s 3D kit is beneficial for broadcasters since once they’ve made the initial investment it isn’t an expensive build per rig.”
That said, it’s not as if you can show up with a 3D box, turn it on and produce great 3D. “The crew still needs a solid knowledge of lenses, cameras, rigs, boxes, how it all works together and what’s coming out the other end. The crew who worked in South Africa developed a unique skillset, in that they now know how to set rigs and make all the necessary tweaks with confidence.”
Because it was limited to just eight 3D camera positions per game, the 3D production threw up some interesting editorial findings. Chief among these was feedback from cinema screenings suggesting that audiences appreciated the immersive experience of a game directed at a much slower pace than normal.
“The 3D production allowed the viewer far more time to explore the picture, with fewer cuts, slower pans, and no slow-motion replays, rather than having the director and the technology do it for them,” reports Sony’s Marketing Director David Bush. “There’s an argument that viewers of 2D coverage would also benefit from a less frenetic production.”
Adds Humphreys: “You can give an audience a fantastic 3D experience but unless they know the score then there’s no point. You have to use the main camera wide to tell the story. It’s not a core 3D shot but you have to use it to balance the 3D with storytelling.”
The directors and camera-ops began to anticipate where the ball was going to land. “We cut quite a bit earlier to the player who was going to receive the ball, so you could see what he was doing with his feet and body and the skill he had in receiving the ball,” says Humphreys.
The pitchside and behind-the-goal cameras tended to be the ones that delivered the 3D punch -- where negative parallax would propel players or the ball out towards the audience. Stand-out shots included one at the end of the Italy v Slovakia match, when a Slovakian player ran right past a 3D position with arms outstretched; a similar moment came when Brazilian player Kaka dived forward, toward the screen, in a celebration again in front of a 3D position; and in the first game the South African opening goal was hammered straight toward a 3D goalmouth camera.
Is it perhaps worthwhile asking players to choreograph their celebrations in front of selected 3D camera positions in future? “We did gently ask that of the teams this time around,” Humphreys reveals. “The Brazilian team went toward a camera they knew was capturing action for the domestic audience (because of the Brazilian flags held by the camera-op) and we got lucky because our position was adjacent.”