
HBS puts 3D to the test

The FIFA World Cup promises to be the most high-profile global 3D event since Avatar. That’s a lot of pressure on HBS, which had less than five months to prepare everything, including beta testing the use of Sony’s Processor Box. Adrian Pennington tells the story.

Given the meteoric surge in interest in 3D, FIFA’s official announcement last December to cover 25 matches of the 2010 World Cup in the format came as no surprise. Yet the decision was made rather late in the day, giving its production team at HBS less than five months to prepare for what promised to be the most high-profile global 3D event since Avatar.

Arguably FIFA TV would not have attempted to achieve anything as ambitious had it not had the expertise of HBS at its disposal. Over successive World Cups since 2002 the Zug-based outfit has secured a reputation for being as precise and meticulous in its planning as a Swiss watch.

According to Francis Tellier, HBS chief executive, “Even though we had no certainty 3D would be required, we had anticipated it. The most important thing was that we were 100% prepared for our HD production, and without everything in place for that there is no way we could have contemplated adding something on top.”

HBS Director of Production & Programming Peter Angell assumed the role of FIFA special 3D project leader. He contracted London-based 3D production specialist Can Communicate to provide further 3D technical expertise. Together they devised a test programme for the technology and workflow based around HBS’ ongoing 2D production of matches for Ligue 1.

“We’d scoped various equipment to produce this feed probably as long as a year before the announcement,” reveals Duncan Humphreys, Can’s 3D consultant. “In particular we began early conversations with Sony which had begun to develop its 3D Processor box.”

As one of the major sponsors of the FIFA World Cup and the biggest funder of the 3D project, Sony was naturally keen to have its technology at the core. Specifically this meant a digital processor-driven technology intended to replace the mechanical alignment performed by motorised rigs on camera lens pairs.

“Sony were developing a solution for 3D live OB production and wanted strongly for the Processor to be the chosen equipment — but there were caveats in place,” explains Humphreys. “A rigorous series of tests was required to see if it could actually match the performance of US-based mechanical systems. 3ality have set the bar high but it’s always been my personal desire to be able to do live 3D using relatively off-the-shelf products.”

The initial stage was “proof of life”, says Humphreys, of a match in Grenoble shot with one Element Technica Quasar rig, one Sony box, a set of Canon HJ lenses and one upgraded zoom controller, “to prove whether the chain actually works.”

Delays obtaining equipment meant this first test took place in February, condensing the test timeframe even further. An additional seven tests were arranged up to the beginning of May to refine technology and editorial.

“We’ve probably done as much 3D testing as anybody in those few weeks,” explains Angell. “Essentially we were beta testing the Sony 3D Box, working with lab versions. Sony were incredibly reactive to the observations we made and we learned together what functionality an image processor in 3D needs.”

The biggest concern was whether Sony, along with Canon, could devise a full zooming solution that would feed metadata from the lenses to the processor box (MPE200) and maintain the calibration of both lenses during zoom. This was achieved to HBS’ satisfaction by mid-March.
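
Under the hood, the problem is that a zoom changes each lens’s geometry slightly differently, so any fixed alignment correction drifts as the operator zooms. As a purely hypothetical illustration of what metadata-driven alignment involves (this is a toy model, not Sony’s MPE200 firmware, and all values are invented), corrections measured at a few focal lengths can be interpolated live as zoom data arrives from the lens:

```python
# Toy model of metadata-driven stereo alignment: NOT Sony's MPE200
# firmware. Corrections measured at a few focal lengths are
# interpolated as zoom metadata arrives, so the two eyes stay
# matched across the whole zoom range.
CALIBRATION = [  # (focal_mm, vertical_offset_px, rotation_deg) - invented
    (8.0,  1.2, 0.050),
    (20.0, 0.8, 0.030),
    (50.0, 0.3, 0.010),
    (86.0, 0.0, 0.000),
]

def correction_for(focal_mm):
    """Linearly interpolate the correction applied to one eye's image."""
    if focal_mm <= CALIBRATION[0][0]:
        return CALIBRATION[0][1:]
    if focal_mm >= CALIBRATION[-1][0]:
        return CALIBRATION[-1][1:]
    for (f0, *c0), (f1, *c1) in zip(CALIBRATION, CALIBRATION[1:]):
        if f0 <= focal_mm <= f1:
            t = (focal_mm - f0) / (f1 - f0)
            return tuple(a + t * (b - a) for a, b in zip(c0, c1))

print(correction_for(30.0))  # interpolated (offset_px, rotation_deg) mid-zoom
```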

“The MPE200 3D Box is a game changer,” declares Angell. “Only now is there an alternative to US-based mechanically corrected systems. Now we have a much more straightforward tool which can be adjusted to suit a wider range of broadcast equipment. In the longer run technologies like this will help bring 3D production costs down.”

Workflow
With that crucial aspect in place, attention turned to workflow. Each camera pair is assigned a convergence puller responsible for alignment, set-up of camera and rig, and pulling convergence live.

“You have to treat the convergence puller as part of the workflow,” says Humphreys. “Traditionally an OB camera-op turns up, goes to his camera, checks everything and is ready to go. But in 3D there needs to be more of a relationship between the director/stereographer and the convergence puller, and what his camera is doing technically.”

Angell also views the role of the convergence puller as a craft. “They must intuitively follow objects, like a football, out of the screen and make a decision to pull convergence in a split second. It’s a bit like a camera-op reacting to an event, but the key thing is understanding the impact of what the convergence will achieve. It’s not an engineering position. While some people argue that convergence pullers should be replaced by automated convergence technology for financial reasons, I’m not convinced that’s the right way forward.”
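
For readers new to the geometry behind the job: with a converged camera pair, an object at the convergence distance sits on the screen plane, anything nearer comes out of the screen, and anything farther recedes behind it. A minimal sketch of that relationship, using invented camera figures rather than anything HBS has published:

```python
# A minimal sketch of the geometry a convergence puller manages.
# Interaxial, focal length and sensor width are invented examples.

def screen_parallax(object_dist_m, convergence_dist_m,
                    interaxial_m=0.03, focal_mm=25.0, sensor_width_mm=9.6):
    """On-screen parallax, as a fraction of image width, for an object
    seen by a converged (toed-in) camera pair. Positive values sit
    behind the screen plane; negative values come out of it."""
    disparity_mm = focal_mm * interaxial_m * (
        1.0 / convergence_dist_m - 1.0 / object_dist_m)
    return disparity_mm / sensor_width_mm

# Converged on a player 20 m away: a ball reaching 8 m from the rig
# pops out of the screen, while the far goal at 80 m recedes behind it.
print(f"{screen_parallax(8.0, 20.0):+.3%}")   # -0.586%
print(f"{screen_parallax(80.0, 20.0):+.3%}")  # +0.293%
```

Pulling convergence live is then the act of sliding the convergence distance with the action so that values like these stay inside the depth budget agreed for the production.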

One production unit from Telegenic and another from AMP Visual TV have been air-freighted to South Africa after being outfitted with positions for eight convergence pullers. These will fall under the command of stereographers Peter Howard and Richard Hingley, with match directors Bruno Hullin and Jean-Charles van Kerkoven, both regulars on the Ligue 1 circuit.

Constraints of space within the stadia have limited the number of positions available for placing the rigs. The plan includes main camera wide and main camera tight: traditional camera 1 and 2 angles filming slightly different frames than they would in 2D, but from a much lower position in the stands. The others are goal line left/right (at much the same height as the main two cameras, looking diagonally across the pitch); bench left/right (low angle, either side of the technical area on the pitch); and behind the goal left/right (on the six-yard line, to the side of the goal).

Devising 3D editorial
“Ninety-nine percent of people will see the World Cup in 2D HD so we can’t do anything to risk that coverage,” stresses Angell. “So long as 3D is a premium event proposition, it will face issues in the short term at any stadium which is already full of cameras and paid seating. If we were starting from scratch it would be straightforward, but the 3D element adds another eight positions to the 32 per World Cup venue already dedicated to the 2D host coverage, so it is difficult to get space.”

One rule to have emerged about 3D outside broadcasts is that coverage requires fewer cameras than 2D, with cut-aways and replays not as necessary to tell the story. Nonetheless, no speciality cameras have been included in the 3D mix for the World Cup, with HBS looking to convert occasional 2D shots from the armoury of its other cameras to augment the 3D.

“Inevitably in this development phase of 3D there will be a need to include some 2D,” says Angell. “The technical challenge is finding the right way to do that. For example, if there’s a particular incident (such as the Zinedine Zidane head-butt in the 2006 World Cup Final) which has only been captured on a 2D camera, or a 2D camera has the best angle, then editorially that shot is critical to the story and we would be penalising the viewer if that weren’t included.”

Angell notes that some European broadcasters have cut 2D coverage into 3D OBs, “which works for a short duration where the value of the shot itself is high enough”.

“We have to be judicious about it,” he insists. “The goal is to tell the story as well as possible but that doesn’t mean littering the coverage with 2D shots. Ideally we need a means of cross conversion that retains enough of the 3D image so that it makes sense in the story we tell.”

At the time of writing HBS is still to determine exactly how the 2D-3D conversion will happen, with a number of potential technology solutions still under investigation, including using the video effects function of the Sony vision mixer to create a ‘pseudo-3D’ image from a 2D camera.
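
The ‘pseudo-3D’ idea itself is simple to sketch. One common approach, assumed here for illustration rather than confirmed as HBS’ choice, is to shift the single 2D image horizontally in opposite directions for the two eyes, parking the whole picture on one depth plane behind the screen:

```python
import numpy as np

def pseudo_3d(frame, shift_px=12):
    """Derive a left/right pair from one 2D frame by opposite horizontal
    shifts, placing the whole image on a single depth plane; a positive
    shift puts that plane behind the screen. A plausible approach, not
    necessarily what the Sony mixer's effects engine does."""
    w = frame.shape[1]
    left = np.roll(frame, -shift_px, axis=1)   # content moves left
    right = np.roll(frame, shift_px, axis=1)   # content moves right
    left[:, w - shift_px:] = 0                 # blank the wrapped columns
    right[:, :shift_px] = 0
    return left, right

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in 2D camera frame
left, right = pseudo_3d(frame)
```
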
Quality is paramount in HBS’ editorial conception. “We want to get 3D right and not scare people away from it. None of us who are pioneering live 3D want to create a bad 3D experience or to trade editorial production values against 3D values,” says Angell.

“We experimented with some crazy stuff to see what would work editorially including covering the whole match from a position behind the goal, or making a corner position the master shot. My feeling is that 3D will be viewed in parallel with very, very high production value 2D coverage so we don’t want to do anything too drastic that doesn’t complement the 2D package.

“In fact what we have found is that the classical way of covering football in 2D is not actually a bad base for 3D. Sky Sports came to the same conclusion, which is to modify only slightly a classical camera plan.

“Once we get more settled with 3D and the audience gets more familiar with it, then it’s time to try to change the game a bit and exploit 3D’s potential in new ways.”

Angell and Humphreys concluded that a conservative approach to 3D would be the best option, which meant devising a depth budget that wouldn’t jar the audience’s perception.

“We needed to decide exactly how to manage the depth budget and also how we decide to break it,” explains Humphreys. “Having it fixed at the beginning of production is fine but it’s important that you know how you can break that budget for effect and when it makes sense to do so.”

A football randomly booted into the stands and toward a 3D camera would make an obviously stunning 3D shot, but decisions need to be made about what the outer limits of the convergence should be.

They settled broadly on a depth budget of 2-2.5% positive parallax (into the screen) and 0.5-1% negative (out of the screen), which is a touch more cautious than Sky’s depth settings; but then Sky Sports isn’t also broadcasting to large screens.
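
Those percentages are fractions of image width, so they translate directly into pixels and, more importantly, into physical separation on whatever screen shows the feed. A quick back-of-envelope calculation (the screen widths are illustrative):

```python
WIDTH_PX = 1920  # horizontal resolution of the HD frame

# The budget in pixels: parallax percentages are of image width.
print(0.025 * WIDTH_PX)  # 2.5% positive -> 48 px behind the screen
print(0.010 * WIDTH_PX)  # 1.0% negative -> ~19 px out of the screen

def parallax_cm(fraction, screen_width_m):
    """Physical left/right separation on screen for a given parallax."""
    return fraction * screen_width_m * 100

for width_m in (1.0, 10.0):  # roughly a living-room TV vs a cinema screen
    print(f"{width_m:.0f} m screen: 2.5% parallax = "
          f"{parallax_cm(0.025, width_m):.1f} cm")
```

The same 2.5% that amounts to a harmless 2.5cm on a living-room set becomes 25cm on a large screen, which is why percentages that are comfortable on television have to be treated far more carefully when the same feed plays in cinemas.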

The FIFA World Cup 3D coverage will be delivered to cinema screens and other large-screen venues. Recording is on HDCAM SR dual-stream VTRs (SRW5800) on-site and at the IBC, as well as to an EVS server at the IBC in Johannesburg for long-term archive. Discrete left- and right-eye channels of 1080i/50 HD-SDI are sent back to the IBC over a JPEG2000 contribution network, compressed to 300Mbps. From the IBC it is the responsibility of broadcasters such as ESPN and Spain’s Sogecable to manage the feed.
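
Some rough arithmetic on that contribution link, assuming the quoted 300Mbps covers both eye channels together (the figure could also be read as per eye):

```python
HDSDI_GBPS = 1.485             # nominal 1080i/50 HD-SDI rate, per eye
uncompressed = 2 * HDSDI_GBPS  # left eye + right eye
ratio = uncompressed * 1000 / 300
print(f"~{uncompressed:.2f} Gbit/s uncompressed, so JPEG2000 at "
      f"300 Mbit/s works out at roughly {ratio:.0f}:1 compression")
```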

Graphic considerations
The position and size of graphical elements needed consideration for both types of viewing environment. “Graphics can contribute to the overall 3D impact but if you play with them too much they become enormously distracting,” says Humphreys. “We initially put the clock and score as far into the screen corner as possible, only to find it was more of a problem in the cinema than it was for TV.”

HBS worked with FIFA TV’s graphics supplier Delta-Tre to devise a means of positioning the graphics in real time on the Z axis depending on the shot selection.

“We are working out a set of values for where the graphics are best positioned on screen to give maximum effect without being completely overpowering,” says Angell. “The graphics will generally sit just in front of the screen plane, but if a player runs toward camera we have the possibility of shifting them so we don’t end up with a situation where the graphic appears in front of the action when in 3D terms it should be behind. It’s a subtle trick to pull off.”
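
A hypothetical sketch of the rule Angell describes: keep the overlay just in front of the screen plane by default, but whenever the nearest scene element it overlaps comes closer, push the graphic further forward so the disparity cue never contradicts the occlusion cue. (The function and its values are illustrative, not Delta-Tre’s implementation.)

```python
def graphic_parallax(nearest_scene_parallax, default=-0.002, margin=0.003):
    """Depth at which to composite an overlay, as parallax in fractions
    of image width (negative = out of the screen). The graphic sits just
    in front of the screen plane unless the nearest overlapped scene
    element comes closer, in which case it is pulled further forward."""
    return min(default, nearest_scene_parallax - margin)

print(graphic_parallax(+0.010))  # quiet wide shot: graphic stays at -0.2%
print(graphic_parallax(-0.006))  # player runs at camera: graphic at -0.9%
```
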
“We are acutely aware of the enormous responsibility we have because a lot of people will see 3D through the World Cup for the first time. We want them to walk away from the experience feeling satisfied. We also need to manage expectations. You cannot compare Hollywood movies or games to live sport. With the best will in the world live sport is not going to be Avatar. Its producers had years to work out the 3D effects — we have milliseconds.”