Panoramic interactive sports video action

The EU-funded FascinatE project, which will allow viewers to customise live coverage of sports by selecting different camera angles, will be shown at IBC.

A TV service that allows viewers to customise live coverage of sports by selecting different camera angles will be shown at IBC, writes Adrian Pennington. The EU-funded FascinatE project will demonstrate its latest advances using footage from last season’s Chelsea vs Wolverhampton Premier League football match.

At the heart of the project is the Omnicam (pictured), developed at German research lab Fraunhofer HHI, which uses six HD cameras on a circular rig and software that stitches the images together to form a 180° panorama. Using a remote control, users can navigate their way around the 6k x 2k resolution panoramic video.
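In broad strokes, that remote-control navigation amounts to cutting a moving window out of the stitched image and scaling it to the viewer's screen. The sketch below illustrates the idea only; the 6,000 x 2,000 panorama size, the roughly 90° default view and the simple crop-and-scale model are assumptions for illustration, not FascinatE's actual rendering pipeline.

```python
# Illustrative sketch only, not FascinatE code: pulling a 16:9 viewport out
# of a stitched 180-degree panorama, as remote-control navigation might.
# The 6000x2000 panorama size and the crop-and-scale model are assumptions.
import numpy as np
import cv2

PANO_W, PANO_H = 6000, 2000      # assumed panorama resolution (~6k x 2k)
FOV_DEG = 180.0                  # horizontal field of view of the stitched image

def viewport(pano, pan_deg, zoom, out_w=1280, out_h=720):
    """Crop a window centred at pan_deg (0 = middle of the panorama) and
    scale it to the output size; zoom > 1 narrows the crop for a closer view."""
    px_per_deg = pano.shape[1] / FOV_DEG
    centre_x = int((pan_deg + FOV_DEG / 2) * px_per_deg)
    crop_w = int(pano.shape[1] / (2 * zoom))               # ~90 degrees at zoom 1
    crop_h = min(int(crop_w * out_h / out_w), pano.shape[0])
    x0 = int(np.clip(centre_x - crop_w // 2, 0, pano.shape[1] - crop_w))
    y0 = int(np.clip(pano.shape[0] // 2 - crop_h // 2, 0, pano.shape[0] - crop_h))
    crop = pano[y0:y0 + crop_h, x0:x0 + crop_w]
    return cv2.resize(crop, (out_w, out_h), interpolation=cv2.INTER_LINEAR)

# Example: pan 30 degrees right of centre at 2x zoom on a blank panorama.
pano = np.zeros((PANO_H, PANO_W, 3), dtype=np.uint8)
view = viewport(pano, pan_deg=30.0, zoom=2.0)
```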

FascinatE (Format-Agnostic SCript-based INterAcTive Experience) is a four-year (2010-2013), €9.5m project backed by a consortium of 11 European organisations to develop a system for an ultra-high resolution television service. Members include Alcatel-Lucent, Technicolor and the BBC.

“Sport was the first opportunity where we could really create some interesting content for testing,” explained Omar Niamut, a research scientist on the project. “If you want to create the idea of being at an event then having an overview of an arena is essential. It is not possible to capture everything with existing cameras, no matter how many populate an event. Immersive systems such as FascinatE capture everything and let the viewer direct their gaze where they want.”

One challenge is integrating different camera feeds and rendering them into one output stream, with BBC R&D taking the lead on this.

“The underlying concept we are adopting is that of format-agnostic production,” explained BBC R&D’s Graham Thomas. “Until now we’ve had to produce an entire production in a single format – SD, HD, 4×3, widescreen and so on – but why decide the format at the time of production? Why not decide at the time of display, when there are myriad possible displays, with screen sizes from mobile to immersive projection systems? We are attempting to combine images from broadcast cameras with zoom lenses and a panoramic camera so that all aspects of an event are covered, but in such a way that lets the viewer decide what they want to watch.

“If a viewer wants a close up of action on the pitch we want a smooth electronic zoom starting wide from the Omnicam and moving seamlessly to the close-up with no jump in resolution or colour rendering between the cameras.”
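The handover Thomas describes can be pictured as a crossfade triggered once the virtual zoom passes the point where the panoramic crop runs out of resolution. The toy sketch below shows only that blend; the handover point and blend window are arbitrary, and it deliberately ignores the geometric alignment and colour matching the real system needs, the very problems described next.

```python
# Toy crossfade only, not the project's renderer: as the virtual zoom passes
# a handover point, blend from the upscaled Omnicam crop to the broadcast
# close-up. The handover value and blend window below are arbitrary, and no
# geometric alignment or colour matching is attempted.
import numpy as np
import cv2

def blended_frame(omnicam_crop, closeup_frame, zoom, handover=3.0, window=0.5):
    """Show the panoramic crop below the handover zoom, the close-up camera
    above it, and a linear blend across a small window in between."""
    h, w = closeup_frame.shape[:2]
    wide = cv2.resize(omnicam_crop, (w, h))                  # match output size
    alpha = float(np.clip((zoom - (handover - window)) / (2 * window), 0.0, 1.0))
    return cv2.addWeighted(wide, 1.0 - alpha, closeup_frame, alpha, 0.0)

# Example with dummy frames: halfway through the handover.
crop = np.zeros((720, 1280, 3), dtype=np.uint8)
close = np.full((1080, 1920, 3), 255, dtype=np.uint8)
frame = blended_frame(crop, close, zoom=3.0)                 # 50/50 blend
```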

The tests at Chelsea illustrated how tricky this could be to achieve. When the Omnicam was placed adjacent to a conventional camera and an attempt was made to hand coverage over seamlessly between them, there was a noticeable parallax shift.

“We could see the ball from the HD camera but not in the image from the Omnicam simply because of its slightly different position [where] the ball was obscured by a player,” said Thomas.

Salford University, another project partner, is contributing its work on wavefield synthesis and 3D audio to the project, attempting to match the direction of the audio to changing video angles – so that when a user zooms in on a ball being kicked in a football match, they hear the sound of that kick coming from the right direction.
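Wavefield synthesis drives whole loudspeaker arrays, but the underlying idea of steering sound to follow the picture can be illustrated with something far simpler, such as a constant-power stereo pan keyed to the viewport direction. The sketch below is that simplification, not Salford's method; the angles and pan law are assumptions.

```python
# A far simpler stand-in for wavefield synthesis: a constant-power stereo
# pan that steers a sound towards where its source sits relative to the
# centre of the current viewport. Angles and the pan law are assumptions.
import numpy as np

def pan_to_view(mono, source_az_deg, view_az_deg, view_fov_deg=90.0):
    """Pan a mono signal left/right according to the source's offset from
    the viewport centre, expressed as a fraction of the half field of view."""
    offset = (source_az_deg - view_az_deg) / (view_fov_deg / 2)   # -1 .. +1
    offset = float(np.clip(offset, -1.0, 1.0))
    theta = (offset + 1.0) * np.pi / 4                            # 0 .. pi/2
    left, right = np.cos(theta), np.sin(theta)                    # constant power
    return np.stack([mono * left, mono * right], axis=-1)

# Example: the kick happens 20 degrees to the right of the view centre.
kick = np.random.randn(48000).astype(np.float32)   # one second of placeholder audio
stereo = pan_to_view(kick, source_az_deg=50.0, view_az_deg=30.0)
```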

Using a virtual camera composed from the panoramic image and output from other cameras, “you could have a channel that just focussed on one player on the pitch throughout the game,” added Niamut.

Another area of interest is presenting content so that it automatically reframes itself to the display the viewer is watching on. Austria’s Joanneum Research will show how its image analysis algorithms automatically crop the panoramic video to fit large public projection screens, TVs and mobile devices. The software can also be used to create virtual camera angles. Alcatel-Lucent and Dutch research outfit TNO, meanwhile, are exploring how networks with intelligent media processing components might adapt the content to suit different device types.
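At its core the reframing problem is choosing a crop with the target display's aspect ratio around whatever the image analysis deems interesting. The sketch below shows only that geometry, with the region of interest supplied by the caller rather than detected; it is not Joanneum's algorithm.

```python
# Geometry-only sketch of the reframing idea, not Joanneum's algorithm: the
# region of interest is simply passed in by the caller rather than detected
# by image analysis, and the crop uses the full panorama height.
def reframe(pano_w, pano_h, roi_x, roi_y, target_aspect):
    """Return (x0, y0, w, h) of a crop centred on the region of interest
    with the requested width/height ratio, clamped to the panorama bounds."""
    crop_h = pano_h
    crop_w = min(int(crop_h * target_aspect), pano_w)
    crop_h = int(crop_w / target_aspect)
    x0 = min(max(roi_x - crop_w // 2, 0), pano_w - crop_w)
    y0 = min(max(roi_y - crop_h // 2, 0), pano_h - crop_h)
    return x0, y0, crop_w, crop_h

# The same 6000x2000 panorama reframed for a 16:9 TV and a 4:3 display.
print(reframe(6000, 2000, roi_x=4200, roi_y=1000, target_aspect=16 / 9))
print(reframe(6000, 2000, roi_x=4200, roi_y=1000, target_aspect=4 / 3))
```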

The final area of innovation is on the end-user side. Here Technicolor has devised an interactive rendering technique to zoom into parts of the image. Researchers at the University of Barcelona have added gesture-based control using a Microsoft Kinect so that users can interact with the high-resolution field of view.
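How a tracked hand might steer the viewport can be sketched without any Kinect-specific code: normalise the hand position, apply a dead zone so the view stays still when the hand is at rest, and map the remainder to pan and zoom rates. The mapping below is entirely hypothetical, not the Barcelona implementation.

```python
# Entirely hypothetical mapping (no Kinect SDK shown): turn a tracked hand
# position, normalised to [-1, 1] on each axis, into pan and zoom rates,
# with a dead zone so the view holds still while the hand is roughly at rest.
def gesture_to_command(hand_x, hand_y, dead_zone=0.15,
                       max_pan_deg_s=30.0, max_zoom_s=0.5):
    """Horizontal hand movement pans the viewport, vertical movement zooms."""
    def shaped(v):
        if abs(v) < dead_zone:
            return 0.0
        return (abs(v) - dead_zone) / (1.0 - dead_zone) * (1.0 if v > 0 else -1.0)
    return {"pan_deg_per_s": shaped(hand_x) * max_pan_deg_s,
            "zoom_per_s": shaped(hand_y) * max_zoom_s}

print(gesture_to_command(0.6, -0.1))   # pan right, hold zoom
```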

“If everyone interacts with the content directly from one server at the same time, the system becomes unfeasible, so a main challenge is scaling this up to multiple users,” explained Niamut. “We need to consider CDNs or a combination of broadcast and unicast.”
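One commonly discussed way to make many simultaneous personalised views scale is to split the panorama into tiles on a CDN and have each client fetch only the tiles its viewport touches. Whether FascinatE takes exactly that route is not stated here; the sketch below just shows the tile-selection geometry, with the tile and panorama sizes assumed.

```python
# Tile-selection geometry only; the tile grid, sizes and the use of a CDN
# here are assumptions for illustration, not a description of FascinatE's
# delivery chain. Each client would request just the tiles its view overlaps.
def tiles_for_viewport(x0, y0, w, h, tile_w=960, tile_h=540,
                       pano_w=6000, pano_h=2000):
    """Return the (column, row) indices of every tile the viewport touches."""
    cols = range(x0 // tile_w, min((x0 + w - 1) // tile_w, pano_w // tile_w) + 1)
    rows = range(y0 // tile_h, min((y0 + h - 1) // tile_h, pano_h // tile_h) + 1)
    return [(c, r) for r in rows for c in cols]

# A 1500x900 viewport near one goalmouth touches only four of the tiles.
print(tiles_for_viewport(x0=4200, y0=600, w=1500, h=900))
```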

The next test shoot is likely to be of a concert or musical event, he added: “Anything that can benefit from high resolution and also from offering people an overview and a personalised view.”

EU projects such as this are required to show a unifying ambition that ties together a number of technologies, but the individual research areas – gesture tracking, intelligent networks, the Omnicam – are more likely to be used in other applications before this idea gets beyond the experimental stage.

www.fascinate-project.eu