360º video tested at Roland-Garros

LiveSphere, the 360º broadcast video application under development in France, has been tested during the Roland-Garros tennis tournament.

While at least a year away from commercialisation as a real-time solution, its developers are using tests such as these as a springboard for highlights and on-demand packages, and say they have had encouraging feedback from sports franchises, including basketball.

“The part that attracts most attention is how the picture stitching is done, but the part that enables us to build a business out of this technology is the efficient means of compressing the entire video sphere and the ability to match it to the best decoding capability of the smartphone or tablet to render the best results,” explained Benoit Fouchard, chief strategy officer, ATEME.

The main focus is on second screens, where viewers might eventually become their own director, moving a tablet or scrolling across the screen to see whichever part of an event they wish.

“There are no solutions in existence that match the needs of the broadcast market,” explained Fouchard. “We don’t intend to offer this as a service model but as an end-to-end solution that puts these tools in the hands of the same production team producing the HD or Ultra HD main feed.”

Compression specialist ATEME is developing LiveSphere in partnership with Kolor, developer of 360º photo and video stitching software, and Finwe, an expert in mobile interaction.

The aim is to create a turnkey broadcast system that captures, stitches, streams and displays video to provide an immersive consumer experience for second screens, including virtual reality headsets – although Fouchard wants to play this aspect down.

“We don’t want to give the impression that 360º video is just for VR where you have to wear headsets, because this is the issue that proved problematic for the public perception of 3D,” he said.

At Roland-Garros, a rig comprising seven HD GoPro cameras was tested in several positions: on the roof of the Philippe Chatrier court, in the crowd, and closer to the players.

“The roof position didn’t work – we had too much sky for a live four-hour broadcast,” he reported. “The best pictures were those captured close to the action of the players. Eventually we would like to fix the camera rig to the referee’s chair.”

Difficulties in positioning the rig for a close-up, central field of view in large stadia make sports like tennis, basketball and even horse racing more likely candidates than soccer at this stage. And while the current technology, which stitches multiple streams together, affords an output resolution below HD, the goal is to acquire at 4K, 8K or greater.

“You need a high enough resolution so that when you zoom you still retain a decent resolution to see the faces of players,” he said. “That demands an enormous amount of data which, in the short term, is not possible and is also why in the short term the application is best suited to smaller arenas.”
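The resolution trade-off Fouchard describes follows from simple arithmetic. A rough sketch, assuming the sphere is stored as an equirectangular frame spanning 360º horizontally (the article does not name the projection used):

```python
def viewport_width(sphere_width_px, fov_deg):
    """Horizontal pixels available in a viewport with the given
    field of view, cut from a 360-degree equirectangular frame."""
    return sphere_width_px * fov_deg / 360.0

# A 4K-wide (3840 px) sphere leaves well under full-HD width
# for a typical 90-degree viewport:
print(viewport_width(3840, 90))  # 960.0
# An 8K-wide (7680 px) sphere roughly restores full-HD width:
print(viewport_width(7680, 90))  # 1920.0
```

Zooming narrows the field of view further, shrinking the available pixels again, which is why acquisition resolution has to be far higher than the display resolution.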

A multiple camera rig is not necessarily a permanent solution, but it’s the best one for now he believes. “Single camera panoramic video doesn’t work for broadcast,” said Fouchard. “It may be okay for Google to film street views with a single camera but the video is all on the same plane and is not of high enough quality for broadcast. In addition, single cameras can be extremely sensitive to changing light conditions. With multiple cameras we are in a much better position to produce a better, more consistent picture when the images are processed together.”

How LiveSphere works

To produce a 360º broadcast, a scene is shot with multiple wide-angle-lens cameras capturing up to 500 megapixels per second. These overlapping views are then stitched into a single high-resolution video sphere using Kolor’s image-recognition algorithms; even the camera pole disappears from the stitched output.
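The quoted capture rate is consistent with the Roland-Garros rig. A back-of-the-envelope check, assuming each of the seven GoPros delivers 1080p at 30 frames per second (the frame rate is an assumption, not stated in the article):

```python
cameras = 7                  # GoPro rig used at Roland-Garros
width, height = 1920, 1080   # one HD frame per camera (assumed 1080p)
fps = 30                     # assumed frame rate

# Aggregate pixel throughput across the rig, in megapixels/second
megapixels_per_second = cameras * width * height * fps / 1e6
print(round(megapixels_per_second, 1))  # 435.5
```

At higher frame rates or resolutions the rig would exceed the 500 Mpx/s figure, so "up to 500 megapixels per second" is in the right range.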

The secret sauce, however, is from ATEME and concerns the ability for viewers to choose and dynamically modify their field of view by either moving the display or touching the screen.

“We cannot stream just one particular field of view to a device and then stream something different when the viewer moves the angle because the delay would make the broadcast unwatchable,” said Fouchard. “The only way we can do this is by streaming and decoding the entire sphere of view in the device and maximising the device’s own decoding and display capabilities.

“The technology has generated a lot of excitement, but we are being cautious,” he added. “We are trying to build it as a long term professional broadcast application not as a gadget or gimmick.”
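Decoding the full sphere on the device means a change of viewing angle is a purely local pixel lookup, with no round trip to the server. A minimal sketch of that lookup, assuming an equirectangular frame layout (ATEME's actual projection and player internals are not described in the article):

```python
def direction_to_pixel(yaw_deg, pitch_deg, frame_w, frame_h):
    """Map a viewing direction (yaw in [-180, 180] degrees,
    pitch in [-90, 90] degrees) to pixel coordinates in an
    equirectangular frame. Because the whole sphere is already
    decoded locally, this lookup costs nothing over the network."""
    u = (yaw_deg + 180.0) / 360.0 * (frame_w - 1)   # horizontal
    v = (90.0 - pitch_deg) / 180.0 * (frame_h - 1)  # vertical
    return round(u), round(v)

# Looking straight ahead in a 3840x1920 sphere lands at the
# centre of the frame:
print(direction_to_pixel(0, 0, 3840, 1920))  # (1920, 960)
```

A real player would render a whole rectilinear viewport around that point on the GPU, but the principle is the same: every possible view is already present in the decoded sphere.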

To this end the developers are exploring new advertising properties based on the technology, including targeted hotspots of logos and brands or pop-ups activated when the viewer scrolls over the screen. “The type of advertising you can offer in this type of environment is really new,” he said.

Other use cases being explored include enabling music fans to select and zoom in on one artist at a concert, stay on that artist or turn around and watch the audience.

Reality TV viewers might navigate on and off stage, take the public’s or the presenter’s view or have a look at their favourite character, regardless of camera angle.

Crime scene investigation units in Holland are reportedly using such technology as an aid to crime scene reconstruction. Property developers are also keen on it as a marketing tool.

France’s Kolor markets Autopano Video, a software application that uses the company’s algorithms to apply image stitching to video content.

Founded in 2006, Finwe offers software products and services for mobile panoramic video including an interactive panorama video player.

Please note that the images in support of this article are an effect only and not how the viewer would see the final rendered LiveSphere output.