Could artificial intelligence replace the need for camera people in live sport?

TVBEurope meets Gal Oz, co-founder and CTO of AI video technology company Pixellot, to learn how they're transforming the way sport is produced

As it approaches its 10th anniversary next year, artificial intelligence (AI) video technology company Pixellot continues to go from strength to strength.

Last month the company announced it had raised $161 million in additional funding as it looks to accelerate its global expansion and develop new innovations.

The company’s technology automatically captures live footage for broadcast and social content, as well as providing detailed data analytics, and the system is currently in use at 23,000 courts and venues across 70 countries.

Pixellot was founded by Dr Miky Tamir and Gal Oz in April 2013, when they wanted to try capturing live sport without the need for several cameras following the action. “Pixels were getting better and better and cheaper and cheaper, and we thought ‘let’s capture the full game all the time, let’s automate production’,” Oz tells TVBEurope. “It took us about a year to develop, and to begin with we had some more advanced production flows that required a human in the loop. But eventually we found that the sweet spot is not the panoramic capture but the automatic part of it, using AI to understand the game and follow the production.”

Currently Pixellot’s tech is used across 17 different sports, including football, basketball, American football, ice hockey, tennis, and cricket. The idea is that a broadcaster or sports league can install one Pixellot device at the venue, which uses several cameras to capture the whole field. “It can be one camera, two, four,” explains Oz, “in the past we’ve had up to 12 cameras in the device.

“In a typical device, you have two cameras. We have several models, some work with a big field, some a smaller field, and we have different levels of product, but the common theme between them is that there are always fixed cameras that do not move during the game, and they cover the whole field.”
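Pixellot has not published how those fixed cameras are combined into a single wide view, but the basic idea can be illustrated with a minimal sketch, assuming two synchronised heads at the same resolution and the open-source OpenCV library. The camera indices and the naive side-by-side stitch below are placeholders for illustration, not the company’s pipeline:

```python
# Hypothetical sketch: combining two fixed, side-by-side camera feeds into one
# wide frame. Pixellot's real stitching (lens correction, blending, colour
# matching) is not public; this only illustrates the fixed-camera idea.
import cv2
import numpy as np

left = cv2.VideoCapture(0)   # assumed device index for the left fixed head
right = cv2.VideoCapture(1)  # assumed device index for the right fixed head

while True:
    ok_l, frame_l = left.read()
    ok_r, frame_r = right.read()
    if not (ok_l and ok_r):
        break

    # Naive panorama: place the two views side by side (assumes equal
    # resolutions). A production system would undistort, align and blend
    # the overlap region instead of simply concatenating.
    panorama = np.hstack([frame_l, frame_r])

    cv2.imshow("panorama", panorama)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

left.release()
right.release()
cv2.destroyAllWindows()
```

Because the cameras never move during a game, the geometric relationship between them only has to be calibrated once at installation, which is part of what keeps the on-site setup simple.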

What makes Pixellot’s device stand out in terms of outside broadcasting is that it doesn’t need an OB van. The company installs a unit at the venue that links to an on-site computer, which does all the processing and generates the HD feed. “That is then fed to the cloud, and from the cloud it can go to a client’s OTT channel, YouTube, Facebook, whatever they want,” adds Oz. “We can send it to an OB van if needed, but most of our data is distributed over digital platforms.”
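Oz doesn’t detail how the cloud hand-off is implemented, but the “one feed in, many platforms out” step he describes can be sketched roughly as follows, assuming the feed is relayed over RTMP using the widely available ffmpeg tool. The source file and destination URLs are placeholders, not real endpoints:

```python
# Hypothetical sketch of fanning out a single HD feed to several platforms
# (client OTT channel, YouTube, Facebook, etc). This is an illustration of
# the distribution step, not Pixellot's actual cloud workflow.
import subprocess

SOURCE = "venue_feed.mp4"  # placeholder for the HD feed from the on-site unit

DESTINATIONS = [
    "rtmp://ingest.ott.example/live/STREAM_KEY",       # placeholder OTT ingest
    "rtmp://ingest.youtube.example/live2/STREAM_KEY",  # placeholder YouTube ingest
    "rtmp://ingest.facebook.example/rtmp/STREAM_KEY",  # placeholder Facebook ingest
]

def fan_out(source: str, destinations: list[str]) -> list[subprocess.Popen]:
    """Start one ffmpeg relay per destination, copying the stream without re-encoding."""
    procs = []
    for url in destinations:
        cmd = [
            "ffmpeg", "-re", "-i", source,  # read the feed at its native rate
            "-c", "copy",                   # pass the encoded stream through untouched
            "-f", "flv", url,               # RTMP delivery expects FLV packaging
        ]
        procs.append(subprocess.Popen(cmd))
    return procs

if __name__ == "__main__":
    for proc in fan_out(SOURCE, DESTINATIONS):
        proc.wait()
```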

In general, the device uses cameras developed by Pixellot’s 20-strong innovation team, but the company does occasionally use an off-the-shelf camera. “We look at the angles of the camera, and the environmental behaviour of the camera, because we need to make the installation simple,” adds Oz. “We want the camera to last 7-10 years once it’s installed in the venue, and installation is not much more complicated than installing a security camera.”

As the device captures the whole playing field for whichever sport it’s working with, Pixellot’s artificial intelligence then analyses the game to determine where the ball is and where the players are. “We use a different AI for a basketball game compared to a football game,” states Oz. “It mimics the camera person’s decisions as to where to focus. We’re generating a live TV production but with no camera people, no director and at a very large scale. Today we are producing more than 300,000 hours of live content per month. And all of it is done without any human intervention. It’s all fully automated.”
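Pixellot’s models and framing logic are proprietary, but the “virtual camera operator” idea can be illustrated with a small hypothetical sketch: given a wide panoramic frame and a stubbed-out ball detector, crop a fixed 16:9 window whose centre is smoothed over time, so the output pans like a human operator rather than snapping to every detection. The detector, window size and smoothing factor below are all assumptions for illustration:

```python
# Hypothetical sketch of a "virtual camera operator": crop a 16:9 window from
# a panoramic frame, smoothly tracking a detected ball position. Pixellot's
# sport-specific AI and production logic are not public; the detector here is
# a stub.
import numpy as np

def detect_ball(frame: np.ndarray) -> tuple[int, int]:
    """Placeholder for a sport-specific detector returning (x, y) in pixels."""
    h, w = frame.shape[:2]
    return w // 2, h // 2  # stub: pretend the ball is at the centre of the frame

def virtual_camera(frame: np.ndarray,
                   prev_center: tuple[float, float],
                   out_w: int = 1920, out_h: int = 1080,
                   smoothing: float = 0.9) -> tuple[np.ndarray, tuple[float, float]]:
    """Crop a fixed-size window around a smoothed ball position.

    Assumes the panoramic frame is at least out_w x out_h pixels.
    """
    h, w = frame.shape[:2]
    bx, by = detect_ball(frame)

    # Exponential smoothing mimics a camera operator's gradual pan rather
    # than jumping instantly to every new detection.
    cx = smoothing * prev_center[0] + (1 - smoothing) * bx
    cy = smoothing * prev_center[1] + (1 - smoothing) * by

    # Clamp the crop so it stays inside the panorama.
    x0 = int(min(max(cx - out_w / 2, 0), w - out_w))
    y0 = int(min(max(cy - out_h / 2, 0), h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w], (cx, cy)
```

In this framing, the “production” is just a sequence of crop decisions made per frame, which is what allows it to run at scale without a director or camera operators.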

“If you look at all the sports leagues, tournaments and so on in the world, less than one per cent is currently being televised. We are mainly focused on the 99 per cent which is not televised,” he adds. “So, I don’t think that there are many camera operators who will lose their job because of us. We’re giving content owners a lot more video to build content from. It’s important to say that we are not replacing production. We are adding a lot more content and helping to democratise live sport.”

Having spent the last nine years developing its technology to reach this point, Oz says the company expects to spend the next nine getting to where it needs to be. “It’s like a never-ending story,” he continues. “Today we’re supporting 17 types of sport, last year it was 16 and before that it was 15. It usually takes us two to four months to develop the AI for a new sport. We usually start off installing a beta version and then after a few months we can collect more data and fine-tune it. We’re also always working on updating the technology.

“Football was one of the first sports we worked with and last year we did a big project on changing the technology we use. So it’s a never-ending story. In order to be at the forefront and be the best, we need to continue developing the AI, both for new sports and for sports we’re already working with.”