

Is software eating broadcast technology?

Steve Plunkett, chief technology officer at Red Bee Media, looks at the likely challenges and opportunities of a software-based future for broadcast companies.

A little over a year ago, I posed the question via a blog: Is Software Eating Broadcast Tech? This is a play on the phrase coined by Marc Andreessen to highlight how software was disrupting, or eating, many industries. So how does this question look a year later? Based on the number of traditionally hardware-centric vendors announcing or showcasing their software-based product roadmaps at this year’s NAB, it’s looking like a resounding ‘yes’. Particularly in the media processing and playout areas, both historically dominated by hardware-delivered products, everyone seemed keen to stress their future credentials as providers of software solutions that would run, variously, on commodity IT hardware, virtualised infrastructure and ‘the cloud’.

This is all good news, but it is not something that can happen overnight or with little effort and no teething problems. Here are my thoughts on the upsides, downsides and likely difficulties ahead (with a tip of the hat to Sergio Leone).

The good

Software-based products will ultimately deliver much greater flexibility. We can deploy them on infrastructure that suits our purpose (cost, speed, location, vendor preference, etc) and we can do so dynamically – such as using common compute resources to perform an Auto QC at one moment and then re-allocating them to perform a transcode at another. This allows us to either get much greater efficiency from our private infrastructure, reducing its physical density, optimising power consumption and so on, or pay a third-party such as a public cloud provider only for the resources we need at any particular time.
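The dynamic re-allocation described above can be sketched as a shared worker pool that handles whichever job type is queued, rather than dedicating hardware to each function. This is a minimal illustration: the task names and the pool model are invented for this example, not drawn from any real playout or QC product.

```python
from concurrent.futures import ThreadPoolExecutor

def auto_qc(asset):
    # Placeholder for an automated quality-control pass.
    return f"qc:{asset}:ok"

def transcode(asset):
    # Placeholder for a transcode job run on the same workers.
    return f"transcoded:{asset}"

def run_jobs(jobs, workers=4):
    # jobs is a list of (function, asset) pairs. One pool of
    # general-purpose workers stands in for common compute
    # resources; the same capacity serves QC at one moment and
    # transcoding the next, so utilisation follows demand.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(fn, asset) for fn, asset in jobs]
        return [f.result() for f in futures]
```

In a real system the pool would be virtual machines or containers rather than threads, but the principle is the same: capacity is allocated to functions on demand instead of being fixed per function.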

We can perform the deployment much more rapidly than in the physical product world too. Need a new channel? No problem: choose from a menu of service options and it will be ready to receive and publish content in minutes rather than weeks or months. I am hugely oversimplifying this, of course, but you get the idea – in a software-centric world, the effort takes place before a channel deployment, through software integration of channel templates; the act of deployment onto available virtualised hardware is a matter of launching machine images and/or configuration profiles, not a large installation project. This should result in reduced costs, greater business agility and high fives all round.
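To make the template idea concrete, here is a toy sketch of a channel template expanded into per-role launch requests. Every name in it (the roles, image names and sizing) is hypothetical; real playout templates carry far richer configuration.

```python
# A hypothetical channel template: the integration effort happens
# once, up front, when this template is assembled and tested.
CHANNEL_TEMPLATE = {
    "playout": {"image": "playout-engine:2.1", "cpus": 8},
    "graphics": {"image": "branding-gfx:1.4", "cpus": 4},
    "encoder": {"image": "h264-encoder:3.0", "cpus": 16},
}

def deploy_channel(name, template=CHANNEL_TEMPLATE):
    # Deployment then reduces to launching pre-integrated machine
    # images with a per-channel configuration profile, rather than
    # running a physical installation project.
    return [
        {"channel": name, "role": role, **spec}
        for role, spec in template.items()
    ]
```

Each returned dict stands in for a launch request to a virtualised infrastructure or cloud API; the channel name is the only per-deployment variable.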

The bad

Unfortunately, this transition to software nirvana is non-trivial. Writing good quality, performant software to process audio and video, particularly in real-time, is hard. We know this already because such software already lives inside the hardware products we have used historically.

But hardware-delivered products benefit from what is sometimes called ‘mechanical sympathy’ (a term coined by Jackie Stewart to describe a driver being at one with a racing car through an intimate understanding of its hardware). The product designers and developers know exactly what to expect from their chosen hardware stack, and optimise accordingly. In the software-running-on-hardware-of-your-choice world, things are less certain.

Different CPU implementations will provide different functional and performance characteristics through clock speed variations, cache sizes/levels, multi-core architectures, internal bandwidth, specialised accelerators; and that’s just the CPU. Mix in storage IOPS/throughput variations, memory size/bus bandwidth differences, GPUs, LAN throughput, OS and driver versions… the list goes on. In this new world, the software will need to be capable of optimising for generalised rather than specialised hardware and vendors will need both to provide guidance on minimum system requirements and accept that supporting a much more diverse underlying infrastructure will be more complicated and costly.
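The ‘minimum system requirements’ guidance mentioned above might be expressed in software as a simple pre-flight check. The thresholds below are invented for illustration, not real vendor guidance, and a production check would probe far more (GPU, NIC throughput, driver versions and so on).

```python
import os
import shutil

# Hypothetical minimum requirements for a media-processing node.
MIN_CPUS = 8
MIN_DISK_GB = 100

def meets_minimum(cpus=None, free_disk_gb=None):
    # Probe the host when values are not supplied explicitly;
    # explicit arguments make the check easy to test.
    if cpus is None:
        cpus = os.cpu_count() or 0
    if free_disk_gb is None:
        free_disk_gb = shutil.disk_usage("/").free / 1e9
    return cpus >= MIN_CPUS and free_disk_gb >= MIN_DISK_GB
```

The point is less the check itself than what it implies: once the underlying hardware is no longer fixed, software has to discover and adapt to its environment rather than assume it.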

Then there is the question of how best to combine software components to perform multiple operations on the media. For real-time delivery, the serialised model of video processing, necessitated by using discrete hardware components interconnected via SDI, isn’t likely to be optimal in the future. Instead of emulating this architecture by interconnecting virtual machines with IP interfaces (though that is the most viable approach in the short term), it would be attractive to cluster multiple operations on the media being processed while it is in memory on a single machine. This is an obvious benefit of integrated systems such as channel-in-a-box solutions.
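The in-memory clustering idea can be sketched as chaining processing stages over a frame held in local memory, with no serialisation or network hop between operations. The stage functions here are invented placeholders standing in for real decode, graphics and encode steps.

```python
def decode(frame):
    # Placeholder decode stage: marks the frame as decoded.
    return {**frame, "decoded": True}

def insert_graphics(frame):
    # Placeholder graphics/branding stage.
    return {**frame, "graphics": True}

def encode(frame):
    # Placeholder encode stage.
    return {**frame, "encoded": True}

def process_in_memory(frame, stages):
    # Each stage transforms the frame while it stays in memory on
    # one machine, in contrast to the SDI-era model of shipping the
    # signal between discrete boxes (or between VMs over IP).
    for stage in stages:
        frame = stage(frame)
    return frame
```

The contrast with the serialised model is that the cost between stages is a function call, not a transport; the trade-off, as the next paragraph notes, is that all the stages must interoperate inside one integrated system.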

Unfortunately, this is an area lacking in standardisation, or even interoperability, so today that often means taking the whole integrated system from a single vendor even though they are likely to be good at some functions (probably based on their product heritage) and not so good at others. And because increased software complexity often pushes the boundaries of reliability, the result of more functional integration can be system fragility. Not a property that sits well with the high availability familiar to, and expected from, the broadcast television industry.

The ugly

If we know where we want to go (the good) and we know it’s not going to be easy (the bad), then the transition period over the next few years is likely to be ‘ugly’. Customers who have bought into the vision will be impatient to receive it and will put pressure on vendors to deliver soon. The vendors, meanwhile, will inevitably find that it is harder, takes longer and costs more than they estimated (or at least publicly communicated). There is an old software development maxim that applies here: you can have it faster, cheaper or better, but you can choose only two.

The nature of this type of industry disruption tends to lead to a change of fortunes; some companies will fare better than others in the transition. This is all happening against a background of industry consolidation (indeed, it is one of the forces driving that consolidation), and it will be tough for certain categories of company. Large companies may be better able to absorb the costs required to evolve successfully, and small companies may be nimble enough to adapt quickly, but those in the middle might find it a tough road ahead.

Pretty much every vendor at NAB said they would have their new software products ready for business ‘by the end of the year’ if not before (or here already). Let’s make a note in the diary to see how things look a year from now.