UHD, QC and AVB: unscrambling IBC

Excitable sellers set out new tech faster than they could answer the deluge of questions arising from standards makers and prospective buyers. If ever there was an event in need of a roadmap, IBC2013 was it. Mark Hill (pictured) is our guide.

With the fire-sale on 3D TV over, the shelves at the Amsterdam RAI had been swiftly restocked with 4K TV — or ‘UHD-1’, in European TV parlance. Like some ‘pop-up’ store, here were UHD-1 cameras, servers, switchers, encoders and (the suspected root cause of all the disruption) displays.
UHD is set to fare better than 3D because, for TV professionals, it is essentially more of what they already handle day to day: no Z-axis, no dual optics, no dual paths and no unfashionable spectacles. Despite what many on the show floor would have us believe, however, we are a long way from 4K being ready for primetime.
Beyond our mutual nod to, and liking for, the obvious spatial resolution improvement of UHD-1 (let’s agree not to mention UHD-2 specifically) over today’s HD, there is a lot of work to do if it is to be put into practice. We cheer, as UHD brings with it the death knell for our elderly, analogue compression fiend, interlace. All UHD pictures will be displayed progressive scan and, crucially, acquired using cameras capturing whole frames at a time rather than making a frame from two fields, captured successively.
Of course, this will not relieve the industry of the challenge of having to add into the edit legacy content acquired as interlaced and in inferior resolutions. Moving the job of de-interlacing (and up-scaling) of legacy content wholly to the broadcaster environment, rather than the domestic one, will at least guarantee that far more money gets spent on these demanding processes, with better results.

Quality brings quantity

Committing to gathering frames brings with it an opportunity to set the frame rate. In Europe, there is a good measure of agreement that the minimum frame rate for UHD should be 50fps, while as much as 300fps has been suggested as beneficial. If UHD is going to take off anytime soon in Europe, 50fps looks like a reasonable starting point, with perhaps 120fps as a future (worldwide harmonised?) step.
UHD colourimetry is also set to benefit from an update, now that we are no longer constrained by the chemistry of the glowing phosphors of the CRT. An opportunity also exists to increase luminance and colour signal resolution from the current 8/10-bit norm to 12- or even 16-bit. Such a move would pave the way for High Dynamic Range (HDR) working.
Crudely summarising, the HD to UHD-1 upgrade equation looks something like: [increase spatial resolution = 4x bitrate] x [capture pictures faster = 2x to 12x bitrate] x [HDR/increased bit depth = 1.25x to 2x bitrate] = 10x to 96x the bitrate required. This might be compared with a figure of 2x for moving from interlaced- to progressive-scan HD, all other things being equal.
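The back-of-envelope sums above can be sketched in a few lines of Python. The three factors are treated as independent multipliers, using only the ranges quoted in this article rather than measured figures:

```python
# Rough HD -> UHD-1 bitrate multiplier, per the article's back-of-envelope sums.
# The factor ranges are the article's own estimates, not measured values.

def uhd_multiplier(spatial, frame_rate, bit_depth):
    """Multiply the three independent upgrade factors together."""
    return spatial * frame_rate * bit_depth

# Best case: 4x pixels, 50 fps (2x), 10-bit (1.25x)
low = uhd_multiplier(4.0, 2.0, 1.25)
# Worst case: 4x pixels, 300 fps (12x), 16-bit (2x)
high = uhd_multiplier(4.0, 12.0, 2.0)

print(low, high)  # 10.0 96.0
```

The spread between the two cases shows why the frame-rate decision, not the pixel count, dominates the cost of the upgrade.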

Compression to the rescue

UHD represents a massively increased demand for bandwidth, processing power and storage volume, all of which add up to money, but rarely in a simple, linear fashion. To lessen the shock on the bottom line, we look again to video compression technologies to reduce the number of bits to be processed, stored and transported at the many points in the ‘glass to glass’ experience. Even at this early stage, it seems likely that base-level UHD-1 pictures can be coded for transmission to occupy little, if any, more bandwidth than today’s HD pictures.
Those with 1.5G HD infrastructures at the heart of their station must now be wondering whether there will ever be a need to upgrade to 3G-capable ones. 3G infrastructures can accommodate HD (1080p50/60) working, but if your operation is based on HD (1080i25/30), are you really going to move to HD progressive as your next upgrade? Despite the appearance on the show floor of ‘6G’ coax-based interfaces, signal bandwidths for Ultra HD suggest that the Serial Digital Interface (SDI) has run out of road. Yes, for UHD you could start cabling equipment up with multiple paths of 3G and 6G coax interfaces, but we gave such practices up a long time ago with analogue component video and surely have no desire to return to those days.
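A quick calculation illustrates why single-link SDI runs out of road at UHD. This is a minimal sketch, assuming 4:2:2 sampling at 10-bit and ignoring blanking and ancillary data, so the figures are indicative rather than exact line rates:

```python
# Uncompressed video payload (Gb/s), assuming 4:2:2 10-bit sampling.
# Blanking/ancillary overheads are ignored, so figures are indicative only.

def raw_bitrate_gbps(width, height, fps, bits=10, samples_per_pixel=2):
    # 4:2:2 carries ~2 samples per pixel (Y plus alternating Cb/Cr)
    return width * height * samples_per_pixel * bits * fps / 1e9

hd_1080p50 = raw_bitrate_gbps(1920, 1080, 50)  # fits a single 3G-SDI link
uhd1_50 = raw_bitrate_gbps(3840, 2160, 50)     # beyond even a 6G link

print(round(hd_1080p50, 2), round(uhd1_50, 2))  # 2.07 8.29
```

At roughly 8.3 Gb/s for the payload alone, UHD-1 at 50fps needs multiple 3G or 6G coax paths, which is exactly the analogue-component-style cabling the article argues against.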
UHD heralds the move away from broadcast industry proprietary, copper cable-based SDI and bespoke networks and towards the increasingly fibre-based interfaces and generic data networks of the IT world. This will represent a step change for systems integration (SI) companies and the installations they deploy in future.
Interconnects in fixed installations will move from cut-to-length, terminated-on-demand copper cables, BNC plugs and jackfields towards off-the-shelf, pre-terminated lengths of single- and multi-mode fibre, plugged into optical interface ports. Faster, lighter, more eco-friendly. With this change at the physical layer comes the adoption of further layers of the OSI seven-layer model, taking us firmly into the world of IP/IT.

Quality Control

IBC saw the launch of the EBU ‘periodic table’ of QC criteria for file-based AV content. Circa 160 tests have been scoped, with 50 or so defined in detail. Identified tests fall into four categories – Regulatory, Absolute (against a particular standard), Objective (quantifiable, but not necessarily having an applicable standard) and Subjective (may require human eyes and ears) – and each exists in one or more layers – Baseband, Bitstream, Wrapper and Cross-check (that values in the other three layers agree).
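The taxonomy described above lends itself to a simple data model: every test carries one category and one or more layers. A minimal sketch follows; the example test name is illustrative, not taken from the EBU list:

```python
# Minimal model of the EBU QC 'periodic table' taxonomy described above:
# each test has exactly one category and one or more layers.
# The example test below is illustrative, not from the actual EBU list.

CATEGORIES = {"Regulatory", "Absolute", "Objective", "Subjective"}
LAYERS = {"Baseband", "Bitstream", "Wrapper", "Cross-check"}

def qc_test(name, category, layers):
    """Validate a QC test definition against the taxonomy."""
    assert category in CATEGORIES, f"unknown category: {category}"
    assert set(layers) <= LAYERS, f"unknown layer(s): {layers}"
    return {"name": name, "category": category, "layers": list(layers)}

# Hypothetical example: an audio loudness check classified as Regulatory
loudness = qc_test("Audio loudness check", "Regulatory", ["Baseband"])
print(loudness["category"])  # Regulatory
```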
Ten vendors are actively participating in the surrounding project, and the results are expected to dovetail into the next part of the Framework for Interoperability of Media Services (FIMS) initiative and become part of the UK Digital Production Partnership (DPP) AS-11 standard for programme delivery to the major UK broadcasters.
Ongoing arguments about how much longer movie images will be captured on film media and whether cinema audiences should continue to be served the ‘film look’ aside, 4K/UHD-1 brings with it the best opportunity yet for convergence of master image capture between movies and TV.
Movies form an obvious and ongoing source of potential content for UHD-1 broadcasters. The source most often suggested at IBC, however, was live sport. The exchange of signals at baseband has always been a staple of live sports production: equipment interfaces are (or have been to date) ubiquitous, and content moves, predictably, in realtime.
One of the great hopes for file-based working has always been that it would somehow be faster than baseband (and tapes) — faster than realtime. In practice, with the large file sizes associated with high definition content, this hope is not always realised. The cumulative latencies of nonlinear editing, transcoding, file-packaging, staging, QC, network transfers, more staging, anti-virus checking and yet more staging can be considerable, constraining time to market for, say, a highlights edit of a football match. Where significant collaborative working between production partners and service providers is involved, this is already a concern with HD pictures. Expect it to become a real issue with UHD.

Simple things take time

The playout/master control environment was always the obvious candidate as an early adopter of file-based working. For most broadcasters, the replacement of VTR-based technology with video servers for short-form (eg commercial spot) replay has long since happened, and was accomplished relatively swiftly. In 2013, delivery of short-form content as files is the norm; however, much trafficking/delivery of long-form (programme) content remains on videotape. So what news of the workaday replacement for these transmission-ready videotapes, with their attached record reports?
One notable success in this regard is that of the UK Digital Production Partnership (DPP) in its drawing together of major UK broadcasters around a common (and refreshingly concise) set of technical requirements for file-based programme delivery. Key to the success of the DPP AS-11 standard, as it is known, is its simplicity and recognition of the crucial part that metadata has to play in the modern, broadcast ecosystem. Boiling down the ‘standards soup’ into only two base format choices and utilising the combined experience of practitioner participants has delivered an accessible and very practicable result.
While on the subject of the DPP, at IBC it also released the last in its trilogy of ‘Revolution’ reports, this time aimed at shining the spotlight on another piece of Amsterdam fluff – the cloud.
Where once we used to debate what was meant by Media Asset Management (MAM), we now do the same for the cloud. Should the cloud be characterised as something technical – storage, networks, software applications – or as something more editorially friendly? Either way, the vibe from visitors on the show floor and the conclusions of the report were in general agreement. The forecast for the usefulness of the cloud come IBC2014-time remains, well, cloudy.

(Mark Hill is an industry consultant at Ixedia Ltd,