At TVBEurope’s recent IT Broadcast Workflow conference, the subject of Quality Control was frequently mentioned as a major element in the digital workflow process. However, contacting suppliers of ingesting and archiving products for this Forum has revealed that QC is often an add-on to other solutions rather than being an end in itself. As a result, some of those vendors felt unable to take part. So, should QC be a ‘standalone’ process? And with loudness being an ongoing issue, how effective are QC systems at handling the matter?
We discuss these and other issues with (in alphabetical order) David Ackroyd, Business Development director at Omnitek; Simon Begent, sales director of VidCheck; Bruce Devlin, CTO at AmberFin (pictured); Niklas Fabian, product manager, Rohde & Schwarz DVS; Manik Gupta, product architect, Interra Systems; K.J. Kandell, senior director, Media and Entertainment Products at Nexidia; Robert Millis, senior product manager – Compressed Server Systems at Harris Broadcast; MC Patel, CEO, Emotion Systems; Vikas Singhal, executive director, Business Development at Veneratech; and Owen Walker, head of Product Management, Root6 Technology.
Should QC be a ‘standalone’ operation or form part of other processes in the ingest-to-delivery chain?
Ackroyd: Random sample-based QC can provide a good balance between 100% QC on all ingest and ad-hoc QC. This keeps suppliers aware of the need to comply, yet can minimise the time and expense of providing a full QC net.
Begent: QC should be an integral part of any workflow — in post production during edit, and after the file has been rendered before delivery to the broadcaster. For the broadcaster, it should take place on ingest, after transcode, and on the playout server before broadcast.
Devlin: It seems only the media industry thinks QC can be a standalone process. When you buy milk, you look at the metadata on the carton and if the date is greater than the date on your clock, you assume that all sorts of QC has been done upstream. Then your final ‘sniff test’ QC will be sufficient to ensure that drinking the milk won’t kill you.
This QC metadata is, if you like, the ultimate in metadata compression. By comparison, many in the media industry want a software QC tool to dismantle a 100GB media file, analyse the video, audio, VANC, metadata, wrapper, synchronisation, loudness, blockiness, colourimetry and a hundred other parameters in zero time for zero cost. And then report the results with a single traffic light that covers all the business risks of the processes that have taken place upstream.
Unrealistic? I think so, too. We need to think of QC as the application of periodic measurements to assure the quality of material through a cascade of processes. QC is not test and measurement. AmberFin’s QCML document can take multiple tools with multiple measures at multiple phases of the lifecycle of the material and portray them in a way that requires almost no training to interpret. Sounds too good to be true, but our customers see the benefits!
Fabian: We would set visual QC apart from technical QC. While visual QC focusses on the actual viewing impression, technical QC covers the measurable parameters of the file. Thus, technical QC is mandatory for all tactical workflow steps, such as ingest and delivery, and should be part of the respective process. Visual QC, however, would be a standalone operation at the end of the production chain, since the technical checks alone would not be sufficient.
Gupta: File-based workflows have become reality, and with that QC has moved on from being a point/standalone operation just before playout or transmission. QC needs to be integrated with file-based workflows so that it is performed at each stage of the process. Early detection of any issue can prevent a cascading effect on content quality at later stages. Baton web services can be tightly integrated with the asset management systems so that QC failures at any workflow stage can be sent out as alerts, allowing necessary action to be taken.
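Gupta’s point about stage-by-stage integration can be sketched in a few lines of code. The following is a minimal, hypothetical illustration — the class and field names are invented for this example, not Baton’s actual web services API — of recording QC results at each workflow stage and raising an alert the moment a check fails, so a fault caught at ingest never cascades into transcode or playout:

```python
# Hypothetical sketch of stage-by-stage QC in a file-based workflow.
# All names here are illustrative, not any vendor's actual API.

from dataclasses import dataclass, field

@dataclass
class QCResult:
    stage: str          # e.g. "ingest", "transcode", "playout"
    check: str          # e.g. "loudness", "field order"
    passed: bool
    detail: str = ""

@dataclass
class WorkflowQC:
    results: list = field(default_factory=list)
    alerts: list = field(default_factory=list)

    def record(self, result: QCResult):
        """Store a result; a failure raises an alert immediately so the
        problem is caught before it cascades to later stages."""
        self.results.append(result)
        if not result.passed:
            self.alerts.append(f"[{result.stage}] {result.check}: {result.detail}")

# Example: a field-order error caught at ingest, before transcode runs.
qc = WorkflowQC()
qc.record(QCResult("ingest", "field order", False, "top-field-first expected"))
qc.record(QCResult("ingest", "loudness", True))
print(qc.alerts)  # ['[ingest] field order: top-field-first expected']
```

In a real integration, the alert step would call the asset management system rather than append to a list, but the shape of the logic is the same.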
Kandell: The efficiency of the chain of tasks from ingest to delivery can be significantly improved by treating it as one long file-based workflow, whereby components should be loosely coupled via integrations, methodologies, and applications that are suited to the task.
Millis: There is no one right answer. QC has to work well with existing customer workflows. Many QC checks can only be done after the material is completely written as a package. You want to apply the right ‘level’ of testing at various touch points. Certain things you look for at ingest will likely differ from what is done after transcode, creative adjustments or a pull from archive.
Singhal: While QC systems can work independently, real automation is only possible with integration inside the workflows. Automated QC vendors can work with off-the-shelf management packages to provide integrated offerings to the customer, and the QC products also provide APIs for easy integration with home-grown management systems.
Walker: QC can be performed standalone, but for maximum efficiency it should be part of the overall workflow. For example, before editorial is performed, clips can be checked to make sure that they are in the right wrapper and video/audio format. If not, they are transcoded or rewrapped to the required editing format. When finishing/mastering is completed, visual QC can be done to check that titles, for example, are correct. Once final master broadcast deliverables have been created, again, QC can be performed to check that the file-based delivery specification has been met.
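Walker’s pre-edit conform step — check the wrapper and essence formats, then pass, rewrap or transcode as needed — can be illustrated with a small sketch. The format names and decision logic below are assumptions for illustration only, not Root6 ContentAgent’s actual behaviour:

```python
# Illustrative sketch of a pre-edit conform decision. The house edit
# spec and format strings are invented for this example.

EDIT_SPEC = {"wrapper": "MXF OP1a", "video": "DNxHD 120", "audio": "PCM 24-bit"}

def conform_action(clip: dict) -> str:
    """Decide what to do with an incoming clip: pass it through,
    rewrap it, or fully transcode it to the required edit format."""
    if clip == EDIT_SPEC:
        return "pass"
    if clip.get("video") == EDIT_SPEC["video"] and clip.get("audio") == EDIT_SPEC["audio"]:
        return "rewrap"      # essence is fine, only the container differs
    return "transcode"       # essence itself must be converted

# A QuickTime-wrapped clip with matching essence only needs rewrapping.
print(conform_action({"wrapper": "QuickTime", "video": "DNxHD 120", "audio": "PCM 24-bit"}))
# rewrap
```

The same pattern extends to the later checkpoints Walker mentions: each stage compares the file against the specification for the next stage and triggers the lightest correction that satisfies it.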
Has the growing practice of shooting and editing one’s own material – perhaps with a creative, rather than technical, eye – brought about an increasing need for even more in-depth QC?
Ackroyd (pictured): While digital acquisition has helped to make video signal levels correct, multiple formats bring errors relating to aspect ratios, shoot-safe areas, up/down conversion and audio framing.
Devlin: Yes. It has brought about the need for more QC and also for more fix-up tools. One contributing factor is that the material is likely to be seen on more devices in more resolutions and more variants than before. A small production house that shoots in film mode in the US, edits in video mode and then naively exports a 1080/50i project for the EU is likely to have a significant number of QC failures in content — simply because their technical knowledge is not as good as their creative abilities. ‘This should never happen’, I hear readers cry. But they’re not crying as loudly as the poor broadcaster who has to try and fix it. Thankfully some vendors are on the ball and have solutions to these inevitable problems of supply chain complexity.
Fabian: On the one hand, the diversity of formats has increased considerably. On the other, QC is seldom carried out when the delivered material comes straight from a camera or camcorder. Usually, this material is very reliable and is passed directly into central ingest. Then again, material from other sources that has been edited, versioned or exported definitely requires in-depth QC.
Gupta: Yes, editing could lead to quality issues in content, of which a person shooting/editing may not be aware. We have seen a lot of Baton QC users facing quality issues because of bad editing — Field Order error, RGB Colour Gamut issues, Video Signal Level issues, Field Dominance error, and Audio loudness. This is precisely where QC checks help these media companies ensure that content meets specification.
Walker: A common mistake these days is to set cameras to shoot at the wrong frame rate, or to set the audio to record at the wrong bit depth. Within our ContentAgent product, rushes and master delivery files can be automatically checked for these common mistakes and then corrected by performing a high quality frame rate conversion and converting the audio to the correct bit depth.
Is faster than realtime truly efficient?
Ackroyd: File-based processes can operate at the fastest speed the software design can achieve, whether slower or faster than ingest/playout speed.
Devlin: Yes. I think the question should be ‘Is faster than realtime always needed?’ – in production the answer is probably yes, but in reality, the ‘dismantle the file at the end of the workflow’ approach to QC is ultimately doomed.
Unless we learn from other industries that QC is a process relying on accurate measurements at key points within the workflow then we’ll never be able to crack the brute force realtime question. If you want realtime, the best way is to be smart about where, when and what QC you apply and then how you aggregate that data to mitigate business risk.
Fabian: Again, this is a question that leads to controversy in the industry. We believe that speed is essential in certain production fields, for example the news sector, where realtime readiness or better is truly important in such a hectic environment. Other fields might not require that speed; they have more time in their workflow steps and define efficiency on another level.
Kandell (pictured): Realtime is a carryover from the tape-based world, but as applications and hardware continue to improve, that benchmark will come to be seen as less than acceptable. Nexidia QC runs at 30-100x realtime, offering customers the ability to meet their increasingly tight deadlines for production or the requirement to QC large libraries of content in a timely manner.
Millis: Some workflows are based more on speed, such as transcode workflows or those related to live programming. It is always handy to get the automated QC done faster, either to allow time for manual review or potentially to resubmit a QC job if some parameters are less than optimal.
Singhal: The main objective of ‘Automated QC’ is improved workflow efficiency. If the same analysis task can be done faster, the benefits are obvious. The Pulsar Automated QC solution can process an HD file at up to four times realtime speed, which gives its users tremendous time and cost efficiency.
Finally, what will be the next innovation when it comes to Automated QC procedures?
Ackroyd: That would be telling! Seriously, this is likely to be focussed on attempts to standardise required QC data into formats/file locations that can be checked efficiently and quickly to aim at 100% — rather than random — QC.
Begent: Auto correction as well as auto QC is already available and will become more and more important. Recent systems like VidFixer can QC, correct and transcode to different formats, all at the same time and all in one product. Expect faster throughput and more comprehensive checking, particularly of small video drops and glitches that are challenging to find – for example, is that a small visual artefact in a tiny part of the frame, or is it part of the scene?
Devlin: It’s already here. Multi-stage QC where you propagate the results and signatures from upstream to be used downstream. Remember my milk analogy? An AmberFin QCML report is the summary information of multiple stages applied in a way that you can see a simple red/amber/green indicator for management, or drill into any parameter from any stage of the workflow for the post-mortem engineer.
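The traffic-light aggregation Devlin describes can be sketched very simply: the overall status is the worst status reported at any stage, while the per-stage detail is retained for drill-down. This is an illustrative sketch of the idea only, not AmberFin’s actual QCML format:

```python
# Sketch of a multi-stage "traffic light" aggregation: one overall
# red/amber/green status for management, per-stage detail preserved
# for the engineer. Names and values are illustrative.

SEVERITY = {"green": 0, "amber": 1, "red": 2}

def aggregate(stage_reports: dict) -> str:
    """Overall status is the worst status seen at any workflow stage."""
    return max(stage_reports.values(), key=lambda status: SEVERITY[status])

reports = {"ingest": "green", "transcode": "amber", "delivery": "green"}
print(aggregate(reports))  # amber
```

An engineer doing a post-mortem would then inspect `reports` stage by stage, while management only ever sees the single aggregated light.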
Millis: The most important step for file-based QC is standardising testing profiles and the results so that digital asset management and associated business systems can manage workflows with less human interaction. The EBU’s QC working group, of which we are members, is handling the lower level specifications. The EBU’s FIMS is working on the higher level interoperability issues.
Singhal: There are scenarios where Automated QC is not fully automated. One of the primary reasons is the cognitive nature of the checks in such scenarios. Achieving true automation in these difficult cases requires the next level of innovation. This is an uphill task, and QC systems need to make continuous improvements in handling these faults.
Walker: The next leading innovation in QC processes will be to analyse creative decisions such as misspelled titles and clips being incorrectly framed — and report these as problem areas. In some cases these errors could be corrected automatically. However, the QC tool should allow a customer the choice to auto correct or not — I am sure some customers will prefer to manually fix such errors!
By Philip Stevens