Post production houses today face significant operational challenges. These include handling multiple ingest points and types, different formats captured from a range of cameras, the use of various tools, and a mix of creative professionals and producers who all need to share content to get projects done. To do all of this successfully, they need a workflow in place that’s efficient but also empowers their in-house talent creatively, so clients can get the most out of their ever-tightening budgets.
The optimisation of these workflows is becoming increasingly important, particularly as they now depend so heavily on the infrastructure that supports the creative tools used in a traditional post production facility. Modern media workflows are especially sensitive to storage performance, and a common concern is whether data will be in the right place at the right time, and whether it will be delivered to applications in the right way.
Enabling collaboration while overcoming problems of scale, and making content easy to find, access and manage within an intelligent environment that combines the benefits of different storage technologies, is not easy to achieve. But with the growing demands being placed on those involved in the post production of content, it’s becoming a necessity rather than an ideal-world scenario.
Empowering collaborative workflows
The ability to share content is crucial to successful collaboration within a post environment. A collaborative workflow therefore needs to connect multiple users over any protocol from any OS client to a single centralised shared storage environment. It also requires a file management system that understands how to efficiently manage access from multiple users to the same content at the same time.
Collaboration is at its most efficient when all users are working within a single interface, through a common global namespace for their content. Any movement of data that has been tiered and needs to be pulled back into primary storage should be handled automatically. This saves valuable operator time while removing the potential for human error.
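To make the idea concrete, the sketch below shows, in hypothetical Python, the kind of logic a file management layer might apply when a user opens content that has been tiered off primary storage. The catalog and mover objects are invented stand-ins for a metadata index and a data-mover service, not any particular vendor’s API.

# Illustrative sketch only: 'catalog' and 'mover' are hypothetical stand-ins
# for a file management system's metadata index and its data-mover service.
def open_for_editing(path, catalog, mover):
    """Ensure content sits on primary storage before an application opens it."""
    location = catalog.lookup(path)                # single namespace: one path, whatever the tier
    if location.tier != "primary":
        mover.recall(path, destination="primary")  # automatic pull-back from tape, object or cloud
        catalog.mark(path, tier="primary")         # record the new location
    return open(path, "rb")                        # the application just sees an ordinary file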
Addressing the problems of scale
As the media market continues its never-ending progression into higher-resolution formats like 4K and 8K, the need to scale both performance and capacity continues to shape how people design their storage infrastructures. Traditionally, scaling either one has been a complicated and expensive process, and dealing with changes to the workflow can add a lot of unwanted and unnecessary strain.
The best solutions, however, allow customers to scale performance and capacity independently when needed, and to do so easily, efficiently and without disruption.
A flexible approach to archive and media management
While archiving is generally about long-term data preservation, it’s increasingly important that users are able to find content quickly when it’s needed and retrieve it easily. Archiving has traditionally been to tape, and tape is still the most cost-effective way of storing content over long periods. It’s also playing an increasingly important role in data protection, as additional copies are created and stored off-site as backup in case of disaster or a security breach. Meanwhile, newer technologies like object storage and the cloud are evolving to provide alternative archiving capabilities.
A tiered approach to storage can incorporate a range of these technologies and provide greater flexibility within an archiving strategy. It lets users mix and match, depending on the requirements of each job, and can easily adapt as technology or the workflow changes.
An archive fit for today’s post production facilities needs to be flexible, offering a way to store content at the lowest possible cost while also providing the ease of access that a primary storage system delivers. It should also be adaptable, so adjustments can be made on the fly to deal with the unique demands of individual projects. The types of media formats, the number of users, the number of concurrent streams needed and the amount of content being captured should all be considered when planning data management and archiving for post.
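As a rough illustration of that kind of planning, the sketch below works through the arithmetic in Python. The figures are assumptions chosen purely for the example (a notional 750 Mbit/s 4K stream, a dozen concurrent users, six hours of ingest a day); the calculation, not the numbers, is the point.

# Back-of-envelope sizing sketch. All figures are illustrative assumptions;
# substitute real bitrates and counts for the codecs and project mix in use.
STREAM_BITRATE_MBPS = 750      # assumed bitrate of one 4K editing stream, in Mbit/s
CONCURRENT_STREAMS = 12        # assumed number of streams played back at once
CAPTURE_HOURS_PER_DAY = 6      # assumed hours of new material ingested per day

def required_throughput_mbps(streams, bitrate_mbps):
    """Aggregate bandwidth the primary tier must sustain for playback."""
    return streams * bitrate_mbps

def daily_capture_tb(hours, bitrate_mbps):
    """Approximate volume of new content captured per day, in terabytes."""
    megabits = hours * 3600 * bitrate_mbps
    return megabits / 8 / 1_000_000                # megabits -> terabytes (decimal)

print(required_throughput_mbps(CONCURRENT_STREAMS, STREAM_BITRATE_MBPS), "Mbit/s sustained")  # 9000
print(daily_capture_tb(CAPTURE_HOURS_PER_DAY, STREAM_BITRATE_MBPS), "TB captured per day")    # roughly 2 TB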
Media management sits alongside archiving within a modern storage infrastructure, but is concerned more with overall data management and the movement of files than with the archive storage hardware itself. The best environments are structured to align content to the workflow, with a storage infrastructure that can automatically ensure data is always in the right place at the right time.
Building the right architecture
The most effective architecture for today’s and tomorrow’s ever-changing media workflows is a multi-tier storage environment that can optimise for both performance and lowest cost at the same time. Users ideally want high-performance storage at one end of the content lifecycle and low-cost storage at the other, with a single namespace that ties everything together and automates the movement of data between tiers based on policies.
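What such policy-driven movement could look like is sketched below in Python. The tier names, idle-time thresholds and the catalog and mover objects are all hypothetical, intended only to show the shape of the logic rather than any specific product.

# Sketch of policy-based tiering beneath a single namespace. Everything here
# is an assumption for illustration, not a real storage API.
from dataclasses import dataclass

@dataclass
class TierPolicy:
    min_idle_days: int    # move content untouched for at least this many days...
    from_tier: str        # ...off this tier...
    to_tier: str          # ...onto this cheaper one

POLICIES = [
    TierPolicy(min_idle_days=30,  from_tier="flash",    to_tier="nearline"),
    TierPolicy(min_idle_days=180, from_tier="nearline", to_tier="tape-or-object"),
]

def apply_policies(catalog, mover, policies=POLICIES):
    """Walk the namespace and migrate files whose idle time exceeds a policy threshold."""
    for entry in catalog.files():                          # one namespace spanning every tier
        for policy in policies:
            if entry.tier == policy.from_tier and entry.idle_days >= policy.min_idle_days:
                mover.migrate(entry.path, to=policy.to_tier)   # data moves, the path does not
                catalog.mark(entry.path, tier=policy.to_tier)
                break                                          # apply at most one policy per pass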
Although different storage technologies each have their own pros and cons, linking them together seamlessly means users can benefit from the advantages each one brings while also enjoying the simplicity and efficiency that comes from interacting with a single storage interface point.
The eternal storage conundrum is that no single storage technology can be optimised for performance and cost at the same time; with a tiered infrastructure, however, you can bring them together and get the benefits of each in a single common environment.
As ambition grows and budgets shrink, post production facilities have never been under greater pressure to deliver a more efficient yet creatively empowered service. Fortunately, through the optimisation of workflows that take the supporting infrastructure into account, teams can be more ambitious, collaborative and effective than ever before.