
Content storage and delivery: Solving infrastructure issues

Unlike standard enterprises, whether you are providing hybrid storage solutions, distributing applications, videos, or images, or creating a networked DVR infrastructure for millions of end users, the infrastructure that supports your business… is your business. Managing millions or billions of files, and hundreds if not thousands of file types, from a myriad of end users and applications is a significant challenge for cloud and service providers, and one that demands significant time and investment in supporting storage infrastructure.

Further, files are growing larger and being produced more frequently every day, driven by the proliferation of digital content and machine-generated data, which creates significant challenges for cloud service providers. Today, growing capacity or performance within private and public cloud and data centre architectures means choosing between scale-out and scale-up approaches, both of which are costly and inefficient to scale, and both of which require significant over-provisioning to remain effective for any meaningful period of time.

Cloud hosting and service providers grapple daily with the challenge of meeting their customers’ demand for more storage. These challenges underscore, in my view, that traditional storage infrastructures for cloud and web were not designed to meet the needs of today’s service providers, and are not fit for purpose at today’s scale.

Think for a moment of scale-out web environments, where users store a mix of files for varied uses and audiences, from small web accent images to the largest HD video files. In this environment it is very difficult to predict data types and sizes, data locality, consumption patterns, and performance requirements – meaning that storage infrastructures need to be more than just a repository optimised for one type of file. They need the intelligence to understand the attributes of their content and to handle diverse files, workflows and traffic loads.

Case in point
Let’s take the example of high-resolution media broadcast production to demonstrate the significant hurdles an organisation needs to overcome to deliver content from the creators all the way into the home. Media broadcast productions require solutions that accelerate the creation, sharing and long-term access of content across the entire workflow. Production studios, broadcasters and other customers in the media and entertainment industry are looking for strategic alliances with their vendors so they may focus on the creative part of the process, without having to worry about bottlenecks anywhere else along the complete journey.

Additionally, the entertainment industry is dealing with the rapid change in broadcast TV viewing habits. One of our customers, Level 3 Communications, must deal with this exact challenge. Andrew Adams, product marketing manager, content solutions, notes:

“Today’s media conglomerates must take into account the changing market of video consumption as consumers move from traditional broadcast TV models to over-the-top video delivery. Level 3’s Origin Storage Platform, with over 46PB of content including entire customer libraries of TV shows and movies, feeds our Global Delivery Network to deliver video to today’s cord cutters and ‘cord-nevers,’ who prefer their content on their time. Having a high performance, globally redundant origin platform has been a value to our customers who move to today’s proliferation of IP-based video consumption.”

Consumers expect a broadcast TV quality experience online and the storage for creation as well as delivery must be flexible, highly scalable and reliable to make that happen. Level 3’s storage supports the creators and the consumers in a single, dynamic, intelligent solution.

Object storage saves the day
While traditional file-based storage is still a smart option for some companies, today’s file-based storage platforms struggle to scale effectively enough to meet the explosion of storage needs. Not only do file-based storage platforms fail to scale sufficiently, they are also becoming obsolete as more and more applications are designed to use connection protocols such as Amazon S3, OpenStack Swift, and RESTful APIs to talk directly to the storage, without the overhead of file system layers in between.

An alternative to Network Attached Storage (NAS) and Direct Attached Storage (DAS) is object storage – essentially a different, optimised way of storing, organising and accessing data on disks. Objects are, in essence, files with rich metadata attached to them. When an object is stored, an identifier is created that locates the object in a flat-namespace storage pool. Users access object storage through applications that typically use one of the new connection methods optimised for online applications. Further, object storage scales simply and easily, adding hundreds of terabytes in minutes, and it works equally well for large and small files, whether accessed linearly or randomly. This makes object storage ideal for unpredictable online and cloud environments.
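To make the idea concrete, here is a minimal sketch of the object-storage model described above: a flat namespace mapping identifiers to data plus metadata, with no directory hierarchy. This is an illustrative toy, not any vendor’s implementation; real systems such as S3 or Swift expose the same put/get-by-identifier pattern over HTTP, and the content-hash identifier scheme used here is just one possible choice (many systems use UUIDs or client-supplied keys instead).

```python
import hashlib


class ObjectStore:
    """A toy in-memory object store: a flat namespace mapping
    identifiers to (data, metadata) pairs, with no folder hierarchy."""

    def __init__(self):
        self._pool = {}  # identifier -> (bytes, metadata dict)

    def put(self, data: bytes, metadata: dict) -> str:
        # Derive an identifier from the content itself (content
        # addressing); the flat namespace needs only this key.
        object_id = hashlib.sha256(data).hexdigest()
        self._pool[object_id] = (data, dict(metadata))
        return object_id

    def get(self, object_id: str):
        # Retrieval is a single lookup by identifier, regardless of
        # object size or access pattern.
        data, metadata = self._pool[object_id]
        return data, metadata


store = ObjectStore()
oid = store.put(b"<html>hello</html>",
                {"content-type": "text/html", "owner": "alice"})
data, meta = store.get(oid)
print(len(oid), meta["content-type"])
```

The key contrast with NAS/DAS is that there is no path resolution or file-system layer: the application holds an opaque identifier and the store resolves it in one step, which is what lets such systems scale out horizontally.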

But don’t just take my word for it – major cloud application providers such as Amazon and Google, as well as Facebook and Twitter have deployed object storage to meet the requirements of their fast-growing user base: billions of users are storing trillions of objects in infrastructures that can scale infinitely and still perform with the lowest latency.

Latency is the key to delivering to end-users
As I mentioned earlier, the investments in infrastructure are all about creating efficiencies to deliver content effectively to the end-user. And that’s where you find the final hurdle – users could be located anywhere in the world, whether it be in a large city or in the furthest reaches of the highlands of Scotland, and they are not forgiving. Just a few moments of delay or buffering and they are off to another amusement, application or website.

If you think it is difficult to deliver HD content to thousands or millions of users in London, where the infrastructure is substantial and latency should be less of an issue, then consider how you deliver high-resolution movies into millions of homes in rural areas. Service providers and internet companies need to be able to deliver both – they simply cannot say they will only serve certain geographic areas.

So, the challenge becomes one of delivering massive scalability in performance and capacity in distributed data centres, which these organisations have, but also optimising the latency of the storage and of the network that is the conduit into the homes where the content has to end up.

Of course, it’s a problem we’ve already addressed with our customers. Intelligent algorithms embedded in our appliances and technology let service providers, media companies, and data centres ingest and distribute content very quickly and at massive scale, in a solution that becomes more efficient and cost-effective as it grows.

It can then be replicated in real time across multiple data centres so organisations can deliver movies, games, pictures, and every other conceivable piece of content in the most cost effective way possible.

This is no longer a trend – the delivery of content to a consumer population through the internet is now like water or electricity, it is a human necessity. As such, service providers’ infrastructures must address scalability, power and density in the data centre, whilst also being mindful of the latency and costs, in a business model that will be profitable and lasting for the foreseeable future.

Whatever your business, delivering content to end-users is about being able to capitalise on secure, scalable, easy-to-deploy and manageable platforms for private, public and hybrid cloud storage solutions. Storage is a critical component in your next generation infrastructure.

With quite literally billions of new files being created every day, cloud companies and data centres need to be constantly looking at ways to deliver new efficiencies that can speed up the process of ingest and output which, ultimately, is about delivering the best experience to creators, distributors and end-users.

By Molly Rector, CMO, EVP product management and worldwide marketing, DDN