Storing the future: Quantum’s look at 2014

Last year, Quantum helped London post house The Ark revolutionise its approach to data and storage. Quantum’s head of product marketing EMEA, Laurent Fanichet, takes a look at what’s in store for storage in 2014

Time to retire the “primary” from primary storage
With the continued growth of data and the increased strategic value of connecting historical data with new data, primary storage is no longer the only game in town. Getting data off expensive primary storage, while keeping it readily accessible, will take on greater importance. As a result, there will be increased focus on tiered storage, with new technologies such as next-generation object storage and the Linear Tape File System (LTFS) being widely adopted.

Virtual machine data: “Just let me be me”
Look for a greater emphasis on simplifying backup and archive for virtual machines, particularly keeping the data in its native format. With virtual environments continuing to proliferate, there will be even greater need for virtual deduplication appliances – which eliminate duplicate copies of repeating data – as well as backup software to protect data in native format as IT managers demand faster, easier restores and portability across private and public clouds.

Managed service providers and value added resellers cloud the picture

Expect more MSPs and VARs to add cloud backup-as-a-service and disaster recovery-as-a-service to their lineup of offerings as a way to bring more value to their customers. Major storage companies will play a key supporting role in providing the underlying technologies as part of a broader effort to compete with cloud leaders such as Amazon.

I’m not getting “Nirvanixed”
Following the initial enthusiasm surrounding public cloud's potential, it was perhaps inevitable that issues such as security and availability would attract more scrutiny. The collapse of Nirvanix, a US-based cloud storage provider, is giving some of those issues greater urgency. Companies will weigh the cost-saving benefits of public cloud backup more carefully against slower recovery speeds, as well as concerns about their data's security in multi-tenant clouds. Hybrid approaches that offer the best aspects of public and private clouds will have increasing appeal – particularly the benefits inherent in keeping a local copy on-premises for quick recovery and assured availability.

The NSA is right: it’s all about metadata

While recent revelations about NSA spying are troubling, the agency certainly isn’t alone in recognising the increasing value of metadata. When it comes to storage, system metadata has long been important. This is the information about a data file/object that gets stored automatically such as author, size, date created, date modified, etc. In the next year, there will be growing demand to automate the collection of application metadata – information about a data file/object that relates to its content – connecting the data to its business value and usage.
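The system metadata described above is exactly what a filesystem records automatically. As a simple illustration (a generic sketch, not tied to any storage product), Python's `os.stat` exposes it directly; application metadata, by contrast, would have to be extracted from the file's content:

```python
import os
import datetime

def system_metadata(path):
    """Return the metadata the filesystem records automatically."""
    st = os.stat(path)
    return {
        "size_bytes": st.st_size,
        "modified": datetime.datetime.fromtimestamp(st.st_mtime).isoformat(),
        # On Unix, st_ctime is the last inode change, not creation time.
        "changed": datetime.datetime.fromtimestamp(st.st_ctime).isoformat(),
    }
```

Automating the collection of application metadata means going one step further: parsing the file itself (a video's codec, a document's author field) and attaching that information to the stored object so it can be searched and tiered by business value.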

Data migration is so “Old School”
Larger disk drives and petabyte-scale archives will drive the need for an alternative to traditional data migration. Migrating content on traditional RAID storage every three to five years is already painful, and waiting months or years to complete a migration is not an option for most users, to say nothing of the demands this places on IT staff to keep performance levels high. As a result, RAID will increasingly become ineffective and unmanageable, and next-generation object storage – with its self-healing and self-protecting architecture – will be adopted more broadly as a way to eliminate the need for migration.

Goodbye Nielsens, hello storage
As TV viewing continues to move online, broadcasters will increasingly be able to do their own analytics and get much more useful data. However, bringing this capability in-house will not only increase the amount of data that must be stored and protected but also require a new approach to managing data in many cases, including developing a more robust policy-based tiered archive system. At the same time, broadcasters will be looking for solutions that integrate analytics and storage.