
Smart data handling: the key to storage simplicity and success


By: Saravanan Krishnan, Business Director, Infrastructure Solutions, APAC, Hitachi Data Systems


IDC estimates that enterprise data doubles every 18 months. That's an incredible statistic, but what does it mean in practical terms?

To illustrate the potential consequences, let’s take a quick look at the quintessentially Asian, though probably apocryphal, story about the inventor of the board game chess.

The inventor was offered the opportunity to choose his reward by a very happy emperor and elected to take only a grain of rice, doubled square by square across the chessboard. By the time he had been paid for half the chessboard, he owned all the rice in the kingdom.


This kind of growth is exponential. The rice example may be only a folk tale, but when it comes to data, the effects can be all too real.
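To see how unforgiving the arithmetic is, here is a quick back-of-the-envelope sketch in Python. The 64 chessboard squares and the 18-month doubling period come from the figures above; the 1 PB starting volume is purely an assumption for illustration.

```python
# Back-of-the-envelope illustration of exponential doubling.
# The 64 squares and the 18-month doubling period come from the article;
# the 1 PB starting volume is an arbitrary assumption for illustration.

# Rice on the chessboard: one grain on the first square, doubled on every square after it.
total_grains = sum(2 ** square for square in range(64))
print(f"Grains of rice on a full chessboard: {total_grains:,}")  # about 1.8 x 10^19

# Enterprise data doubling every 18 months, starting from an assumed 1 PB.
petabytes = 1.0
for year in range(10):
    print(f"Year {year}: ~{petabytes:,.1f} PB")
    petabytes *= 2 ** (12 / 18)  # one year of growth at an 18-month doubling rate
```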

More than half of all data stored today is actually made up of copies – and then copies of copies – mostly created for protection from disaster. Even worse, these copies consume vast amounts of expensive storage space and take tremendous amounts of time and energy to manage.

One of the biggest challenges that IT leaders face today is how to protect important data in a more efficient, cost-effective and – above all – sustainable way.


Software-defined infrastructures for smarter and more efficient data storage

Frankly, the traditional ways of dealing with data storage were never intended to cope with the exponential data growth that many businesses are now experiencing.

But storing the data is only part of the equation. A better question to ask is: what would you do if something bad happened and you experienced a data loss? Would the right person, with the right training, be able to log into the right system to find and retrieve the right data? Would they be able to restore it to the right place, do it within the prescribed service level agreements, and not break anything else along the way?


One of the most promising answers to these issues is a new concept that is being called software-defined infrastructure.

The beauty of software-defined infrastructure is that it can make life easier for enterprises by upgrading the existing data infrastructure at the software level, rather than taking it apart and starting from scratch. This means that an IT manager who needs to duplicate data is no longer limited by previous data storage constraints, such as data sets being protected by multiple tools, each creating its own copies under its own retention and ownership policies.

By implementing the ability to define application requirements to the infrastructure, it then becomes possible for the infrastructure itself to accommodate emerging requirements automatically.
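As a purely hypothetical sketch of what such a declaration might look like, the Python below lets an application state its recovery and retention requirements and has the infrastructure layer translate them into copy actions. The class and function names, and the specific policy fields, are illustrative assumptions, not any vendor’s API.

```python
# Hypothetical sketch only: neither the class names nor the policy fields
# correspond to any specific vendor API.
from dataclasses import dataclass
from typing import List

@dataclass
class AppRequirements:
    """Requirements an application declares to the infrastructure."""
    name: str
    rpo_minutes: int      # how much data loss is tolerable
    retention_days: int   # how long protection copies must be kept
    offsite_copy: bool    # whether a replica is needed at another site

def plan_protection(req: AppRequirements) -> List[str]:
    """Translate declared requirements into actions the infrastructure
    could schedule automatically, with no manual tool-by-tool setup."""
    actions = [
        f"snapshot {req.name} every {req.rpo_minutes} minutes",
        f"retain copies for {req.retention_days} days",
    ]
    if req.offsite_copy:
        actions.append("replicate asynchronously to a remote site")
    return actions

print(plan_protection(AppRequirements("exchange-prod", rpo_minutes=15,
                                      retention_days=30, offsite_copy=True)))
```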


And the requirements can be staggering. For example, in the case of storage, every time a new data object is created, as many as 50 copies can be generated.

In a software-defined infrastructure architecture, copy data can be kept under control by capturing new data once, then re-purposing it for multiple use cases.

The key is a unified, workflow-based policy engine that enables the data to be used for backup, archive, disaster recovery, test and development and many other purposes, all at the same time. It also automates and orchestrates storage-based copy technologies, including snapshots, clones and replication.
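To make the "capture once, re-purpose many times" idea concrete, here is an illustrative Python sketch of how a policy engine might fan a single capture out to backup, archive, disaster recovery and test/development using snapshots, clones and replication. The names and mappings are hypothetical placeholders, not a real product interface.

```python
# Illustrative sketch of the "capture once, re-purpose many times" idea.
# The function names and mappings are hypothetical, not a real product API.

COPY_TECHNOLOGY = {
    "backup": "snapshot",                # frequent, space-efficient point-in-time copies
    "archive": "clone",                  # full copy moved to cheaper storage
    "disaster_recovery": "replication",  # continuous copy at a remote site
    "test_dev": "clone",                 # writable copy for developers and testers
}

def capture_once(dataset: str) -> str:
    """Take a single point-in-time capture of the production dataset."""
    return f"{dataset}@capture-001"

def repurpose(capture: str) -> dict:
    """Fan the one capture out to every use case via the appropriate technology."""
    return {use_case: f"{tech} of {capture}" for use_case, tech in COPY_TECHNOLOGY.items()}

capture = capture_once("sql-prod")
for use_case, action in repurpose(capture).items():
    print(f"{use_case}: {action}")
```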


That’s the theory. What does it mean in practice?

For a start, this level of integration provides the ability to create fast, frequent copies of production data, without any impact on the performance of the production system.

And it works. Hitachi Data Systems has benchmarked the performance of a software-defined solution in real-world conditions, easily achieving aggressive recovery point objectives (RPO) for Microsoft® Exchange and Microsoft SQL Server® on Windows® platforms, and for Oracle database environments on Linux. What’s more, this approach provides the choice of leaving the archived file on the source system, deleting it, or leaving a stub file as a pointer.
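The three post-archive options mentioned above can be illustrated with a short, hypothetical Python sketch: keep the source file, delete it to reclaim capacity, or replace it with a small stub that points to the archived copy. The paths and URL are placeholders.

```python
# Hypothetical illustration of the three post-archive choices: keep the source
# file, delete it to reclaim capacity, or leave a small stub as a pointer.
import os

def finalize_archive(source_path: str, archive_url: str, mode: str = "stub") -> None:
    if mode == "keep":
        return                                # leave the original file untouched
    if mode == "delete":
        os.remove(source_path)                # reclaim the primary storage capacity
    elif mode == "stub":
        with open(source_path, "w") as stub:  # replace the file with a tiny pointer
            stub.write(f"ARCHIVED: {archive_url}\n")
    else:
        raise ValueError(f"unknown mode: {mode}")

# Example (placeholder paths):
# finalize_archive("/data/mail/archive-2015.pst",
#                  "https://archive.example.com/mail/archive-2015.pst")
```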


Alternatively, users can store and manage archive data in the Microsoft Azure cloud storage service, reducing primary storage by 40% or more, which translates directly into significant cost reductions.
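As an example of what the cloud-archive step might look like, the sketch below uses the azure-storage-blob Python SDK to upload an archived file to an Azure Blob container. The connection string, container name and file paths are placeholders, and the actual product integration may differ.

```python
# Minimal sketch of uploading an archived file to Azure Blob storage with the
# azure-storage-blob SDK (pip install azure-storage-blob). The connection
# string, container name and file paths are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-connection-string>")
container = service.get_container_client("archive")

with open("/data/mail/archive-2015.pst", "rb") as source:
    container.upload_blob(name="mail/archive-2015.pst", data=source, overwrite=True)

# Once the upload is verified, the primary copy can be deleted or replaced with a stub.
```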

Conclusion

People have been talking about an information explosion for decades; however, the big data explosion is really just beginning. And the time window that organizations have to choose the right tools to deal with the inevitable implications is shorter than ever.

A software-defined infrastructure approach offers a sensible solution to this thorny problem. And it won’t cost a business all the rice in the kingdom!
