By- Santhosh D’Souza, Director Systems Engineering, NetApp
Data storage systems have evolved through four distinct stages during their brief history. First, monolithic storage arrays gave way to SAN/NAS distributed storage networks. Next, advanced software techniques led to storage virtualization, with storage volumes abstracted from the underlying physical devices. As enterprises became increasingly data-centric, there was a growing realization that data architecture and data platforms play critical roles in accelerating business and reducing costs. And in the current era, cloud services have created a new model, one that combines virtualization with economies of scale the likes of which have never been seen before.
For the past few years, IT organizations have acknowledged that cloud computing delivers concrete benefits, but security and privacy concerns have largely necessitated private cloud deployments. Enterprises across the globe are shifting their business applications from dedicated infrastructures to private and public cloud environments. Interestingly, these architectural transitions were all driven by corresponding changes in computing platforms, each of which required a new data storage architecture that could adapt to a new IT ecosystem.
Today, the economics of the cloud are driving a transition towards hybrid cloud services, and once again, storage architectures must evolve and adapt to meet the needs of this new architectural framework. The idea driving the adoption of hybrid clouds is clear: divert compute and storage resource consumption into the public cloud to reduce the resources required (i.e., the costs incurred) in on-premise data centers.
The year 2015 will find enterprise platforms beginning to encompass hybrid cloud architectures. Every decision that puts an application in the public cloud means one more application that will never be deployed internally. The strain within IT around moving applications to the cloud will ease as organizations recognize that a hybrid cloud model is needed to serve their application portfolios. As CIOs realize the technology benefits of the hybrid cloud, it will become more important for organizations to figure out how and where the hybrid cloud will save them money. Additionally, as products outside the normal IT compute and storage offerings become available, IT will lose a degree of direct control, prompting stronger policies and governance for the hybrid cloud.
We are witnessing CIOs of enterprises across the globe and in India shifting their business applications from dedicated infrastructures to private and public cloud environments. They are also making concerted efforts to extract actionable information from data, while simultaneously tackling the data deluge upfront. In a CIO survey conducted by IDG Research Services, 78% of enterprise IT organizations viewed the ability to manage data across multiple clouds as critical or very important, but only 29% of these organizations viewed their ability to do so as either excellent or good.
Without a common framework to load and move data, hybrid cloud success will remain an elusive goal. What's needed is a way to manage, secure, protect, share, and move data among different clouds: in essence, a fabric that joins on-premise equipment with numerous public clouds.
Fundamentally, a data fabric is a way to manage data, both on-premise and within the cloud, using a common structure and architecture. A data fabric provides efficient data transport, software-defined management, and a consistent data format, allowing data to move more easily within and across several clouds. With data portability enabled via a connected data fabric, application data is no longer tied to a particular application server or location. Here are a few of the benefits:
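To make the idea concrete, here is a minimal, hypothetical sketch of the data fabric concept: a single namespace spanning several storage endpoints (an on-premise array and a public cloud, say), in which a volume keeps its name and format while its physical location changes. The class and endpoint names are illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    """One storage location in the fabric (on-premise array or cloud bucket)."""
    name: str
    location: str                       # e.g. "on-premise" or "cloud"
    volumes: dict = field(default_factory=dict)

class DataFabric:
    """Illustrative sketch: a common namespace over many endpoints,
    with a consistent data format so volumes can move between them."""
    def __init__(self, *endpoints):
        self.endpoints = {e.name: e for e in endpoints}

    def write(self, endpoint, volume, data):
        self.endpoints[endpoint].volumes[volume] = data

    def move(self, volume, src, dst):
        # The application keeps addressing the volume by name;
        # only its physical location changes.
        self.endpoints[dst].volumes[volume] = self.endpoints[src].volumes.pop(volume)

    def read(self, volume):
        # Location-transparent read: search every endpoint for the volume.
        for endpoint in self.endpoints.values():
            if volume in endpoint.volumes:
                return endpoint.volumes[volume]
        raise KeyError(volume)

fabric = DataFabric(Endpoint("dc1", "on-premise"), Endpoint("aws", "cloud"))
fabric.write("dc1", "orders", b"...records...")
fabric.move("orders", src="dc1", dst="aws")   # burst the data to the public cloud
print(fabric.read("orders"))                  # same data, new location
```

The point of the sketch is the `move` call: the application's view of the volume is unchanged, which is what makes the economic and governance flexibility described below possible.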
Economic and data governance flexibility. When you use a data fabric to build new applications in the cloud and a project fails, you can simply delete the server instances and data from the cloud. If the application takes off, you can easily provide more resources to it on that cloud, or move it on-premise or to another cloud environment.
Better utilization of resources. Mature applications often take up data center space, power, and the resources of a skilled IT staff. A data fabric enables you to selectively move applications to a public cloud infrastructure and focus internal IT resources and mindshare on the applications that deserve attention.
Cloud-based disaster recovery. One of the most exciting capabilities of the data fabric is the ability to deliver multi-site disaster recovery (DR). SAN-to-SAN replication between data fabric endpoints creates hot-site DR with very short recovery times, along with a cost-effective, cloud-based DR option.
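The DR pattern above can be sketched in a few lines. This is an illustrative assumption of the workflow only: volumes on a primary site are replicated to a cloud DR site so that, when the primary fails, the copy is already in place and recovery time is short. Real SAN-to-SAN replication ships only changed blocks incrementally; this toy version copies whole volumes.

```python
class Site:
    """A storage site holding named volumes (primary or DR)."""
    def __init__(self, name):
        self.name = name
        self.volumes = {}

def replicate(primary, dr_site):
    """Copy every volume from the primary site to the DR site.
    (A real implementation would transfer only changed blocks.)"""
    dr_site.volumes = dict(primary.volumes)

def failover(dr_site):
    """After a primary outage, serve data from the DR copy."""
    return dr_site.volumes

primary = Site("on-premise")
cloud_dr = Site("cloud-dr")
primary.volumes["db"] = "snapshot-0042"

replicate(primary, cloud_dr)       # scheduled replication keeps DR current
# Primary goes down; recovery is fast because the data is already there.
recovered = failover(cloud_dr)
print(recovered["db"])
```

Because the DR site here is a cloud endpoint of the fabric rather than a second physical data center, the organization pays for standby capacity only while it is actually needed.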
As the hybrid cloud matures, a data fabric ecosystem will be required to provide a consistent framework for data movement throughout the hybrid cloud.
The hybrid cloud changes the role of IT in several significant ways. With the cloud, IT is no longer just about building infrastructure and running data centers; it’s about utilizing the tools and the applications to acquire, transform, apply, and protect the data on which business depends.
The hybrid cloud model—combining on-premise capabilities with resources and services available from various cloud providers—is poised to become the dominant model in enterprise IT, and a data fabric will enable IT organizations to take best advantage of this model.