While it is true that cloud infrastructure is poised to revolutionize the way
we access technology, it is also a vast area, and enterprises often grapple
with understanding its full potential. The ever-increasing complexity of data
centers, coupled with the massive growth of data, is driving up the resources
required to store, process, optimize
and serve information back to end users when required. Organizations have
rapidly adopted virtualization and are now looking at cloud delivery service
models to reduce costs, increase flexibility and improve their time to market.
To keep pace with today's competitive marketplace, enterprises require
greater flexibility and agility than their traditional storage architectures can
provide. This, coupled with explosive unstructured content growth, is driving
organizations to look toward instant IT delivery models.
On-demand IT Delivery
In the move towards on-demand IT delivery, it hasn't taken long for
customers to recognize the benefits of private, hybrid or public cloud models,
the most tangible being significant cost savings. From a capital expenditure
perspective, organizations tend to over-purchase to deal with the ebb and flow
of storage and resource requirements to support the business. This often leaves
them with an abundance of underutilized hardware assets. The ability of cloud
infrastructures to grow and contract storage resources in tune with business
needs minimizes this upfront capital expense, moving organizations from fixed
costs to variable costs.
Operational costs tell a similar story: hidden costs have emerged from data
growth and the complexity of managing traditional IT environments. Customers can
reduce much of this operational expenditure by deploying cloud models and
leveraging cloud managed services, paying only for what they consume and
eliminating the day-to-day management tasks altogether.
While on-demand access to computing resources is what the industry is
striving for, it also poses concerns and risks for IT organizations. This can be
very disruptive to business processes and controls. If business users begin to
outsource to cloud providers in order to get faster support, sensitive
information could be put at risk. To mitigate this risk, IT organizations should
be thinking about developing an internal cloud-enabled architecture to provide
greater business agility, along with a process for ad hoc projects that need to
be outsourced to a public or hybrid cloud provider. This way, they can move into
the cloud in a controlled and orderly manner, mitigating the associated risks.
There are many deployment choices to consider for moving into the cloud.
Private cloud: For simplicity, let's define a private cloud as cloud-enabled
infrastructure within the physical walls of a data center. A private cloud can
provide many of the benefits of cloud without the security risks associated with
public deployments. Because it is accessed over an internal network or intranet,
it is as secure as the rest of the data center. Since one controls it and the
environment around it (i.e., networks, servers, etc), one can achieve
enterprise-level SLAs. But one does sacrifice some of the operational cost
savings, such as physical floor space, power, and cooling. Unless one is
leveraging a managed service, one is also subject to management overhead.
Hybrid cloud: Now let's take a look at the hybrid or trusted cloud,
which we will define as infrastructure that resides at a trusted service
provider. In this case, access is limited to appropriate resources in one's
organization and delivered over a virtual private network or a secure Internet
connection. Since the infrastructure is out of the organization's direct
control, service levels could be impacted by external factors. Customers also
need to think about the physical security of the environment, which is why it is
important to understand the service provider's processes and requirements around
physical access.
Public cloud: Lastly, the public cloud can be described similarly to the hybrid,
except that there is usually more general access over the Internet, which
provides more limited security. Many public cloud offerings are quite
inexpensive or sometimes even free, and SLAs are generally not guaranteed, or
are measured differently from how an enterprise measures its SLAs. Additionally,
value-added services and features such as encryption, compression, back-up,
tiering and replication are not available from public providers in the way they
are from private or hybrid cloud providers.
What Makes a Cloud?
Regardless of the type of cloud, there are some key features every cloud
platform should have. First, there should be a secure, direct connection for
getting data into the cloud, such as a representational state transfer (REST)
interface or an on-ramp that connects applications to the cloud without
requiring application recoding. REST is an architectural style in which clients
read and manipulate resources through standard HTTP operations, with the content
typically represented in formats such as Extensible Markup Language (XML).
There also need to be
multitenancy capabilities to logically segregate the data, so that SLAs can be
assigned to specific data types or applications. The cloud should also have
namespaces with access rights and security layers to prevent unauthorized
access. Depending on the provider, some clouds offer value-added features such
as compression and single instancing to improve cost savings, encryption to
provide greater security, and billing and chargeback for organizations or
service providers that wish to bill each business unit or organization based on
consumption.
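To make the REST idea concrete, the short sketch below shows how data might be
written into and read back from a logically segregated namespace over HTTPS. It
is a minimal illustration in Python using the requests library; the endpoint,
namespace name and bearer token are assumptions chosen for illustration, not any
particular provider's API.

    # Minimal sketch of a REST-style on-ramp to a cloud namespace.
    # The endpoint, namespace and token are illustrative assumptions,
    # not a real provider's API.
    import requests

    ENDPOINT = "https://cloud.example.com"               # assumed service endpoint
    NAMESPACE = "finance-archive"                        # assumed tenant namespace
    HEADERS = {"Authorization": "Bearer EXAMPLE_TOKEN"}  # assumed credential

    def put_object(name, payload):
        # Write an object into the namespace with an HTTP PUT.
        resp = requests.put(f"{ENDPOINT}/{NAMESPACE}/{name}",
                            data=payload, headers=HEADERS, timeout=30)
        resp.raise_for_status()

    def get_object(name):
        # Read the same object back with an HTTP GET.
        resp = requests.get(f"{ENDPOINT}/{NAMESPACE}/{name}",
                            headers=HEADERS, timeout=30)
        resp.raise_for_status()
        return resp.content

    if __name__ == "__main__":
        put_object("reports/q1.xml", b"<report>...</report>")
        print(len(get_object("reports/q1.xml")), "bytes retrieved")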
IT services are generally held to service level standards for availability,
reliability and integrity. Augmenting or replacing legacy IT services with
cloud services requires the same quality guarantees. Although managed cloud
services allow customers to focus less on storage management, it is critical
that customers include an expected quality of service in their contracts with
cloud providers. Not all cloud providers measure SLAs the same way, so it is
important to ask how they quantify their SLAs.
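As a simple illustration of why the measurement method matters, the calculation
below (a hypothetical example, not any provider's actual terms) shows how the
same availability percentage translates into different amounts of allowable
downtime depending on whether it is judged per month or per year.

    # Illustrative only: allowable downtime for a given uptime target,
    # judged over a month versus a year (a 30-day month is assumed).
    MINUTES_PER_MONTH = 30 * 24 * 60
    MINUTES_PER_YEAR = 365 * 24 * 60

    for target in (0.999, 0.9999):
        monthly = (1 - target) * MINUTES_PER_MONTH
        yearly = (1 - target) * MINUTES_PER_YEAR
        print(f"{target:.2%} uptime allows about {monthly:.0f} min of downtime "
              f"per month, or {yearly:.0f} min if measured per year")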
Given some of the trade-offs between the various cloud deployment models, how
does one identify the most appropriate candidates for deployment? One should
start by identifying the data in one's environment that generally has lower
business value and lower SLA requirements. For example, think of data types like
home directory shares, static data or backup content that can be moved from
onsite primary to cloud secondary storage.
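One simple way to surface such candidates is to scan a share for files that have
not been read in a long time. The sketch below is a hedged example of that idea;
the share path and the 180-day threshold are assumptions for illustration, not a
recommended policy.

    # Sketch: flag files untouched for a long time as candidates for
    # cloud secondary storage. Path and threshold are illustrative.
    import os
    import time

    SHARE = "/mnt/home_directories"   # assumed on-premise share
    STALE_AFTER_DAYS = 180            # assumed "static data" threshold

    def cloud_candidates(root, stale_days):
        cutoff = time.time() - stale_days * 86400
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    if os.path.getatime(path) < cutoff:   # last access time
                        yield path
                except OSError:
                    continue  # file removed mid-scan; skip it

    if __name__ == "__main__":
        for path in cloud_candidates(SHARE, STALE_AFTER_DAYS):
            print("candidate for cloud tier:", path)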
One can get immediate cost savings by moving this peripheral data (data that
does not require active management or constant read/write access) to the cloud;
it is unnecessary to pay such high administrative and management overhead for
non-business-critical data. Doing so brings several benefits. First, it frees up
resources to focus on the core business applications, improving operational
efficiency and utilization of one's existing assets. Next, it allows one's
organization to gain experience and develop best practices for cloud
deployments. Lastly, it allows one to move toward the core, tier-1 applications
at one's own pace.
Sunil Chavan
The author is director, software group & cloud solutions, Asia Pacific,
Hitachi Data Systems
maildqindia@cybermedia.co.in