Things to consider for your next generation storage landscape

By: Andy Stevenson, Managing Director, Fujitsu India

The storage landscape today is being shaped by the growth, complexity and volume of data arising from industry trends such as cloud computing, enterprise mobility, analytics and social media. As technologies advance and the need for more storage appliances increases, business expectations of storage only rise in terms of both costs and capabilities. Technological advancements such as flash storage, integration and Software-Defined Storage may be aimed at alleviating the situation, but choosing just the right solution to fit organisational requirements can sometimes leave one in a fix. Making a well-informed storage decision amidst the cacophony of endless demands placed on IT departments, technological advancements and industry trends can be quite challenging.

Given the unpredictable growth of storage capacity requirements and the costs involved, you may want to address your storage problem for the long term by opting for a solution with a longer shelf-life, one built to last into the future. As per IDC's storage briefing (11/14), purchased storage capacity has risen 3.5-fold over the last four years, reaching 5 exabytes in the first half of 2014, while the average cost per TB in a well-managed storage environment is around 500 euros. With this in mind, below are some considerations to weigh when choosing your next generation storage solution:

Factor in the Long-Term Costs: While the initial capital outlay for your storage solution is transparent and agreed upon, several other cost factors deserve attention. These include the cost of upgrades, the operating expenditure needed to keep the systems running, the licensing models of various vendors, and the training costs and overhead of administering the systems. Additionally, efficiency features in your storage systems, such as power and cooling efficiency, consolidation of back-up tasks and efforts, automation, TSM optimisation and de-duplication, can greatly reduce your overall costs when implemented wisely.
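The cost factors above can be combined into a back-of-envelope total-cost-of-ownership estimate. The sketch below is purely illustrative: the acquisition price per TB echoes the IDC figure cited earlier, but the opex rate, licence fee, administration overhead and planning horizon are assumed values, not vendor quotes.

```python
# Back-of-envelope total cost of ownership (TCO) for a storage system.
# All parameter defaults are illustrative assumptions, not vendor figures.

def storage_tco(capacity_tb,
                price_per_tb=500.0,       # acquisition cost, EUR/TB (IDC figure cited above)
                opex_rate=0.20,           # assumed yearly opex as a fraction of capex
                licence_per_year=5000.0,  # assumed flat yearly licensing fee
                admin_per_year=15000.0,   # assumed yearly training/administration overhead
                years=5):
    """Return total cost over the planning horizon in euros."""
    capex = capacity_tb * price_per_tb
    yearly = capex * opex_rate + licence_per_year + admin_per_year
    return capex + yearly * years

# Example: a 200 TB environment over a 5-year horizon.
print(f"5-year TCO: EUR {storage_tco(200):,.0f}")  # → 5-year TCO: EUR 300,000
```

Note that in this example the running costs over five years (200,000 euros) are twice the initial acquisition cost, which is why the efficiency features listed above matter so much.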

Prepare for the Petabyte generation: Data growth is ubiquitous and exponential. The way most organisations are headed, they need to be prepared to handle data at the petabyte level. Unfortunately, traditional RAID storage systems hit their limits when crossing the petabyte divide: at that scale, RAID solutions carry high risk during long rebuild times, suffer extremely long data migrations, and run into significant downtime, cost-per-capacity and performance issues. To prepare for the petabyte generation, you may want to consider investing in hyperscale and software-defined storage, which are built to address these issues and can provide organisations with increased flexibility and automated management at reduced cost.
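The rebuild-time risk mentioned above is easy to quantify. The sketch below estimates how long a single failed drive takes to rebuild; the drive size and sustained rebuild throughput are illustrative assumptions (rebuilds typically run well below a drive's raw interface speed because the array keeps serving production I/O).

```python
# Why traditional RAID struggles at petabyte scale: a rebuild-time estimate.
# Drive size and sustained rebuild throughput are illustrative assumptions.

def rebuild_days(drive_tb, rebuild_mb_per_s):
    """Days to rebuild one failed drive at a sustained throughput."""
    seconds = (drive_tb * 1e12) / (rebuild_mb_per_s * 1e6)
    return seconds / 86400  # 86,400 seconds per day

# A 16 TB drive rebuilding at a sustained 50 MB/s:
print(f"{rebuild_days(16, 50):.1f} days")  # → 3.7 days
```

Several days of degraded redundancy per failure, multiplied across the thousands of drives a petabyte-scale system needs, is exactly the exposure that erasure-coded, software-defined designs aim to reduce.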

Use of new technologies such as flash, high-capacity drives and Software-Defined Storage: It is important to assess your organisation's workload requirements to make optimal use of developments in new technologies such as flash. Flash storage has become increasingly mainstream, reducing the need for dedicated all-flash arrays (AFAs) and 15,000 rpm disks. Using flash storage where there is a requirement for high availability of data is beneficial and can significantly increase your productivity and efficiency. Similarly, using high-capacity drives where appropriate can help manage growth in data volume without overspending. Finally, investing in Software-Defined Storage solutions can increase automation and reduce management time and costs.
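The workload assessment described above can be thought of as a placement rule: match each workload's performance profile to the cheapest medium that satisfies it. The sketch below is a toy decision rule; the IOPS thresholds and tier names are illustrative assumptions, not vendor recommendations.

```python
# A toy workload-to-media placement rule, in the spirit of the assessment
# described above. Thresholds and tier names are illustrative assumptions.

def choose_tier(iops_per_tb, latency_sensitive):
    """Pick the cheapest medium that satisfies the workload's profile."""
    if latency_sensitive or iops_per_tb > 500:
        return "flash"                      # hot, performance-critical data
    if iops_per_tb > 50:
        return "10k rpm SAS"                # warm, moderately active data
    return "high-capacity nearline drive"   # cold or sequential data

print(choose_tier(1200, latency_sensitive=True))   # → flash
print(choose_tier(20, latency_sensitive=False))    # → high-capacity nearline drive
```

The point of such a rule is cost control: only the workloads that actually need flash pay flash prices, while bulk data growth lands on inexpensive high-capacity drives.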

How well does it support your business: Maintaining business continuity is another important factor to consider while choosing your storage solution. A good disaster-resiliency and back-up system is essential to ensure business continuity, and it also ensures the availability of data during a primary storage or site failure. A robust, automated data protection solution for your storage can additionally support better business management and customer security. Automated storage tiering can improve performance and reduce costs through optimal drive selection and automated data allocation.
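Automated storage tiering, mentioned above, boils down to promoting the most frequently accessed data to a smaller, faster tier. The sketch below is a minimal illustration of that idea; the volume names, access counts and tier size are invented for the example.

```python
# A minimal sketch of automated storage tiering: the most frequently
# accessed volumes are promoted to the (smaller) fast tier.
# Volume names, access counts and tier size are illustrative assumptions.

def tier_volumes(access_counts, fast_tier_slots):
    """access_counts: {volume: accesses}. Returns (fast, slow) volume sets."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    fast = set(ranked[:fast_tier_slots])
    slow = set(ranked[fast_tier_slots:])
    return fast, slow

counts = {"db-logs": 9000, "archive": 12, "mailboxes": 700, "backups": 3}
fast, slow = tier_volumes(counts, fast_tier_slots=2)
print(sorted(fast))  # → ['db-logs', 'mailboxes']
```

Real tiering engines work at a much finer granularity (sub-volume blocks, re-evaluated continuously), but the principle is the same: hot data earns its place on expensive media, cold data does not.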

We live in an age that forces IT departments to manage two major dichotomies when it comes to storage: first, getting the highest-performing storage solutions at the lowest costs for business productivity; second, choosing open, flexible solutions that promise the longevity of your storage investment while at the same time promising security and reliability. Maintaining business continuity while factoring in long-term costs, data growth and the use of newer technologies can help ensure optimal use of your storage infrastructure while satisfying both dichotomies.
