
Four more years

Even the best storage products must eventually be retired – a fact that many belt-tightening CIOs are reluctant to acknowledge. However, arguments abound that a well-defined product lifecycle, with regular updates, can actually save more money than struggling on with ageing storage. CNME gets to the bottom of the matter.

Some things are inevitable every four or five years. Certain countries’ political elections, for example. Or World Cups and Olympic Games, for the sports fans out there. For CIOs, however, the same interval signals time for a technology refresh.

The differences are perhaps not as drastic as you would expect, especially between a national election and a technology refresh. Both often involve a realisation that something – whether it be a government or an enterprise’s storage – has grown tired and needs to be replaced, and that not doing so could hinder processes, as well as the ability to leverage new opportunities for growth.

But one key difference is that while elections and sporting events generally generate a lot of excitement, the same, unfortunately, cannot be said for storage updates.

That said, a refresh is still an inevitable undertaking, and one that can have significant implications if it is put off by a CIO trying to dodge the costs.

“In the few cases where organisations delay their technology refresh activities in the name of cutting costs, the implications are twofold,” says Zaher Haydar, Regional Pre-Sales Manager, Turkey, Emerging Africa and the Middle East, EMC.

“Holding onto ageing technologies obviously results in these organisations missing out on the ability to optimise their IT infrastructure, which in turn hinders agility and efficiency. In addition, these firms will report higher TCOs than organisations that invest in technology refreshes, on account of maintenance and service costs as well as other operational expenses.”

For instance, Haydar continues, a firm with six- to seven-year-old storage systems will not have the dynamic tiering capabilities of technologies that are available today.

“These older systems operate on smaller disks, and consume more power and physical space, which ultimately drives OPEX higher. A technology refresh will allow these firms to deploy newer storage technologies that are not only more energy efficient but also more compact, saving physical space with larger disks and boosting the performance of the overall infrastructure by leveraging flash (SSD) technologies.”

In general, a storage refresh improves the efficiency of storage infrastructure by up to 40 per cent, Haydar claims.

Savitha Bhaskar, General Manager, Condo Protego, points to more examples of why holding off a refresh to cut costs can be a misguided approach.

“There are considerable benefits for businesses that refresh in an appropriate, timely manner,” she says. “For example, day-to-day operations are likely to be far more efficient, which can save both time and money.

“Although enterprise-class technologies are built to last a lot longer than the average three-year warranty, ageing equipment tends to have a higher rate of component failures resulting from simple wear and tear. There are also considerable cost implications for maintaining out-of-warranty equipment, and the obvious reputational hits that result from business downtime or an inability to handle increasingly complex operations or data loads on older storage platforms.”

Lease resistance

The age of ‘IT transformation’ has changed the relationship between CIOs and CEOs, meaning the age-old tussle, in which the CEO discourages a refresh because the current technology appears to work fine, has become less common.

In recent years, senior management has recognised the growing role of IT as a business driver that can open up competitive opportunities for revenue growth and profitability. As such, CIOs should hopefully face less resistance from topline-obsessed CEOs.

However, CIOs are under new pressure to deliver IT-as-a-service and provision cloud-based architectures to counter external competition from service providers, which constantly offer new services like software-as-a-service and storage-as-a-service, Haydar says.

“Today, technologies that enable this transformation model are part of a company’s strategic roadmap,” he says. “For instance, while IT-as-a-service optimises resource utilisation, agility, and efficiency, big data and real-time analytics enable new opportunities for competitive differentiation and revenue growth. Today, CEOs invest in IT to further drive their business.”

Bhaskar agrees it shouldn’t be too difficult for CIOs to present a compelling business case for upgrades, but believes the pressure they feel to adapt to the exponential growth of data is mounting rapidly.

“If you can’t scale, you are going to get left behind,” she says. “Playing catch-up is likely to be far more expensive and damaging than staying ahead of, or at least keeping pace with, the curve.

“A tech refresh should not be seen as an irritating expense, but as a chance to change operations for the better, to eliminate past mistakes and to build a platform for a better-run, more profitable business.”

Fortunately, the shift towards software-defined data centres (SDDCs) is expected to make the whole technology cycle a lot easier, as CIOs begin to see hardware technology of different generations co-exist with newer software solutions.

“SDDCs, where the hardware is standardised and disassociated from the software that holds the intelligence, will see us invest in fewer hardware refresh cycles and more software updates,” Haydar predicts.

Breaking the mould

The way IT strategies are planned, most CIOs can foresee upcoming changes in technology and plan for them accordingly. They are therefore in a position to predict the advent of any transformational technologies and to time their refresh cycles to take advantage of them.

The advent of SDDCs, meanwhile, makes it even easier for CIOs to embrace new technologies without breaking the lifecycle mould, as they enable easier integration of newer solutions with older systems without disturbing the overall infrastructure.

This is especially significant when considering the increasing impact that big data will inevitably have on storage lifecycles.

Big data is posing a major storage challenge for businesses across the Middle East, which now need to invest not just in the right systems to store all this data, but also in the means to derive business value from it and drive competitive advantage in the long run.

IDC predicts that the global big data market will grow 40 per cent per year, around seven times as fast as the rest of the IT industry. Most of that spending will come from infrastructure investment, with storage projects set to drive the storage market to growth rates above 61 per cent through 2015.

IDC also expects the digital universe to reach 40 zettabytes (ZB) by 2020, an amount that exceeds previous forecasts by 5 ZB, resulting in a 50-fold growth from the beginning of 2010. And it anticipates emerging markets to supplant the developed world as the main producer of the world’s data.

Furthermore, with big data sets growing by an average of 60 per cent per year or more, according to IDC figures, business research specialist Aberdeen Group suggests that many companies will have to double their data storage capacity every two-and-a-half years just to keep up.

“As the volume of data continues to grow exponentially, businesses need to revise their lifecycle management policies to stay relevant and derive the maximum business benefit,” Haydar says.

To help with this process, information lifecycle management (ILM) has emerged as an approach to enterprise storage designed to align business needs and storage practices by basing storage infrastructure decisions largely on the value of the information being stored.

For example, by storing less valuable information on less expensive storage infrastructure, ILM promises economic benefits while maintaining sufficient access to information and acceptable service levels for enterprise applications.
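To make the principle concrete, the short Python sketch below shows how such a value-based placement rule might look. It is a minimal illustration only: the tier names, thresholds and the DataAsset structure are assumptions made for the example, not any particular vendor’s ILM implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataAsset:
    name: str
    business_value: int       # 1 (low) to 5 (critical), assigned by the business
    last_accessed: datetime

def choose_tier(asset: DataAsset, now: datetime) -> str:
    """Place an asset on a storage tier according to its value and how recently it was used."""
    idle = now - asset.last_accessed
    if asset.business_value >= 4 and idle < timedelta(days=30):
        return "flash"      # hot, high-value data on the fastest (most expensive) tier
    if asset.business_value >= 3:
        return "sas"        # important but cooler data on mid-range disk
    if idle < timedelta(days=365):
        return "nearline"   # low-value data that is still occasionally read
    return "archive"        # cold, low-value data on the cheapest tier

if __name__ == "__main__":
    now = datetime.now()
    assets = [
        DataAsset("orders-db", 5, now - timedelta(days=1)),
        DataAsset("2009-email-archive", 1, now - timedelta(days=900)),
    ]
    for asset in assets:
        print(f"{asset.name} -> {choose_tier(asset, now)}")
```

In a real ILM deployment this kind of classification would feed an automated policy engine rather than hand-written rules, but the idea is the same: the cost of the tier should track the value of the information it holds.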

“By definition, the rate of big data growth exceeds the capabilities of traditional IT infrastructures and represents largely greenfield computing and data management problems for customers,” Haydar says.
