5 must-have IT management technologies for 2010

As more companies expand virtualization deployments and consider cloud computing, the average IT environment will grow ever more complex. For enterprise IT managers in 2010, that means they must update the technologies they use to monitor, manage and optimize the environment.

Industry watchers say some of the biggest challenges facing IT organizations in 2010 are more cultural than technical. From breaking the habit of working in domains, or silos, to aligning IT services more closely with business needs, IT departments face many formidable tasks in 2010. Virtualization and cloud computing, for instance, require action across IT domains and will push IT organizations to break down such barriers to adopting new technologies.

“A big limitation today in achieving the true value of some of the latest tools is IT organization, especially in enterprises. Enterprises work in silos, not only between different domain areas (for instance, network, application, server, desktop and storage) but also within domain areas such as Linux server management, mainframe management, Windows management and virtualization management,” says David Williams, research vice president at Gartner. “This situation is understood and is slowly starting to be addressed with new roles and cross-domain teams being established. In 2010, IT organizations will continue to revisit how they are organized to allow IT operations to become more service-centric and business-aligned.”

Analysts say if the cultural hurdles can be cleared, a handful of tools will make adopting advanced technologies in 2010 easier for the majority of IT departments. Here is a brief look at five technologies industry watchers say could become mandatory for optimized IT service delivery and advanced data center operations in the coming year.

No. 1: IT service assurance

Managing the performance of IT service delivery involves myriad technologies reporting on various perspectives, including the user experience with an application.

That means IT departments need to be able to get visibility into network traffic flows as well as application performance across the multiple components supporting IT services. From advanced discovery technology to traffic flow analysis to transaction monitoring, IT departments need to see the entire path of a service — even as it exits the corporate network and travels through external cloud environments, for instance.

The premise of IT service assurance isn't entirely new and until recently was more commonly a concern for service providers, but enterprise IT organizations have started to evolve into service providers in their own right. Companies such as BMC, CA, HP, IBM and now EMC are touting the ability to provide insight into the life cycle of an IT service. The speed at which companies are adopting and expanding their use of virtualization and the growing interest in internal and external cloud computing environments heightens the need for such technology in 2010.

“One need is to have a true end-to-end picture, which means having comprehensive visibility and control into the quality of experiences for the end user and the quality of service,” says Evelyn Hubbert, a senior analyst with Forrester Research. “This means we need to see how traffic flows across the network, systems, applications and databases, which all are participating in the services. IT organizations realize they need to manage the services rather than the infrastructure.”
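To make that end-to-end view concrete, here is a rough sketch of per-tier probing for a single service. The service name, hosts, ports and latency budget are hypothetical examples; a real service assurance product would add transaction tracing, flow analysis and topology discovery on top of this kind of basic reachability and response-time check.

```python
# Minimal sketch of per-tier response-time probing for one IT service.
# The service name, hosts, ports and threshold below are hypothetical examples.
import socket
import time

SERVICE_PATH = {
    "order-entry": [
        ("web-frontend", "web01.example.com", 443),
        ("app-server", "app01.example.com", 8080),
        ("database", "db01.example.com", 5432),
    ]
}

THRESHOLD_MS = 200  # illustrative per-hop latency budget


def probe(host: str, port: int, timeout: float = 2.0):
    """Return TCP connect time in milliseconds, or None if unreachable."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000
    except OSError:
        return None


def check_service(name: str) -> None:
    """Walk every tier in the service path and flag slow or unreachable hops."""
    for tier, host, port in SERVICE_PATH[name]:
        latency = probe(host, port)
        if latency is None:
            print(f"{name}/{tier}: UNREACHABLE ({host}:{port})")
        elif latency > THRESHOLD_MS:
            print(f"{name}/{tier}: SLOW {latency:.1f} ms")
        else:
            print(f"{name}/{tier}: OK {latency:.1f} ms")


if __name__ == "__main__":
    check_service("order-entry")
```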

No. 2: Virtual systems management

Vendors looking ahead to 2010 seemed to go quiet on virtual systems management in favor of touting cloud management capabilities, but industry watchers say that without support for heterogeneous virtual systems and advanced features covering performance and capacity management, there can be no cloud management.

“Virtualization and automation technologies are directly related to the cloud. Virtual servers comprise the computing environment, and automation is responsible for the cloud being monitored, managed, secured and made compliant,” says Andi Mann, research director at Enterprise Management Associates. “Virtualization is fundamentally mainstream now, and there is a lot of activity around virtual systems management. Niche players are expanding support beyond VMware and enabling their technology to cover more of the enterprise.”

Virtual systems management became mandatory for vendors in 2009, and in 2010 enterprise IT organizations will be certain to equip their toolboxes with multi-hypervisor virtual system support. And while companies such as VKernel, Surgient, Fortisphere, ManageIQ, Embotics and Veeam burst onto the scene with VMware management capabilities in the past few years, analysts say to expect some of these innovative newcomers to be acquired as the larger systems management players look to fill gaps in their product portfolios. (See related story, “10 IT management technology start-ups to watch.”)
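As a rough illustration of what multi-hypervisor support implies for a management tool, the sketch below defines a common adapter interface with stubbed VMware and Hyper-V back ends. The class and method names are hypothetical and stand in for real vendor APIs; the point is that capacity and performance features work against one consolidated inventory rather than one tool per platform.

```python
# Minimal sketch of a multi-hypervisor inventory abstraction.
# The adapter classes are hypothetical placeholders, not real vendor SDK calls.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class VirtualMachine:
    name: str
    host: str
    vcpus: int
    memory_mb: int


class HypervisorAdapter(ABC):
    """Common interface a management tool could expose over different platforms."""

    @abstractmethod
    def list_vms(self) -> list:
        ...


class VMwareAdapter(HypervisorAdapter):
    def list_vms(self) -> list:
        # In practice this would call the vSphere API; stubbed for illustration.
        return [VirtualMachine("web01", "esx-host-1", 2, 4096)]


class HyperVAdapter(HypervisorAdapter):
    def list_vms(self) -> list:
        # In practice this would query Hyper-V; stubbed for illustration.
        return [VirtualMachine("db01", "hv-host-2", 4, 8192)]


def consolidated_inventory(adapters: list) -> list:
    """Merge per-platform inventories into one view for capacity and performance work."""
    return [vm for adapter in adapters for vm in adapter.list_vms()]


if __name__ == "__main__":
    for vm in consolidated_inventory([VMwareAdapter(), HyperVAdapter()]):
        print(vm)
```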

“Expect many acquisitions, especially in the virtualization management space, in which smaller vendors are due for consolidation,” says Mary Johnston Turner, research director at IDC.

No. 3: IT service catalog

As IT departments start to optimize service delivery, they will also be improving how they communicate the services they offer to the user community. Analysts say the trend toward putting available IT services into an easily digestible Web-based IT service catalog will explode in 2010 as IT organizations streamline processes and better align their efforts with business demands.

“Service catalogs are very useful, but with cloud adoption, they become fundamental,” says EMA's Mann. “IT organizations realize they must be able to communicate to end users what they are allowed to get and at what frequency and in some cases for how much. It is hard to imagine broad cloud computing adoption without an IT service catalog.”

The premise of an IT service catalog isn't new, but industry watchers suggest this past year's recession could have given new life to efforts around identifying, describing and publishing a list of IT services for users to consume. Best practice frameworks such as ITIL lay out how IT departments could establish an IT service catalog, and vendors such as BMC, Digital Fuel, newScale, Oblicore and PMG have also developed products to help IT departments create catalogs.
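As a simple illustration of what identifying, describing and publishing services can look like in practice, here is a hedged sketch of a catalog represented as structured data. The service names, prices, turnaround times and role restrictions are invented for the example; commercial catalog products layer request workflows and approvals on top of this kind of record.

```python
# Minimal sketch of a published IT service catalog as structured data.
# Service names, prices and turnaround times here are invented examples.
from dataclasses import dataclass


@dataclass
class CatalogEntry:
    service: str
    description: str
    monthly_cost_usd: float
    fulfilment_days: int
    eligible_roles: tuple


CATALOG = [
    CatalogEntry("standard-vm", "2 vCPU / 4 GB virtual server", 55.0, 1,
                 ("developer", "analyst")),
    CatalogEntry("managed-db", "Shared database instance with backups", 120.0, 3,
                 ("developer",)),
    CatalogEntry("extra-storage", "100 GB additional file storage", 15.0, 1,
                 ("developer", "analyst", "manager")),
]


def services_for(role: str) -> list:
    """Return only the catalog entries a given role is allowed to request."""
    return [entry for entry in CATALOG if role in entry.eligible_roles]


if __name__ == "__main__":
    for entry in services_for("analyst"):
        print(f"{entry.service}: ${entry.monthly_cost_usd}/month, "
              f"{entry.fulfilment_days} day(s) to deliver")
```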

“People are trying to get their service catalogs in order so that their customers know what they can get and how much it will cost. This enables IT operations to structure their people and organize their work around the actual demand,” Forrester's Hubbert says.

No. 4: IT process automation

Virtualization and cloud computing initiatives will rely on automation more than on any other technology. Analysts say IT process automation has already become a mandatory tool for companies deploying virtual servers; add to that the potential of cloud computing services, and IT organizations simply cannot function without it.

“People need automation for everything from provisioning virtual servers to auditing environments to ensuring consistent configurations. On the monitoring side, automation will be able to keep up with the pace of virtual environments and recognize when changes happen in ways a human operator simply could not,” says Jim Frey, research director at EMA. “Automation will even be used to perform analytics and help find potential problems before they harm IT service delivery in these environments.”
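To illustrate one of the use cases Frey describes — ensuring consistent configurations — the sketch below compares a host against a desired state and triggers a stubbed remediation step. The settings, host names and remediation action are hypothetical; a production runbook would sit on top of real provisioning, auditing and change-management APIs rather than these stubs.

```python
# Minimal sketch of configuration-drift detection with automated remediation.
# The settings, hosts and remediation step are hypothetical examples.
DESIRED_STATE = {
    "ntp_server": "ntp.corp.example.com",
    "ssh_root_login": "disabled",
    "log_forwarding": "enabled",
}


def read_current_state(host: str) -> dict:
    """Stub: a real workflow would pull this from the host or a CMDB."""
    return {
        "ntp_server": "ntp.corp.example.com",
        "ssh_root_login": "enabled",   # drifted value for the example
        "log_forwarding": "enabled",
    }


def remediate(host: str, setting: str, value: str) -> None:
    """Stub: a real runbook would push the change via SSH, an agent or an API."""
    print(f"[{host}] resetting {setting!r} to {value!r}")


def enforce(host: str) -> None:
    """Compare current configuration against the desired state and fix any drift."""
    current = read_current_state(host)
    for setting, desired in DESIRED_STATE.items():
        if current.get(setting) != desired:
            remediate(host, setting, desired)


if __name__ == "__main__":
    enforce("vm-web-01")
```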

IT process automation is expected to be in such demand that companies that aren't directly focused on management software are even putting money into the technology. For instance, Microsoft recently acquired IT process automation player Opalis and analysts say the vendor realizes how critical automation will be to IT organizations in 2010. (See related story, “10 big IT management moves in 2009.”)

“IT process automation is a real needed technology and it becomes more important when you talk about virtual systems because virtualization requires rapid responses, it requires things be done at automation speed, not human speed,” Mann says. “Microsoft was one of the vendors in the dark on automation so this acquisition gives the vendor a chance to extend automation to Azure and other cloud environments, because cloud computing requires a level of workflow and orchestration that Microsoft could not have done well in the short-term on its own.”

No. 5: IT resource planning

IT organizations already using virtualization and hoping to explore cloud computing will also need to adopt IT resource planning processes and ultimately technologies in 2010. Combining tenets of capacity planning and financial management as well as usage and service measurement, IT resource planning will enable IT departments to understand how services are being consumed and ensure that even in the most dynamic environments they can respond quickly to business demand.

“Capacity planning today is all about trying to ensure that you have enough capacity and memory cycles to meet workload demand. But virtualization causes new variables to be taken into consideration, and power consumption is just one among many,” says Cameron Haight, research vice president at Gartner. “For IT resource planning (ITRP) there are several more elements to consider and the process must become much more strategic within an enterprise.”

Gartner analysts earlier this year detailed in a report the many variables that must be taken into account for appropriate enterprise IT resource planning. Traditional IT capacity metrics need to be considered alongside business requirements, human capital, financial metrics, facilities and power data, risk and compliance information as well as workload placement. Other considerations include configuration management, asset management, change management, event management and performance management, according to the Gartner report.

“I can imagine the day when the IT capacity planner will have to look at the resources they need on any given day and consider if they should turn to in-house sources or external providers to get the best overall value,” Haight says. “It is not that far from reality that IT organizations will need to get the tools in place to quickly analyze the cost, quality and performance of services from multiple sources and select the best option on a case-by-case basis.”
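As a rough sketch of the case-by-case comparison Haight imagines, the example below scores hypothetical in-house and external options on weighted cost, quality and performance figures. The numbers and weights are invented for illustration; a real ITRP process would also fold in the capacity, facilities, risk and compliance data the Gartner report lists.

```python
# Minimal sketch of scoring workload placement options on cost, quality
# and performance. All figures and weights are invented for illustration.
from dataclasses import dataclass


@dataclass
class PlacementOption:
    name: str
    monthly_cost_usd: float
    availability_pct: float      # quality proxy
    avg_response_ms: float       # performance proxy


OPTIONS = [
    PlacementOption("in-house", 900.0, 99.9, 40.0),
    PlacementOption("cloud-provider-a", 650.0, 99.5, 70.0),
    PlacementOption("cloud-provider-b", 700.0, 99.95, 55.0),
]

WEIGHTS = {"cost": 0.4, "quality": 0.3, "performance": 0.3}


def score(option: PlacementOption) -> float:
    """Higher is better: cheaper, more available and faster options win."""
    cost_score = 1000.0 / option.monthly_cost_usd
    quality_score = option.availability_pct / 100.0
    perf_score = 100.0 / option.avg_response_ms
    return (WEIGHTS["cost"] * cost_score
            + WEIGHTS["quality"] * quality_score
            + WEIGHTS["performance"] * perf_score)


if __name__ == "__main__":
    for option in sorted(OPTIONS, key=score, reverse=True):
        print(f"{option.name}: score {score(option):.2f}")
    print(f"Best placement for this workload: {max(OPTIONS, key=score).name}")
```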
