
Virtualisation's dirty little secrets

The sheer number of potential missteps has Doug Dineley, executive editor of the InfoWorld Test Center (one of CPI’s US partners), shaking his head. “Virtualisation offers irresistible benefits and also the opportunity to drown.”

It can be shocking to suddenly realise that your IT staff is woefully unprepared for virtualisation and needs training. Or maybe you’ll stumble out of the gate, not knowing that it takes at least a month to get a grip on your server environment. You might be pressed to free up money to cover hidden costs or purchase new equipment – yes, new servers will likely be needed for what’s supposed to be a server consolidation project. Even if you navigate these and other pitfalls, you’ll likely be blindsided by virtualisation vendors’ over-the-top performance claims.

Server virtualisation breaks up the marriage of hardware and software (in this case, between the physical system and operating system software) and thus allows a single physical server to host many virtual servers running different operating systems. The benefits of this basic capability border on computing nirvana, not the least of which is server consolidation. For instance, IBM started moving the workload of its 3,900 servers to 30 virtualised System z9 mainframes running Linux. Big Blue expects to cut energy consumption by 80%, or more than $2 million in energy costs. Meanwhile, NetApp consolidated 343 servers to 177 via virtualisation and replaced 50 storage systems with ten new ones.

Indeed, the front lines are awash with server virtualisation success stories – and the drumbeat grows louder every day. VMware, EMC’s high-flying virtualisation unit, raised nearly $1 billion in its IPO on the strength of a highly regarded product. Citrix Systems recently acquired server virtualisation vendor XenSource. And market research company Gartner has called virtualisation “the most important trend for servers through 2012”.

And Microsoft wants to shake up the virtual world with its Hyper-V, a virtual machine manager, or ‘hypervisor’, built into Windows Server 2008. As it comes ‘free’ as part of the operating system and Microsoft has integrated virtual machine management into its pantheon of management tools, virtualisation is sure to be a hit with Windows shops.

Marketing buzz aside, the truth is that server virtualisation fundamentally changes the way a data centre looks and feels, and no major transformation comes easy.


Here’s what to look out for!

One of the great ironies of server virtualisation is that many people expect the technology to save them boatloads of money from the outset when, in fact, it often costs them more. That’s because server virtualisation demands two things: shared storage and new servers that are powerful, richly configured and loaded with memory.

Even if you already have these souped-up servers, you’re still not out of the woods. Server interoperability issues stymie many virtualisation journeys. According to Chris Wolf, an analyst at the Burton Group, “You cannot move a virtual machine between [Intel and AMD machines] without restarting.”
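That restriction makes a pre-flight check worthwhile before any live migration. Here is a minimal Python sketch using the libvirt bindings – the host URIs are placeholders, not any vendor’s prescribed procedure – that compares the CPU vendor reported by two hosts; if they differ, plan for a restart rather than a live move.

import xml.etree.ElementTree as ET
import libvirt

def host_cpu_vendor(uri):
    # Read-only connection: we only inspect the host's capabilities XML.
    conn = libvirt.openReadOnly(uri)
    try:
        caps = ET.fromstring(conn.getCapabilities())
        return caps.find("host/cpu/vendor").text  # e.g. 'Intel' or 'AMD'
    finally:
        conn.close()

src = host_cpu_vendor("qemu+ssh://source-host/system")  # placeholder URI
dst = host_cpu_vendor("qemu+ssh://dest-host/system")    # placeholder URI
if src != dst:
    print(f"CPU vendor mismatch ({src} vs {dst}): "
          "expect a restart, not a live move.")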

The same goes for a storage area network, or SAN. Not every SAN supports a virtualised environment. Also, existing network bandwidth may not be sufficient to handle the needs of a growing number of virtual servers. This means you’ll likely end up spending money on new servers, switches and other tech gear. Even worse, upgrade costs can offset nearly all the initial savings from decommissioned servers.

When the server virtualisation wave began to crest, industry watchers thought that the server market would be in a lot of trouble. After all, virtualisation allows people to consolidate many applications onto fewer servers – preferably existing ones. And they were partly right: Gartner believes that virtualisation reduced the x86 server market by 4% in 2006.

But it soon became apparent that you needed to strictly standardise on hardware for your virtual farm – and standardising usually means buying new, matching servers rather than squeezing more life out of old ones. Thus the server market remains strong and, according to IDC, is still growing.

Most people tackle hardware standardisation and server virtualisation slowly, usually when servers are due for retirement. They dabble in noncritical areas such as print servers before moving on to e-mail applications and enterprise databases. “It’s a rolling-thunder approach,” says John Humphreys, an IDC analyst. “We’ll start to see the impact on server unit growth two, three, or four years down the road, as more people virtualise.”

IDG Research Services, a sister unit of CPI’s partner InfoWorld, surveyed 464 participants about their virtualisation experience. The biggest challenge? 44% of respondents said inadequate skills and training was the most difficult hurdle, followed by software licensing issues, performance and scalability challenges, and complexity.

So don’t expect the IT staff to have all the answers to virtualisation from the start. It’ll take at least a month to gain an accurate understanding of current server workloads, given weekly and monthly spikes, before deciding which servers can be virtualised. In small companies with only a handful of IT folks, you may need to hire – surprise! – a pricey consultant to conduct capacity planning.
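As a rough illustration of what that month of measurement feeds into, here is a back-of-the-envelope sketch in Python – all server names and figures are hypothetical – that flags machines whose sustained CPU load leaves headroom for consolidation. Real capacity planning would draw on a full month of samples and weigh memory and I/O as well.

def p95(samples):
    # 95th-percentile utilisation over the measurement window,
    # which smooths one-off blips but keeps recurring spikes visible.
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

workloads = {  # server name -> CPU utilisation samples (%), hypothetical
    "print-01": [3, 5, 4, 8, 6, 2, 7],
    "mail-01": [40, 55, 62, 48, 71, 66, 58],
    "db-01": [85, 92, 88, 95, 90, 87, 93],
}

for name, samples in workloads.items():
    peak = p95(samples)
    verdict = "good candidate" if peak < 50 else "needs closer study"
    print(f"{name}: p95 CPU {peak}% -> {verdict}")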

A small company also may not have the necessary SAN expertise or the capability to mesh Cisco switches with VMware’s complex virtual networking stack. “Virtualisation draws together so many different aspects of networking, server configuration and storage configuration that it requires a well-seasoned jack-of-all-trades to implement successfully in a small environment,” explains InfoWorld contributor Matt Prigge.

Larger companies don’t have it easier, either. Getting a lot of people in disparate teams – server, storage, business continuity, security – on the same page is a feat, especially since they traditionally don’t talk to each other very much. All of them, though, need to be educated about virtualisation. If there’s a problem with an application, for instance, an administrator must know where virtual machines exist throughout the server farm so that he doesn’t reboot a server and unwittingly take down all the virtual machines on it.
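A simple pre-maintenance check along these lines can be scripted. The following minimal Python sketch uses the libvirt bindings (the host URI is a placeholder) to list the guests that a reboot would take down:

import libvirt

# Read-only connection to the host slated for maintenance.
conn = libvirt.openReadOnly("qemu+ssh://some-host/system")  # placeholder URI
active = conn.listAllDomains(libvirt.VIR_CONNECT_LIST_DOMAINS_ACTIVE)
if active:
    print("Rebooting this host would take down:")
    for dom in active:
        print("  " + dom.name())
else:
    print("No running guests; safe to reboot.")
conn.close()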

Finally, despite the hard work, virtualisation adopters may feel a sting of disappointment. Many will have embraced server virtualisation with grand expectations, only to see performance fall short. Burton Group’s Wolf points the finger at vendors: “For me, the way VMware advertises performance benchmarks is completely inaccurate.”

The benchmarks in vendor publicity materials typically involve running a single virtual machine on a single physical host, whereas a typical production environment runs, conservatively, eight to 12 virtual machines per host. “This paints an overly optimistic picture of performance,” Wolf adds. “They also tend to gloss over things like over-allocation of CPU cores” that can tax the hypervisor’s CPU scheduler and lower performance.
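The arithmetic behind Wolf’s objection is easy to sketch; the figures below are illustrative, not drawn from any published benchmark.

physical_cores = 8
vms_per_host = 10   # mid-range of the eight to 12 cited above
vcpus_per_vm = 2

allocated = vms_per_host * vcpus_per_vm
ratio = allocated / physical_cores
print(f"{allocated} vCPUs on {physical_cores} cores "
      f"-> {ratio:.1f}:1 overcommit")
# 20 vCPUs on 8 cores is 2.5:1, so the hypervisor's scheduler must
# time-slice - overhead a one-VM marketing benchmark never incurs.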

Memory is another big performance-buster, Wolf says, especially when virtualising multithreaded applications. When separate threads within a guest operating system continually update memory, the hypervisor’s shadow page tables struggle to keep pace. The result: latency. For applications that rely heavily on memory, latency spikes and application responsiveness deteriorates. Users may start seeing connection timeouts.

The fallout of lacklustre performance can be huge. A company might have to fork out more cash for servers. Business execs may demand that applications be given their own servers again. “It may take a couple of years to restore trust in virtualisation technology before a company attempts to virtualise again,” Wolf says.

Poor performance, unprepared staff, and hidden costs are only a sampling of the pitfalls in server-virtualisation adoption. Managing the whereabouts of virtual machines can be a nightmare, given that they can be moved from one physical server to another, or even walk out the door on a portable hard disk. Security risks abound, too. Audit failures due to the lack of full separation of security zones can happen more easily in a virtual environment.

And then there’s the threat of virtual server sprawl – new applications are easy to get up and running in a virtual world. “Virtualisation increases your appetite for software,” IDC’s Humphreys says. “One company went from 1,000 applications to almost 1,300.” Not only are there potential software licensing costs involved but also the task of tagging and tracking the applications.
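In practice, ‘tagging and tracking’ can start as modestly as a flat inventory recording an owner and an application for every virtual machine. The sketch below, with hypothetical names throughout, shows how licence counts and untracked guests surface from even that much.

from collections import Counter

inventory = [  # hypothetical records: one per virtual machine
    {"vm": "web-07", "owner": "marketing", "app": "CMS"},
    {"vm": "web-08", "owner": "marketing", "app": "CMS"},
    {"vm": "bi-02", "owner": "finance", "app": "reporting"},
    {"vm": "test-19", "owner": "unknown", "app": "unknown"},  # classic sprawl
]

by_app = Counter(entry["app"] for entry in inventory)
print("Instances per application (each may carry a licence):", dict(by_app))

untagged = [e["vm"] for e in inventory if e["owner"] == "unknown"]
print("VMs with no owner on record:", untagged)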

Of course, server virtualisation’s plethora of pitfalls won’t stop people from adopting the technology – the benefits of a good implementation are simply too great. But knowing how to identify and avoid the gotchas makes the journey more pleasant and the reward that much sweeter.

