Microsoft has hit another high-performance computing milestone: for the first time, a system running its server software has placed in the top 10 of Top500.org's list of the world's 500 fastest supercomputers.
Just a year ago, the best Microsoft could do was 116th place in the rankings from Top500.org, which has been benchmarking supercomputers since 1993 with the twice-yearly tests it calls “runs.”
Windows HPC Server 2008, a 64-bit system that shipped Nov. 1, came in at No. 10, achieving 180.6 teraflops at 77.5% efficiency on a Dawning Information Industry Co. system at the Shanghai Supercomputer Center.
Despite the high ranking, Microsoft's biggest high-performance computing challenge likely still lies ahead: creating easy-to-use developer tools for writing applications for the platform.
The company's HPC strategy is to simplify high-end computing by cutting cost and complexity, and surrounding the platform with Microsoft's collection of applications, management wares, development tools and independent software vendor (ISV) community.
Microsoft currently lays claim to less than 5% of HPC server market revenue, according to IDC. Those numbers compare with 74% for Linux and just over 21% for Unix variants.
In addition, competitors are circling: Red Hat has been offering its Enterprise Linux for HPC Compute Nodes since last year, IBM is also in the mix, and Sun re-entered the HPC fray late last year with its Constellation System.
The next major milestone for Microsoft will come in the next year when it releases Visual Studio 2010, which was introduced last month at its Professional Developers Conference (PDC) and includes features that make it easier to design for parallel computing.
“The importance that development tools play in all of this can't be overestimated,” says Charles King, principal analyst with Pund-IT. “The money and the effort Microsoft is putting into developing Visual Studio and other tools is really critical to making this work. Clustered systems have been around quite a bit, but one reason Linux has been such a popular platform is the complexity of writing for these environments: the easy customization of Linux allowed people in the know to get in there and design, build and tweak the system to maximize performance.”
One of the HPC-related features coming in Visual Studio 2010 is .Net Parallel Extensions, which is designed to spare developers the need for specialized knowledge when writing parallel code. Also included to ease the transition to parallel code are the Task Parallel Library, Parallel LINQ and Coordination Data Structures for managed code.
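The idea behind these libraries — expressing a loop as independent work items that a runtime fans out across processor cores, rather than managing threads by hand — can be sketched with an analogous example. The following is illustrative Python using only the standard library, not the .Net Parallel Extensions APIs themselves:

```python
# Illustrative analogue of a parallel-for construct (not the .NET Task
# Parallel Library): independent iterations of a loop are partitioned
# across worker processes by a pool, with no manual thread management.
from concurrent.futures import ProcessPoolExecutor

def simulate(i):
    # Stand-in for an independent, CPU-bound work item.
    return i * i

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # map() distributes the iterations across available cores,
        # much as a parallel-for loop would.
        results = list(pool.map(simulate, range(8)))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of such libraries is that the partitioning and scheduling happen inside the pool, so the loop body reads almost identically to its sequential version.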
Microsoft also released a preview last month of F#, a functional language intended to make parallel code easier to write.
Microsoft has also added an SOA broker to HPC Server 2008 to aid in running cluster-enabled applications, and the vendor has doubled the number of ISVs committed to its HPC platform in the past year.
With the release of HPC Server 2008 a few weeks ago, Microsoft also offered an academic version priced at $15 per node to generate interest. By comparison, a commercial license runs $450 per node.
Microsoft also recently unveiled a hardware-software partnership with Cray on the CX1 “personal” supercomputer, a $25,000 machine aimed at financial services, aerospace, automotive, academia and life sciences.
Microsoft is also bringing IT operations into the equation. On Oct. 29 the company released the HPC Management Pack for System Center Operations Manager, integrating its System Center tools for application-level monitoring and rapid provisioning.
“The big discussion here is around productivity,” says Vince Mendillo, director of the HPC division at Microsoft. “It's not an OS play. We are bringing to bear all the technology to take productivity up a notch for information workers, scientists, financial analysts and others.”
Microsoft is betting users such as engineers will combine workflows running on their Windows workstations with Windows-based back-end HPC clusters, or move those workloads off the desktop altogether and into an HPC infrastructure.
Microsoft also envisions desktop/back-end combinations in which, say, an Excel user performs a function call from the desktop that, in the background, executes an agent running computational algorithms on a networked HPC cluster and returns an answer. The user would be unaware of the back end tied to Excel, which is widely used in financial services.
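That round-trip — a local function call that silently dispatches the computation to a remote service and hands back the answer — can be sketched as follows. This is a minimal, hypothetical illustration in Python, with XML-RPC standing in for the transport; none of the names here come from Microsoft's actual SOA broker or Excel integration:

```python
# Hypothetical sketch of the transparent-back-end pattern described
# above: the caller invokes what looks like a local function, while the
# heavy computation actually runs on a networked service. The service,
# transport (XML-RPC) and function names are illustrative assumptions.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def run_model(values):
    # Stand-in for a computational algorithm executed on the cluster.
    return sum(v * v for v in values)

# "Cluster" side: expose the algorithm as a callable service.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(run_model)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Desktop" side: to the caller this reads like a local function,
# much as a spreadsheet function call would.
def model_result(values):
    return ServerProxy("http://127.0.0.1:%d" % port).run_model(values)

print(model_result([3, 4]))  # prints 25
```

The design point is that the proxy hides where the work runs, so swapping a local computation for a cluster-backed one requires no change to the calling code.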