Microsoft on Monday announced its latest software release, Windows HPC Server 2008, at the 2008 High Performance on Wall Street conference in New York. The software, aimed at industries like financial services, marks Microsoft's latest entry into the high-performance computing (HPC) market.
The software is designed to give firms an easy-to-deploy, cost-effective and scalable HPC solution at a time when companies are seeking more efficiency from their IT resources without undercutting their competitive position in the market, said Bill Laing, corporate vice president of Microsoft's Windows Server and Solutions Division.
The announcement comes in the wake of news last week that supercomputer manufacturer Cray and Microsoft have teamed to offer a deskside-sized supercomputer for less than US$60,000. Those machines will come preloaded with Windows HPC Server 2008.
HPCs on Deck
HPC Server 2008 picks up for Microsoft where Windows Compute Cluster Server 2003 (CCS) left off. CCS was the first HPC cluster technology offering from the company, designed to enable businesses to deploy multiple computers in a high-performance compute cluster in order to achieve supercomputing speeds.
Based on Windows Server 2008, HPC Server 2008 offers administrators simplified deployment, more productive systems administration, and improved cluster interoperability. The software will also speed application development through its integration with Visual Studio 2008.
It also supports standard interfaces, including OpenMP, the Message Passing Interface (MPI) and Web services, along with third-party numerical libraries, performance optimizers, compilers and debugging toolkits.
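To illustrate the programming model these interfaces standardize: in MPI, each process (a "rank") computes a piece of the problem and exchanges results with explicit send/receive and gather operations. Real MPI code requires an MPI runtime (via C/Fortran bindings or a wrapper such as mpi4py); the sketch below is only a loose, standard-library analogue of that message-passing pattern, with illustrative names, not actual MPI.

```python
# Conceptual sketch of the send/receive + gather pattern that MPI
# standardizes, using only Python's standard library. In real MPI code
# the send and the final gather would be MPI_Send and MPI_Gather calls.
from multiprocessing import Process, Pipe

def _worker(rank, chunk, conn):
    # Each "rank" computes a partial sum over its own slice of the data
    # and sends it back to rank 0 (analogous to MPI_Send).
    conn.send(sum(range(rank * chunk, (rank + 1) * chunk)))
    conn.close()

def parallel_sum(nranks=4, chunk=10):
    """Distribute a summation across worker processes and gather the
    partial results (analogous to an MPI_Gather at rank 0)."""
    conns, procs = [], []
    for rank in range(nranks):
        parent, child = Pipe()
        p = Process(target=_worker, args=(rank, chunk, child))
        p.start()
        conns.append(parent)
        procs.append(p)
    total = sum(conn.recv() for conn in conns)  # gather step
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum())  # sums 0..39 across 4 ranks
```

The same decomposition, with shared memory instead of message passing, is what OpenMP provides within a single multi-core node.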
How Super, Really?
The term “supercomputer” has lost a great deal of its power lately since most “high-performance computing” is done with clusters of small computers that can be indistinguishable from those running non-HPC workloads, explained Gordon Haff, an Illuminata analyst.
“Microsoft and Windows have limited presence in ‘classic’ HPC — large pools of systems in academia or national research labs,” he told TechNewsWorld.
That said, however, more and more HPC workloads are being run in regular companies that design and build products, Haff continued.
“These are mostly smaller installations than you find at a Los Alamos [National Laboratory], but they’re still huge computing resources by historic standards,” he noted.
As Microsoft owns some 90 percent of the traditional desktop computing environment and offers both users and developers a high level of familiarity, its push into the HPC market should start with those commercial sites, he said.
“Whether for reasons of familiarity, developer tools, or software compatibility, these sorts of sites are often more amenable to Windows than is the case elsewhere,” Haff concluded.