High performance computing is not just for academicians and scientists anymore. This application of computer technology to highly complex scientific and engineering workloads is making its way into commercial settings as companies recognize its value as a competitive tool.
HPC reduces the time it takes to get results and products to market, speeds up design and analysis, and otherwise gets jobs done faster, according to Sun Microsystems. It improves product quality, increases the utilization of computing resources, and results in a greater return on investment, the company contends.
Indeed, HPC is being used in non-traditional areas like the stock market, the auto industry and pharmaceuticals, but will it find its way into the mainstream?
TechNewsWorld caught up with John Fowler, executive vice president of network systems at Sun Microsystems, to discuss why HPC is moving from the lab to the enterprise, what is to be gained by embracing it, and other server trends in 2006.
TechNewsWorld: Academicians and scientists traditionally used HPC solutions. What is changing this?
John Fowler: High performance computing has actually been a big part of the enterprise since the beginning. It just doesn’t get a lot of press. When most people think about high performance computing, they think about what I call grand challenge problems, like forecasting weather or doing complicated work at national labs. That’s a common perception, but high performance computing has been a part of technology for a long time.
Automotive design and development is a huge area in which HPC is used. Engineers can simulate crash tests, develop mechanical modeling for viability, and perform all sorts of other tasks. Another example is aviation. Boeing planes are designed and modeled on computers long before they ever fly. That’s quite a change, because 10 or 20 years ago they could only model for portions of the design.
High performance computing is also used in banking and financial services, pharmaceuticals, energy, and oil and gas. These areas just don’t get quite the same press as somebody researching a solution to a traditional laboratory problem.
TechNewsWorld: Can you offer some specific examples from these industries you mentioned?
Fowler: We work with Clemson University. The school is developing new software to help people do simulations in automotive design. In working with BMW, Clemson researchers discovered that your perception of the quality of your car depends on certain inaudible sounds that reach your ears as you drive it.
BMW used to build a car, put hundreds of microphones inside the car, drive the car around and record all of the microphone data. Then they would check the frequencies and make changes to how they designed new cars. Through a process of trial and error, they refined the sound inside the car. Clemson has developed software to completely simulate that process. So when BMW builds the car, it has actually predicted what it will sound like to you when you drive it.
If you look at financial services, like hedge fund analysis, banks are getting more and more sophisticated software to predict the movements of currency. Obviously, this kind of software is worth significant amounts of money to banks because it increases the payoff from using computers in these kinds of activities. Fraud detection is also a large area of concern. Software allows you to take massive amounts of data and look at the correlations to determine if a credit card is stolen.
Another big area is energy. All of the oil and natural gas companies do simulations to find oil reserves. Because of energy shortages, the oil companies need to go back into old fields and extract new amounts of oil. Obviously, it takes a significant amount of money to go sink a drill two miles down. So they use geologic simulations to figure out exactly where the oil well is before they even put a pickaxe in the ground.
Pharmaceuticals are a huge area for high performance computing. The overall pharmaceutical industry spends more than US$52 billion a year on drug development. It is an expensive, multi-year proposition that ultimately involves human trials. Software has become sophisticated enough to predict what combination of molecules will result in a drug and what the side effects of the drug would actually be on humans. If you can use computers to eliminate years from the drug development cycle and reduce trial and error by narrowing down how the drug might behave, there is a huge payoff.
TechNewsWorld: Why is Sun so intent on pursuing HPC?
Fowler: More and more software is becoming available that allows companies to do more in high performance computing. Sun has a long history of HPC. For us, high performance computing is a clear business opportunity. Through our entire history, we have found that technology in HPC ends up going through a classic lifecycle. It gets developed, often at academic institutions, and used for these big hypothetical problems. Then it gets used by the high performance computing guys in the various businesses we talked about. Most of those technologies end up being broadly used in the enterprise over time.
TechNewsWorld: What are the barriers to enterprise adoption of HPC?
Fowler: The challenge is always going to be software. The primary issue is availability of applications to reach broader and broader segments. Also, most companies do not have the technical capabilities to run high performance computing facilities of their own. It takes a lot of power and data center space. That's one of the reasons why we offer our grid capability.
TechNewsWorld: Do you expect HPC to enter the mainstream of enterprise computing?
Fowler: What we just don’t realize is that high performance computing is already part of our everyday lives. I am sitting here holding a BlackBerry, and I have a Bluetooth headset. All of the components in this BlackBerry — the BlackBerry itself, the Bluetooth headset, all the semiconductors that went into it — have been built using high performance computing. Is somebody going to write an article about how high performance computing enabled BlackBerry? I don’t think so, but you do see things in your daily life that are enabled by high performance computing.
TechNewsWorld: On another note, IDC recently predicted that server virtualization would be a hot topic in 2006. Do you agree with that statement?
Fowler: Yes. People have gone through a long history of one application per box. That's pretty common in the IT infrastructure. It means you have a lot of pieces of hardware sitting around idle that you have to purchase and maintain. Server virtualization is one of the things people are using to combine applications on a platform so that one physical piece of hardware can be used for more than one thing in a cost-effective way.
TechNewsWorld: What are the latest trends in server virtualization? Will we see anything new?
Fowler: The primary trend we are seeing is the move toward bigger, more powerful machines. When you can combine several applications on a single machine, you gain some cost efficiency there, so interest in bigger machines goes up. We certainly have seen that ourselves, and it will be a trend in 2006. Instead of one- and two-processor machines, people will more often want four- and eight-processor machines, because there is an economy of scale. That’s something that will happen in the next six to eight months.