The growing interest in PC desktop virtualization strategies has roots in both technology and economics. Recent technical advances have matured the performance and economic benefits of desktop virtualization and of thin-client devices.
At the same time as this functional maturity has improved, the market is approaching an inflection point, with buyers increasingly open to new clients and new client approaches like desktop virtualization.
Indeed, the latest desktop virtualization model empowers enterprises with lower total costs, greater management of software, tighter security, and the ability to exploit low-cost, low-energy thin client devices. It’s an offer that more enterprises are going to find hard to refuse.
In desktop virtualization, the workhorse is the server, and the client assists. This allows for easier management, support, upgrades, provisioning and control of data and applications. Users can also take their unique desktop experience to any supported device, connect and pick up where they left off. And there are now new offline benefits too.
Here to help us learn more about the role and outlook for desktop virtualization, we’re joined by Jeff Groudan, vice president of Thin Computing Solutions at HP. The BriefingsDirect interview is conducted by Dana Gardner, principal analyst at Interarbor Solutions.
Listen to the podcast (29:23 minutes).
Here are some excerpts:
Jeff Groudan: There certainly are some things in the market that are surely driving a potential inflection point [for client virtualization]. The market conditions coming out of the recession are prompting a lot of customers to revisit deployments they may have delayed or specific IT projects that they had put on hold.
Just to put it into context, there was recently some data from Gartner. They feel like there are well over 600 million desktop PCs in offices today. Their belief is that over the next five years, upwards of 15 percent of those could be replaced by thin clients. So that’s quite a number of redeployments and quite an inflection point for client virtualization.
In addition, there has been an ongoing desire to increase security, along with a lot of new compliance requirements that customers have to address. And, as they look for ways to save on costs, they are constantly seeking more efficient ways to manage their distributed PC environments. All of these things are driving the high level of interest in virtualizing PCs.
One of the key benefits of client virtualization is the ability to keep all the data behind the firewall in the data center and deploy thin clients to the edge of the network. Those thin clients, by design, don’t have any local data.
You’re also seeing better performance on the hardware side and the infrastructure side. That’s helping bring the cost per seat of a client virtualization deployment down into ranges that are a lot more interesting for large deployments. Last, and near and dear to my heart, you’re seeing more powerful, yet cost-effective, thin clients that you can put on the desk and that really ensure end-users get the experience you want them to get.
Not a Panacea
Our general coaching to customers is that client virtualization isn’t necessarily for everyone, for every user group, or for every application set. But it is certainly a fit for environments that need to be more manageable and more flexible.
You need higher degrees of automation to manage a large number of distributed PCs, with the benefits of centralized control, reduced labor costs, and the ability to manage remote or hard-to-reach locations, such as branch offices where you don’t have local IT. Those are great targets for early client virtualization deployments.
All of a sudden, the data-center guys need to be thinking about the end-user. The end-user guys need to be thinking about the data center. Roles and responsibilities need to be hammered out. How do you charge the capital expense versus operational expense? What gets budgeted where? My advice is, as you’re thinking about the technical architecture and all of the savings end-to-end, you need to also be thinking about the internal business processes.
We look at this market in two ways: in the context of client virtualization and in the broader context of thin computing. Zeroing in on the first, client virtualization is what we call it at HP. It’s desktop virtualization. It’s the same animal.
We look at it as a specific set of technologies and architectures that disaggregate the elements of a PC, allowing customers to more easily manage and secure their environment. What we’re really doing is taking advantage of a lot of the new software capabilities that matured on the server side, from a server virtualization and utilization perspective. We’re now able to deploy some of those technologies, hypervisors, and protocols on the client side.
The first is that you don’t want customers having to figure out how to architect this stuff on their own. If you think about PCs 20 or 25 years ago, customers didn’t know how to architect a distributed PC environment. Over 25 years, everybody has gotten good at it. We’re still at the early stages with client virtualization.
Our specific objective is figuring out how to simplify virtualization, so that customers get past the technology and really start to deliver the full benefit of virtualization, without all the complexity.
So our focus is to deliver more complete, integrated solutions, end to end from the desktop to the data center, lay it all out, and provide reference designs so customers can very comfortably understand how to build out a deployment. They may well want to customize it, but we want to get them 80 to 90 percent of the way there just by sharing what we have learned.
Wide Applicability Across Industries
There are opportunities for just about every industry. We’ve seen certain verticals on the cutting edge of this. Financial services, healthcare, education, and public sector are a few examples of industries that have really embraced this quickly. They have two or three themes in common. One is an acute security need. If you think about healthcare, financial services, and government, they all have very acute needs to secure their environments. That led them to client virtualization relatively quickly.
We certainly have some very exciting launches coming up in the next couple of months, where we’re really focused on total cost per seat. How do we let people deploy these kinds of solutions and continue to get further economic benefits, delivering better, tighter integration from the desktop to the data center?
Deployment of these solutions keeps getting easier, as do the ease-of-use and manageability tools. They allow IT to roll out large client virtualization deployments with as little touch and as little complexity as we can possibly manage. We’re trying to automate these kinds of solutions, and we’re very excited about some of the things we’ll be delivering to our customers in the next couple of months.
Dana Gardner is president and principal analyst at Interarbor Solutions, which tracks trends, delivers forecasts and interprets the competitive landscape of enterprise applications and software infrastructure markets for clients. He also produces BriefingsDirect sponsored podcasts. Follow Dana Gardner on Twitter. Disclosure: HP sponsored this podcast.