A Primer on Virtualization

Virtualization is an essential component for any IT outfit looking to run an efficient, manageable, waste-not-want-not shop. It’s not just a tool for large call centers with big server farms. Even small- to medium-sized businesses with growing computer demands can benefit from virtualization.

Why? Virtualization reduces costs by lowering hardware needs and repetitive maintenance tasks. Often, scheduled maintenance on multiple servers and patching dozens of workstation computers requires IT staff to work after normal business hours, when the hardware can be taken offline.

Even better, virtualization does not require an army of elite, top-shelf IT experts and a large cache of spare corporate funds to deploy. Sure, there is some initial outlay of funds, and it must be operated by people who know what they’re doing. Still, virtualization can be implemented with out-of-the-box simplicity and a smidgen of hand-holding from a software provider’s tech support crew.

“A misconception that many of our customers bring to us is that virtualization is new technology. It is actually very old. It started back in the mainframe days of the 1950s. Virtualization at its core is actually segregating space and separating processing power in order to run separate workloads,” Lew Smith, practice manager of virtualization solutions for Interphase Systems, told TechNewsWorld.

Virtualization 101

The actual term was born in the 1960s to refer to a pseudo or virtual machine. It was coined on experimental IBM mainframe computers to describe computing processes that ran in an induced virtual environment rather than directly on the metal and wires that comprised the computer. Unlike back then, “virtualization” no longer refers to exactly the same thing, but the term still rings true to its original definition.

Today, the term “virtualization” is used a bit more generically. It stands for one of several different methods of, and purposes for, creating a virtualized environment. For instance, the concept is applied to three IT areas: network, storage and server virtualization. While these three categories may appear to be drastically different, they are fairly similar.

Virtualization got a foothold in the data center. From there, IT departments brought the concept to enterprise networks. Along the way, they applied related strategies to virtualizing the applications delivered to computers rather than the servers themselves.

“They separated the hardware and the software to eliminate hardware or appliance sprawl. Then this spread to the branch locations,” Gareth Taube, vice president of marketing for Certeon, told TechNewsWorld.

Virtual Differences

If you keep in mind that the ultimate purpose behind virtualization is to conserve resources, the three categories can be viewed as merely different approaches to achieving that same goal.

Network virtualization combines the available resources in an entire network by separating the available bandwidth into differently configured channels. Each one remains independent from the others and can be assigned to a particular server or device in real time. The end result is that the virtualization process tricks one network into behaving as several separately manageable parts. Think of partitioning a large hard drive into several smaller drives, each with its own identifying letter and content.
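
To make the idea concrete, here is a minimal sketch in Python (a toy model, not any vendor’s API) of one physical link being carved into independent, individually assignable bandwidth channels:

```python
# Toy model (not a real networking API) of network virtualization:
# one physical link carved into independent bandwidth channels that
# can each be assigned to a particular server or device.

class PhysicalLink:
    def __init__(self, capacity_mbps: int):
        self.capacity_mbps = capacity_mbps
        self.channels = {}  # channel name -> allocated Mbps

    def carve_channel(self, name: str, mbps: int) -> None:
        """Reserve an independent slice of the link's bandwidth."""
        unallocated = self.capacity_mbps - sum(self.channels.values())
        if mbps > unallocated:
            raise ValueError("not enough unallocated bandwidth")
        self.channels[name] = mbps

link = PhysicalLink(capacity_mbps=1000)  # one 1-Gbps physical link
link.carve_channel("web-servers", 400)   # each channel behaves like
link.carve_channel("backup-san", 300)    # its own smaller network
print(link.channels)
```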

Storage virtualization pools the physical storage from numerous devices into what appears to be one storage device managed from a central console. This approach is typically found in storage area networks (SANs).
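
The pooling works something like the following toy sketch (hypothetical classes, not a real SAN console API): the individual devices disappear behind a single logical device.

```python
# Toy model (not a real SAN API) of storage virtualization: several
# physical devices pooled behind what looks like a single storage
# device managed from one place.

class StoragePool:
    def __init__(self):
        self.devices = {}  # device name -> capacity in GB

    def add_device(self, name: str, size_gb: int) -> None:
        self.devices[name] = size_gb

    def total_gb(self) -> int:
        # Consumers see one large logical device, not the members.
        return sum(self.devices.values())

pool = StoragePool()
pool.add_device("disk-array-1", 2000)
pool.add_device("disk-array-2", 1500)
print(f"one logical device: {pool.total_gb()} GB")
```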

Servers, Servers Everywhere

Server virtualization was an early term often used interchangeably with “platform virtualization.” It masks server resources — things like operating systems, processors and physical identity — from the actual server users. This masking spares users from having to understand and manage the complicated details of server resources.

At the same time, the process increases resource sharing and utilization, allowing for expanded capacity. A control program running on the “host” creates a simulated computer environment, or virtual machine, for each “guest” operating system.
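
This host/guest relationship is easy to see in practice. As a sketch, assuming the open source libvirt Python bindings and a QEMU/KVM hypervisor running on the local host, the control program can be asked directly for its guests:

```python
# Sketch using the open source libvirt-python bindings; assumes a
# QEMU/KVM hypervisor is installed and running on the local host.
import libvirt

conn = libvirt.open("qemu:///system")  # connect to the control program
for dom in conn.listAllDomains():      # each domain is one guest VM
    state = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {state}")
conn.close()
```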

Technologies differ for virtualizing data center servers versus enterprise networks, noted Hemma Prafullchandra, chief security architect at HyTrust. So matching needs to existing hardware is one of the first things to consider in selecting virtualization software.

“The virtual machine platform does not always provide application integration. The industry is still building tools to maximize use with virtualization platforms,” Prafullchandra told TechNewsWorld.

Software Basics

The process of setting up a virtualized environment is not as intimidating as it may sound. Basically, it is nothing more than installing and configuring a piece of software right out of the box.

“For the basics, the hypervisor is the core. This is the software that runs on the hardware and bridges the gap between the hardware and everything you’re going to run from an OS (operating system) and virtual machine perspective. It is a layer that sits on the ‘metal’ and brokers all the communications in and out and handles driver concerns,” Smith said.

The beauty of the process is that users can run any OS on top of the hypervisor. It gives better throughput, performance and portability, he noted.

“From the server consolidation perspective, I can take any OS I’m running and dump it onto any other piece of hardware that has that hypervisor running on it,” Smith explained.
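
To illustrate the portability Smith describes, here is a hedged sketch, again assuming libvirt-python and a guest disk image (hypothetical path) already copied to the new machine: the same VM definition is simply registered with whatever host runs the hypervisor.

```python
# Sketch of the consolidation idea, assuming libvirt-python and a
# guest disk image (hypothetical path) already copied to this host:
# the same VM definition is registered and booted on new hardware.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>consolidated-guest</name>
  <memory unit='MiB'>1024</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <source file='/var/lib/libvirt/images/guest.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")
dom = conn.defineXML(DOMAIN_XML)  # register the guest on this host
dom.create()                      # boot the same OS on the new hardware
conn.close()
```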

More Is Better

One advantage to setting up virtualization is that users are not locked into one proprietary package. As long as the selected hypervisor runs on the hardware system, data created by the virtualized applications can be converted to another hypervisor product.
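
Disk-image conversion tools make this practical. As one example, the open source qemu-img utility can translate a VMware-format disk into QEMU’s native format; the sketch below assumes qemu-img is installed, and the file names are hypothetical.

```python
# Sketch invoking the open source qemu-img tool (must be installed)
# to convert a VMware-format disk image into QEMU's qcow2 format;
# the image file names here are hypothetical.
import subprocess

subprocess.run(
    ["qemu-img", "convert",
     "-f", "vmdk",          # source format: VMware virtual disk
     "-O", "qcow2",         # output format: QEMU copy-on-write image
     "guest-disk.vmdk",     # hypothetical input image
     "guest-disk.qcow2"],   # converted output image
    check=True,
)
```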

Compatibility matrices do exist, though, so you must be sure that your hardware meets the requirements of a particular hypervisor, according to Smith.

The only precaution involves licensing, regardless of which hypervisor product is used. As long as legitimately obtained software is in hand, however, no legal entanglements exist.

“You need the license to run the virtualization layer on the hardware. You can convert the operating system from physical hardware to the virtual machine with the existing OS license. But you must remove the OS from the original hardware and remove the license key,” said Smith.

Some Choices

Many of these products run on both Windows and Unix/Linux. Documentation and tech support forums are available. The choices range from open source products with fully functional free versions through paid commercial versions. Keep in mind, however, that some full-fledged proprietary products may limit the ability to convert data to other products later on.

VMware is perhaps one of the best-known virtualization software makers in the market. VMware also offers virtual appliances, which are virtual machines available for download, sometimes for free. VMware products are generally compatible with the Windows and Linux platforms. There is also a version that runs on Mac OS X.

Xen is a lightweight open source hypervisor that runs on Intel or AMD x86 and 64-bit processors, with or without hardware virtualization extensions.

Microsoft Virtual Server and Virtual PC are relatively new entrants into this software space. If you run only Windows desktops and servers, you may not need to look any further for virtualization software.

Parallels is one of the most widely used options for Mac computers. It was among the first to create commercial virtualization products that could run non-Apple OSes on Mac hosts. Parallels also runs on Windows and Linux hosts.

Other free or open source choices include QEMU and FreeVPS.

VirtualBox is a general-purpose full virtualizer for x86 server, desktop and embedded hardware.
