A Primer on Virtualization

The term "virtualization" is actually much older than the technology running it. Today, it stands for one of several different methods and purposes of separating software from hardware. For instance, the concept is applied to three IT areas: network, storage and server virtualization.

By Jack M. Germain
03/26/09 4:00 AM PT

Virtualization is an essential component for any IT outfit looking to run an efficient, manageable, waste-not-want-not shop. It's not just a tool for large call centers with big server farms. Even small- to medium-sized businesses with growing computer demands can benefit from virtualization.

Why? Virtualization reduces costs by lowering hardware needs and repetitive maintenance tasks. Often, scheduled maintenance on multiple servers and patching dozens of workstation computers requires IT staff to work after normal business hours when the hardware can be taken offline.

Even better, virtualization does not require an army of elite, top-shelf IT experts and a large cache of spare corporate funds to deploy. Sure, there is some initial outlay of funds, and it must be operated by people who know what they're doing. Still, virtualization can be implemented with out-of-the-box simplicity and a smidgen of hand-holding from a software provider's tech support crew.

"A misconception that many of our customers bring to us is that virtualization is new technology. It is actually very old. It started back in the mainframe days of the 1950s. Virtualization from its core is actually segregating space and separating processing power in order to run separate work loads," Lew Smith, practice manager of virtualization solutions for Interphase Systems, told TechNewsWorld.

Virtualization 101

The actual term was born in the 1960s to refer to a pseudo, or virtual, machine. It was coined on experimental IBM mainframe computers to describe computing processes that ran in a simulated environment rather than directly on the metal and wires that comprised the computer. The term "virtualization" no longer refers to exactly that, but it still rings true to its original definition.

Today, the term "virtualization" is used a bit more generically. It covers several different methods of, and purposes for, creating a virtualized environment. For instance, the concept is applied to three IT areas: network, storage and server virtualization. While these three categories may appear drastically different, they are fairly similar.

Virtualization got a foothold in the data center. From there, IT departments brought the concept to enterprise networks. Along the way, they applied related strategies to virtualizing the applications delivered to desktop computers rather than the servers themselves.

"They separated the hardware and the software to eliminate hardware or appliance sprawl. Then this spread to the branch locations," Gareth Taube, vice president of marketing for Certeon, told TechNewsWorld.

Virtual Differences

If you keep in mind that the ultimate purpose behind virtualization is to conserve resources, the three categories can be viewed as merely different approaches to achieving that same goal.

Network virtualization combines the available resources in an entire network by separating the available bandwidth into independently configured channels. Each channel remains independent from the others and can be assigned to a particular server or device in real time. The end result is that the virtualization process tricks one network into behaving as several separately manageable parts. Think of it as the networking equivalent of partitioning a large hard drive into several smaller drives, each with its own drive letter and content.
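
To make that carving-up a little more concrete, here is a rough Python sketch of the idea -- purely conceptual, with a made-up link speed and channel names rather than any real networking API:

    # Conceptual sketch only: one 10-Gbps link carved into independently
    # sized channels, each handed to a different server or device.
    link_capacity_mbps = 10_000

    channels = {
        "web-servers": 4_000,   # Mbps reserved for the Web tier
        "database": 3_500,      # Mbps reserved for the database server
        "backup": 2_500,        # Mbps reserved for nightly backups
    }

    # The channels together may not exceed what the physical link provides.
    assert sum(channels.values()) <= link_capacity_mbps

    for name, bandwidth in channels.items():
        print(f"channel '{name}': {bandwidth} Mbps, managed independently")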

Storage virtualization pools the physical storage from numerous devices into what appears to be one storage device managed from a central console. This approach is typically found in storage area networks (SANs).
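
Again as a purely conceptual Python sketch -- the device names and sizes are invented -- the pooling idea looks something like this:

    # Conceptual sketch only: three physical arrays presented as one logical pool.
    physical_devices_gb = {"array-a": 12_000, "array-b": 8_000, "array-c": 4_000}

    pool_capacity_gb = sum(physical_devices_gb.values())
    print(f"one virtual volume of {pool_capacity_gb} GB, "
          f"backed by {len(physical_devices_gb)} physical devices")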

Servers, Servers Everywhere

Server virtualization is an older term that is often used interchangeably with "platform virtualization." It masks server resources -- things like operating systems, processors and physical identity -- from the actual server users. This masking spares the user from having to understand and manage the complicated details of those resources.

At the same time, the process increases resource sharing and utilization, allowing for expanded capacity. A control program, often called the "host" software, creates a simulated computer environment, or virtual machine, in which the "guest" software -- typically an operating system -- runs.
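
To see what that control layer looks like in practice, here is a minimal sketch using the libvirt Python bindings to ask a hypervisor which virtual machines it is hosting. It assumes a local QEMU/KVM host; other hypervisors use different connection URIs.

    import libvirt  # the libvirt Python bindings must be installed

    # Connect to the control layer of a local QEMU/KVM hypervisor.
    # The "qemu:///system" URI is an assumption for this example.
    conn = libvirt.open("qemu:///system")

    for dom in conn.listAllDomains():
        state, max_mem_kib, mem_kib, vcpus, _cpu_time = dom.info()
        status = "running" if dom.isActive() else "stopped"
        print(f"{dom.name()}: {status}, {vcpus} vCPU(s), {max_mem_kib // 1024} MB")

    conn.close()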

Technologies differ for virtualizing data center servers versus enterprise networks, noted Hemma Prafullchandra, chief security architect at HyTrust. So matching needs to existing hardware is one of the first things to consider in selecting virtualization software.

"The virtual machine platform does not always provide application integration. The industry is still building tools to maximize use with virtualization platforms," Prafullchandra told TechNewsWorld.

Software Basics

The process of setting up a virtualized environment is not as intimidating as it may sound. Basically, it is nothing more than installing and configuring a piece of software right out of the box.

"For the basics, the hypervisor is the core. This is the software that runs on the hardware and bridges the gap between the hardware and everything you're going to run from an OS (operating system) and virtual machine perspective. It is a layer that sits on the 'metal' and brokers all the communications in and out and handles driver concerns," Smith said.

The beauty of the process is that users can run any OS on top of the hypervisor. It gives better throughput, performance and portability, he noted.

"From the server consolidation perspective, I can take any OS I'm running and dump it onto any other piece of hardware that has that hypervisor running on it," Smith explained.

More Is Better

One advantage of setting up virtualization is that users are not locked into one proprietary package. As long as the selected hypervisor runs on the hardware, the data created in one virtualized environment can be converted for use with another hypervisor product.
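
One common way to move a guest between products is to convert its virtual disk image. The sketch below shells out to the qemu-img tool from Python to turn a VMware-format disk into the qcow2 format used by QEMU/KVM; the file names are placeholders, and qemu-img must be installed on the machine doing the conversion.

    import subprocess

    # Convert a VMware VMDK disk image into qcow2; file names are placeholders.
    subprocess.run(
        ["qemu-img", "convert", "-f", "vmdk", "-O", "qcow2",
         "guest-disk.vmdk", "guest-disk.qcow2"],
        check=True,
    )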

Compatibility matrices do exist, though, so you must be sure that your hardware meets the requirements of a particular hypervisor, according to Smith.

The only precaution is in licensing, regardless of which hypervisor product is used. However, as long as legitimately obtained software is in hand, no legal entanglements exist.

"You need the license to run the virtualization layer on the hardware. You can convert the operating system from physical hardware to the virtual machine with the existing OS license. But you must remove the OS from the original hardware and remove the license key," said Smith.

Some Choices

Many of these products run on both Windows and Unix/Linux. Documentation and tech support forums are available. The choices range from open source products with fully functional free versions through paid commercial versions. Keep in mind, however, that some full-fledged proprietary products may limit the ability to convert data to other products later on.

VMware is perhaps one of the best-known virtualization software makers on the market. VMware also offers virtual appliances, which are prebuilt virtual machines available for download, sometimes for free. VMware products are generally compatible with the Windows and Linux platforms. There is also a version that runs on Mac OS X.

Xen is a lightweight open source hypervisor that runs on Intel or AMD x86 and x86-64 processors, with or without hardware virtualization extensions.

Microsoft Virtual Server and Virtual PC are relatively new entrants into this software space. If you run only Windows desktops and servers, you may not need to look any further for virtualization software.

Parallels is one of the most widely used options for Mac computers. It was among the first to create commercial virtualization products that could run non-Apple OSes on Mac hosts. Parallels also runs on Windows and Linux hosts.

Other free or open source choices include QEMU and FreeVPS.

VirtualBox is a general-purpose full virtualizer for x86 server, desktop and embedded hardware.

