Network Appliance, Industry Touting Virtualization

Network Appliance’s announcement of its latest V-Series storage and management systems highlights the further spread of a trend that nearly all major hardware and software vendors tout in some form or another: virtualization.

Network Appliance said its NetApp V-series family of data storage and management systems would use “groundbreaking, dynamic virtualization” to deliver cost savings to customers with a smarter, more fluid approach to storage.

Industry analysts noted that while virtualization — presenting many different systems with the look and feel of a single system, for efficiency and easier management — dates back to the 1970s, it is now being used more widely to link various components of the IT infrastructure, including storage, database and application servers.

“It’s not really new,” IDC Vice President Dan Kusnetzky told TechNewsWorld. “What’s new is it’s being brought to inexpensive, commodity systems rather than being a mainframe technology. It’s usually constructed of multiple systems, storage devices and resources of every type carefully woven together as if it were only one system. There are various approaches that are all attempting to leverage this illusion.”
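Kusnetzky's description — multiple systems and storage devices "carefully woven together as if it were only one system" — can be illustrated with a small sketch. The classes, device names and capacities below are invented for illustration and are not any vendor's API; the point is only the facade: callers see one pool, while placement across devices stays hidden.

```python
# Toy illustration of storage virtualization: several independent
# "devices" presented to callers as a single logical volume.
class Device:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.used_gb = 0

    def free_gb(self):
        return self.capacity_gb - self.used_gb


class VirtualVolume:
    """Facade: one pool of capacity; which device holds the data is hidden."""

    def __init__(self, devices):
        self.devices = devices

    def total_free_gb(self):
        # The "illusion": one number for the whole system.
        return sum(d.free_gb() for d in self.devices)

    def allocate(self, size_gb):
        # Place the request on whichever device has the most free space.
        target = max(self.devices, key=Device.free_gb)
        if target.free_gb() < size_gb:
            raise RuntimeError("pool exhausted")
        target.used_gb += size_gb
        return target.name  # which device actually received the data


pool = VirtualVolume([Device("array-a", 500), Device("array-b", 300)])
placed_on = pool.allocate(200)   # goes to the emptier device: "array-a"
print(pool.total_free_gb())      # prints 600
```

The caller never names a device; the management layer decides placement — the same separation the article's "single, powerful, flexible management paradigm" refers to.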

Unique Unification

In an announcement this week, Network Appliance said its new virtualization capabilities would enable unified management in NetApp Data OnTap 7G software for a variety of storage products from HDS, HP, IBM and Sun. The company said its V-Series storage systems would allow companies to better utilize their assets by simplifying management through virtualization.

“V-Series systems facilitate a single, powerful, flexible management paradigm that replaces the cumbersome management procedures of conventional arrays,” the company said.

Network Appliance Vice President of Product and Partners Patrick Rogers said in a statement that customers should demand the following from their storage vendor: dynamic virtualization of block and file data in the same device with consistent management; true multi-vendor or heterogeneous virtualization; simplified provisioning and maximized storage utilization; data management software for sharing, consolidating, protecting and recovering information; a network-based architecture enabling integration with existing storage; and the ability to scale horizontally.

Confluence, Customer Needs

IDC’s Kusnetzky — who noted that all of the major vendors have virtualization plays that travel under as many as 14 other buzzwords, such as service-oriented architecture (SOA) and utility computing — said the different references to virtualization do refer to something similar.

“Every major hardware and software vendor is talking about the overall architecture, and in some cases they have working software or hardware that actually creates a virtualized environment,” he said.

Calling the current period a “time of transition,” Kusnetzky said the move to pool computing resources and data into a more manageable, single view has been underway for several decades.

“What’s new is the confluence of several trends interacting together,” he said.

He indicated that companies are further motivated to virtualize their IT systems to better meet newer regulatory requirements and added that there is a demand for more interoperability, echoing the Network Appliance reference to “multi-vendor virtualization.”

Organic IT Grows

Forrester analyst Bob Zimmerman told TechNewsWorld that while “the whole industry” uses the word virtual, “Nobody has a clue what it means.” For Forrester, the term falls under the category of “organic IT,” Zimmerman said.

He said that in terms of storage, there were three types of virtualization: at the server level, through software or an appliance, or in the fabric of the network.

“The idea is to put enough intelligence somewhere in the storage system purview that it can dynamically allocate resources based on application priority and where an application happens to execute,” he said.
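The allocation idea Zimmerman describes — intelligence somewhere in the storage system that divides resources by application priority — can be sketched as a simple weighted split. The application names, priorities and bandwidth figures below are illustrative assumptions, not from any product.

```python
# Toy sketch of priority-based resource allocation: scarce storage
# bandwidth is split across applications in proportion to priority.
def allocate_bandwidth(apps, total_mbps):
    """apps is a list of (name, priority) pairs; returns name -> Mbps share."""
    weight = sum(priority for _, priority in apps)
    return {name: total_mbps * priority / weight for name, priority in apps}


apps = [("billing-db", 5), ("web-frontend", 3), ("batch-reports", 2)]
shares = allocate_bandwidth(apps, total_mbps=1000)
# The highest-priority application gets the largest slice:
print(shares)  # {'billing-db': 500.0, 'web-frontend': 300.0, 'batch-reports': 200.0}
```

A real fabric- or appliance-level implementation would also react to where each application is executing and re-divide shares dynamically; this static split only shows the priority arithmetic.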

Zimmerman said it is still “exploratory time” for virtualization technology, but added that IBM has led with its volume controller while aggressiveness from Network Appliance appears to be earning the company “a good-sized niche.”

1 Comment

  • I provided the (below) definition to some folks a couple years ago.
    I think IT fits Virtualization, as does Cluster/Grid/… Beowulf/SETI/… all the way back to the 1960s. Is IT all now marketer-buzzed under "Virtualization"? If so, then I must remember that when talking with PowerPoint Engineers, Hypothetical Scientists, Vapor Bosses, … at work, or they’ll think I AM an idiot.
    Definition: Operating System (Global/Virtual):
    > The main control software for a computer and/or on a networked device, which schedules and responds to all tasks,
    > Allocates and/or manages all resources (hardware, communications, Input/Output Devices, Ports/Sockets, …) and/or virtual machines/environments,
    > Acts as the core (kernel) interface for software development programmers, Networks and Systems Administration folks, and the occasional expert users,
    > Allows the user population access to all tractable and amicable applications and services on the network,
    > Provides all Intranet protocols and/or requirements for LAN Connectivity (should not be proprietary, but frequently ….),
    > Delivers open standards, protocols, and software compliance with Internet and/or WAN connectivity for any/all distributed and collaborative user communities (must never be proprietary, but … some would own the …).
    Note: By this definition, the standards, protocols, and (at least) the communications applications used on the World Wide Web (WWW) [AKA: Internet] support the platform-independent architecture (for users on a virtual interoperability network) — an operating system that supports online users, institutions, governments, and business enterprise applications, such as banking, shopping, VTC, VoIP, telemedicine, Disaster Relief Operations, … "The Internet as OS-Global".
    OpenContent/GPL by J.D.Bailey
    On 2003/07/23@0721EST
    The definition I used implies we are headed (totally accidentally) toward the common platform. I believe many in the open source/standards community see this future. The way humanity gets to the future is sometimes a circuitous and confusing path, but possible destinations are discernible from the rubble of our troubles (past and present).
    This does not mean one software OSD and/or one hardware OEM; however, it does mean that the business model that built a Microsoft, IBM, GM, … may be a legacy business model of the 20th Century.
