Signs of the times follow patterns. In the fat years of the 1990s, we heard a lot about supersize drinks — and ballooning IT budgets. In the present belt-tightening environment, the FDA is forcing food makers to disclose harmful “trans-fat” content — and the SEC is forcing onetime high-fliers to open their books to scrutiny.
Likewise, whereas pigging out on enterprise infrastructure once was de rigueur, a drop in capital spending has heralded the arrival of new methods of building out networks in small increments, rather than gorging on unneeded technology.
Take storage networking, still a relatively healthy area of IT spending. Increasingly budget-conscious CIOs are focusing on such cost-cutting measures as utility computing, on-demand processing and virtualization. However, buzzwords alone cannot save an IT budget. Building a smart storage infrastructure requires not only careful planning, but also, increasingly, smart software. What are the steps CIOs should consider when implementing a SAN?
Focus on Process
In the simplest sense, storage area networking can improve data use by bringing live and archival storage to areas of an enterprise where it is most needed. Nonetheless, building a SAN is a complex undertaking, and ROI will not materialize if the SAN is not designed properly from the start. According to storage experts, most companies err in their SAN development by beginning projects with no clear idea of what they are trying to build.
“It’s a process issue,” said Phil Goodwin, senior program director for infrastructure strategies at Meta Group. In other words, most companies fail to evaluate existing resources properly before spending money on new technology or services.
Goodwin told the E-Commerce Times that Meta’s research shows most small networks use very little of their existing storage capacity — on the order of 30 to 40 percent at most. The unused capacity is referred to as “orphaned” or “missing” storage. So, the first step in properly building out a storage network is to get a handle on what one actually has. This task can be accomplished with a variety of planning tools, such as Computer Associates’ BrightStor, Tivoli’s Storage Resource Manager, Veritas’ SiteStor, Sun Microsystems’ StorEdge Resource Management Suite and EMC’s StorageScope.
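In rough terms, the audit those tools perform boils down to simple arithmetic across every array in the shop. The sketch below uses hypothetical array names and sizes purely for illustration — real figures would come from a reporting tool such as the ones named above:

```python
# Back-of-the-envelope utilization audit. Array names and capacities are
# hypothetical; a real audit would pull these from a reporting tool.
arrays = {
    "oltp-array": {"raw_tb": 5.0, "used_tb": 1.8},
    "mail-array": {"raw_tb": 2.0, "used_tb": 0.7},
    "file-array": {"raw_tb": 3.0, "used_tb": 0.9},
}

total_raw = sum(a["raw_tb"] for a in arrays.values())
total_used = sum(a["used_tb"] for a in arrays.values())
utilization = total_used / total_raw        # fraction of capacity in use
orphaned_tb = total_raw - total_used        # capacity paid for but idle

print(f"Utilization: {utilization:.0%}")        # 34%
print(f"Orphaned capacity: {orphaned_tb:.1f} TB")
```

A shop in this position has more than six terabytes of already-purchased capacity to reclaim before any new hardware is justified — exactly the 30-to-40-percent picture Meta describes.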
Small Is Beautiful
The second line of defense, after evaluating what one has, is properly planning the buying process. “The most important thing is to bring in competition in the bidding process,” Gartner vice president Bob Passmore told the E-Commerce Times. This step is vital because in the realm of very large deals, prices can vary substantially, even for individual SAN components, such as Fibre Channel switches sold by Brocade and Cisco Systems.
In fact, Passmore said the price of a simple fabric switch from one of Brocade’s system integrators can vary by as much as $1,000 per port, which can add up to a significant amount when a company is considering buying, for example, a 64-port switch.
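How much that per-port spread matters becomes obvious with a little multiplication. The bid figures below are invented for illustration — only the $1,000-per-port gap and the 64-port size come from Passmore’s example:

```python
# How a $1,000-per-port spread compounds on one switch purchase.
# The per-port bid prices are hypothetical, not actual vendor quotes.
ports = 64
low_bid_per_port = 1_500    # hypothetical competitive bid
high_bid_per_port = 2_500   # hypothetical uncontested bid

spread = (high_bid_per_port - low_bid_per_port) * ports
print(f"Difference on one {ports}-port switch: ${spread:,}")  # $64,000
```

On a single 64-port switch, taking the first quote rather than the best one can cost as much as a small departmental array.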
Of course, if a company is trying to assemble smaller, more efficient systems, it makes sense to bypass pricey hardware altogether. As with most computer technology, steady commodification of storage hardware is occurring across the industry. Items that cost hundreds of thousands of dollars a few years ago cost only tens of thousands today, which may make it easier to build a SAN at an affordable price.
In addition, vendors are selling more “modular” hardware systems, according to Brad Nisbet, a senior analyst with research firm IDC. While EMC continues to market brawny storage boxes like the Symmetrix and DMX at average prices of $260,000 and $380,000, respectively, its lower-end CLARiiON system sells for only $90,000.
“It’s really the economic environment that’s forced vendors to build these smaller systems,” Nisbet told the E-Commerce Times, noting that many customers are investing only as needed to accommodate specific projects. Even though SEC regulations requiring record keeping have helped spur demand, in a period of sluggish GDP growth, most companies are slow to fund new projects that involve any additional computing resources.
The Old Grass Mower
However, cheap hardware is no panacea. Obsolescence is a factor, and obsolescence costs money. Gartner’s Passmore said that because the per-megabyte value of most hardware is about half that of a new system after the first few years of ownership, hardware essentially can be seen as having a useful life of less than three years. A new system will have faster drives and other components, making long-term upkeep of systems less attractive. For that reason, he noted, “you’ve really got to think about what your scalability needs might be in three years’ time” — and not necessarily expect a current purchase to meet those needs.
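Passmore’s depreciation argument can be put in numbers. The sketch below assumes, purely for illustration, that the per-megabyte value of equivalent new capacity halves every three years — the rough rate he describes:

```python
# A rough sketch of the depreciation math: if the per-megabyte value of
# equivalent new capacity halves about every three years, hardware bought
# today is worth half as much per megabyte at its three-year mark.
initial_value_per_mb = 0.10      # hypothetical starting value, dollars/MB
halving_period_years = 3.0       # assumed halving rate, per Passmore

def value_per_mb(years: float) -> float:
    """Per-megabyte value of equivalent capacity after `years`."""
    return initial_value_per_mb * 0.5 ** (years / halving_period_years)

for y in (0, 3, 6):
    print(f"Year {y}: ${value_per_mb(y):.3f}/MB")
```

By year three the installed system delivers half the per-megabyte value of a new purchase, which is why Passmore treats three years as the practical ceiling on useful life.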
Of course, some vendors see this issue differently. According to EMC, the usable lifetime of hardware can be extended in a couple of ways as a SAN grows.
“Today’s storage becomes lower-level storage later,” EMC (NYSE: EMC) director of technical analysis Ken Steinhardt told the E-Commerce Times. For example, he said, EMC customers running retail operations have taken storage systems that once were attached to online transaction processing, which involves live orders, and repurposed them to serve as a data warehouse, which usually has less stringent performance requirements.
In addition to repurposing, Steinhardt said, an old hardware chassis can be upgraded with new drives. “We have a pretty good record of adapting to newer drives on existing frames,” he noted, citing the company’s ability to upgrade existing tape systems to inexpensive ATA hard disk drives. This upgrade can be done within the same cabinet, protecting older investments in hardware.
It’s the Software, Stupid
Small or large, any investment in hardware is part of a bigger picture concerning flexibility of an entire storage system. According to Meta Group’s Goodwin, the importance of any storage element is tied to how it allows storage to be placed wherever it is needed. “Direct-attached storage is the least efficient approach,” he said, because it requires a company to build out capacity at every server to which users connect. In contrast, storage networking benefits from more modular components: Storage assets can be accessed from any point throughout an office, maximizing use of any given disk array or backup system.
For example, Goodwin pointed out that with storage area networks in place, a five-terabyte storage array can be successively carved up as new departmental apps come online, rather than requiring investment in new storage each time a given workgroup needs to do something new.
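The carving Goodwin describes amounts to tracking allocations against one shared pool instead of buying per-department. A minimal sketch, with hypothetical department names and sizes:

```python
# Carving a shared 5 TB array into departmental allocations, as new
# applications come online. Departments and sizes are hypothetical.
array_capacity_tb = 5.0
allocations: dict[str, float] = {}   # department -> TB carved out

def provision(dept: str, size_tb: float) -> None:
    """Carve out capacity for a department, refusing to overcommit."""
    committed = sum(allocations.values())
    if committed + size_tb > array_capacity_tb:
        raise ValueError(f"only {array_capacity_tb - committed:.1f} TB free")
    allocations[dept] = allocations.get(dept, 0.0) + size_tb

provision("finance", 1.5)
provision("marketing", 1.0)
provision("engineering", 2.0)
free_tb = array_capacity_tb - sum(allocations.values())
print(allocations, f"-- {free_tb:.1f} TB still free")
```

Three departments are served from one purchase, and the remaining half-terabyte stays available for the next project rather than sitting orphaned behind a single server.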
Software has an even more significant role to play when it comes to the latest offerings for virtualized storage of one kind or another. All of the major players are now offering some form of pay-as-you-go storage. For example, using smart software, EMC’s OpenScale product remotely unlocks the disk capacity that ships with the box as the customer pays for extra capacity.
Software is a key element in virtualization. Although there are many different definitions of the “V” word, the basic idea is that data can be stored wherever there is available storage capacity without impacting employees’ ability to access the data they need. All of these software-driven projects can help a company maximize use of what it has, rather than overspending on unneeded capacity.
Cost alone suggests software will continue to be an important approach to finessing storage deployment. Analysts say SAN costs break down as 40 percent hardware, 40 percent labor and 20 percent software — which means that buying effective software may be the cheapest way to get the most out of a SAN. In today’s IT environment, optimizing use of existing resources is one of the best ways to keep growing without supersizing the budget — and achieving this goal may be a determining factor in a business’ survival.
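Applied to a hypothetical $1 million SAN budget, the analysts’ breakdown makes the point concrete:

```python
# The 40/40/20 cost breakdown analysts cite, applied to a hypothetical
# $1M SAN budget. The budget figure is illustrative only.
total_budget = 1_000_000
breakdown = {"hardware": 0.40, "labor": 0.40, "software": 0.20}

for component, share in breakdown.items():
    print(f"{component:>8}: ${total_budget * share:,.0f}")
```

Software is the smallest slice of the bill, yet — as the virtualization and capacity-reclamation examples above suggest — it is the slice with the most leverage over how hard the other 80 percent works.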