Most enterprise networks are a patchwork, built by bringing in equipment as needed over the years to fight the fire of the day, with little emphasis on strategy or the anticipation of future requirements. That’s why it’s necessary to reevaluate network architectures in light of newer and evolving IT demands and overall moves to next-generation data centers.
Nowadays, we see that network requirements have shifted, and are still shifting, as IT departments adopt improvements such as virtualization, Software as a Service (SaaS), cloud computing, and service-oriented architecture (SOA).
The network loads and demands continue to shift under the weight of Web-facing applications and services, security and regulatory compliance, governance, ever-greater data sets, and global-area service distribution and related performance management.
It doesn’t make sense to embark upon a data-center transformation journey without a strong emphasis on network transformation as well. Indeed, the two ought to be brought together, converging to an increasing degree over time.
I recently interviewed three thought leaders at HP on network transformation to help explain the evolving role of network transformation and to rationalize the strategic approach to planning and specifying present and future enterprise networks. They are Lin Nease, director of emerging technologies, HP ProCurve; John Bennett, worldwide director, data center transformation solutions; and Mike Thessen, practice principal, network infrastructure solutions practice in the HP Network Solutions Group.
Listen to the podcast (42:09 minutes).
Here are some excerpts:
John Bennett: Data-center transformation is really about helping customers build out a next-generation data center, an adaptive infrastructure, that is designed not only to meet the current business needs, but to lay the foundation for the plans and strategies of the organization going forward.
In many cases, the IT infrastructure, including the facilities, the servers, the network and storage environments can actually be a hindrance to investing more in business services and having the agility and flexibility that people want to have, and will need to have, in increasingly competitive environments.
When we talk about that, very typically we talk a lot about facilities, servers, and storage. For many people, the networking environment is ubiquitous. It’s there. But what we discover when we lift the covers is that you have an environment that may be taking lots of resources to manage and keep up-to-date. …
The networking infrastructure becomes key, as an integration fabric, not just between users and business services, but also between the infrastructure devices in the data center itself.
That’s why we need to look at network transformation to make sure that the networking environment itself is aligned to the strategies of the data center, that the data center infrastructure is architected to support those goals, and that you transform what you have and what you have grown historically over decades into what hopefully will be a “lean, mean, fighting machine.”
Lin Nease: The network has basically evolved as a result of the emergence of the Internet and all forms of communications that share the network as a system. The server side of the network, where applications are hosted, is only one dimension that tugs at the network design in terms of requirements.
You find that the needs of any particular corner of the enterprise network can easily be lost on the network, because the network, as a whole, is designed for multiple constituencies, and those constituencies have created a lot of situations and requirements that are in themselves special cases.
In the data center, in particular, we’ve seen the emergence of a formalized virtualization layer now coming about and many, many server connections that are no longer physical. The history of networking says that I can take advantage of the fact that I have this concept of a link or a port that is one-to-one with a particular service.
That is no longer the case. What we’re seeing with virtualization is challenging the current design of the network. That is one of the requirements tugging at, or provoking, a change in overall enterprise network design. …
Too often people are compelled by a technology approach to rethink how they are doing networking. IT professionals will hear the overtures of various vendors saying, “This is the next greatest technology. It will maybe enable you to do all sorts of new things.” Then people waste a lot of time focusing on the technology enablement, without actually starting with what the heck they’re trying to enable in the first place.
Mike Thessen: In years past, you were effectively just providing local area network (LAN) and wide area network (WAN) connectivity. Servers were on the network, and they got facilities from the network to transport their data over to the users.
Now, everything is becoming converged over this network — “everything” being data, storage, and telephony. So, it’s requiring more towers inside corporate IT to come together to truly understand how this system is going to work together.
Nease: [Service orientation] is the only way out. With the new complexity that has emerged, and the fact that traditional designs can no longer rely on physical barriers to implement policies, we have reached a point where we need an architecture for the network that builds in explicit concepts of policy decisions and policy enforcement.
The only way out is to regard the network itself as a service that provides connectivity between stations — call them logical servers, call them users, or call them applications. In fact, that very layering alone has forced us to think through the concept of offering the network as a service. …
Bennett: In parallel with that, we see an increasing drive and demand for virtualizing storage, both to have it used more efficiently and effectively inside the data center environment and to service and support the virtualized business services running on virtualized servers. That, in turn, carries into the networking fabric of making sure that you can manage the network connections on the fly.
Virtualization is not only becoming pervasive, but clearly the networking fabric itself is going to be key to delivering high-quality business services in that environment. …
Thessen: Networks need to be prepared for the convergence of the communication paths for data and storage connectivity inside the data center. That’s the whole converged enhanced Ethernet and Fibre Channel over Ethernet (FCoE) story. That’s the newest leg of the virtualization aspect of the data center.
Bennett: Fundamentally, convergence is about better integration across the technology stacks that help deliver business services. We’re saying that we no longer need connections that are separate and dedicated: one set between servers for high availability, another to the storage devices for both high-volume and high-frequency access to data for the business services, and another between the network devices themselves for the topology of the networking environment.
Rather, we are saying that today we can have one environment capable of supporting all of these needs, architected properly for a particular customer’s needs, and we can bring into that environment the formerly separate communications infrastructure for voice.
So, we’re really establishing, in effect, a common nervous system. Think about the data center and the organization as the human body. We’re really building up the nervous system, connecting everything in the body effectively, both for high-volume needs and for high-frequency access needs. …
Thessen: The most important thing is really still the brutal standardization — network modularity, logical separation, utilizing those virtualization techniques that I talked about a few minutes ago, and very well-defined communications flows for those applications.
Additionally, you need those well-defined communication flows, especially in SaaS, cloud-computing, or convergence environments, to truly secure those environments appropriately. Without understanding who is talking to whom, how applications communicate, and how applications get access to other IT services, such as directory services and so forth, it’s really difficult to secure them appropriately. …
What we focus on is really developing a good strategy first. Then, we define the requirements that go along with business strategy, perform analysis work against the current situation and the future state requirements, and then develop the solutions specific for the client’s particular situation, utilizing perhaps a mix of products and technologies.
Dana Gardner is president and principal analyst at Interarbor Solutions, which tracks trends, delivers forecasts and interprets the competitive landscape of enterprise applications and software infrastructure markets for clients. He also produces BriefingsDirect sponsored podcasts. Follow Dana Gardner on Twitter. Disclosure: HP sponsored this podcast.