Computer technology in the workplace is all about hardware speed and connection bandwidth. To help IT managers augment their systems in both regards, vendors have been developing tweaking strategies to boost application speed beyond design limits. This process is called “application acceleration.”
However, this race has changed quite a bit over the past few months. Some vendors jumped ship while others changed strategy entirely. By and large, customers are screaming for more simplicity across the board when it comes to application acceleration.
Until now, the mainstay of the application acceleration industry has been the appliance: dedicated hardware that supercharges how an application moves its data. Now, other vendors have concocted ways to boost speeds without adding an appliance. Managed services, consolidated appliances, virtualization and other new approaches to application acceleration attempt to eliminate the need for an appliance altogether.
However, one simple approach to application acceleration is perhaps being overlooked by all but a small handful of innovators: using the Internet itself, that huge labyrinth of a network available to the public, as a viable network for B2B (business-to-business) application acceleration. Concerns over security and reliability in remote locations previously kept adopters away. Those challenges are solvable, however, according to Akamai, which developed its own network for B2B application performance solutions.
“This network is a unique approach to provide a managed service rather than a hardware distribution,” Willie Tejada, vice president of application acceleration for Akamai, told TechNewsWorld.
As a workforce moves from a centralized corporate headquarters to branch outposts, companies try to distribute essential data to the workers. To make this work, corporate IT consolidates its servers to economize. That is when the problem becomes evident: The wide area network gets in the way.
“Distributing data over global distances causes trouble. The data has to contend with limited bandwidth. It is also subjected to latency and packet loss,” Gareth Taube, vice president of worldwide marketing for Certeon, told TechNewsWorld. Certeon provides application acceleration appliances.
The first wave of solutions to this problem focused on network-centric strategies. Companies compressed the packets and prioritized traffic on the network, but those techniques eventually left no more room for optimization, Taube said.
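The network-centric strategy Taube describes can be illustrated with a minimal sketch: compress the payload before it crosses the WAN link. The payload and compression settings below are illustrative assumptions, not any vendor's actual implementation.

```python
# A toy sketch of network-level compression: shrink the bytes before they
# cross the bandwidth-limited WAN link. Works best on repetitive payloads.
import zlib

payload = b"GET /report.html HTTP/1.1\r\nHost: example.com\r\n" * 200
compressed = zlib.compress(payload, level=9)

print(len(payload))      # bytes the application produced
print(len(compressed))   # bytes actually sent over the WAN
print(zlib.decompress(compressed) == payload)  # True: lossless round trip
```

The savings are real but bounded, which is the limitation Taube points to: once the stream is compressed and prioritized, the network layer has nothing left to squeeze.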
Companies then began working at the application level.
“This enables them to do a better job of what the application is trying to do,” he explained.
Application Acceleration Theory
In order for hardware vendors to quicken data flow at the application level, engineers had to approach the appliance as though learning a foreign language. They needed the equivalent of an application-level dictionary, according to Taube. That’s what Certeon and a few other vendors have started doing. They look at an application and become fluent in it.
Each component in Microsoft Office, for example, has an identifiable object associated with it. If the smart appliance can identify an application's object, it can anticipate what the user is going to do with it, for example, that a PowerPoint user is going to edit a file and send it to another branch office.
“But the process only involves having to send back about 10 percent of the edited file. So the appliance can very rapidly accelerate the transmission of just the returned edited portion,” Taube explained.
This is known as “object differencing.” The appliance identifies what it already has and never resends anything that has not changed. As a result, the process saves bandwidth, he said.
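The object differencing idea above can be sketched in a few lines: split the file into chunks, compare chunk hashes against what the far side already holds, and transmit only the chunks that changed. The chunk size and hash choice here are illustrative assumptions, not Certeon's actual implementation.

```python
# A toy sketch of object differencing: only the edited 10 percent of the
# file travels back over the WAN; identical chunks are detected by hash.
import hashlib

CHUNK = 4096

def chunk_hashes(data: bytes) -> list[str]:
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def diff_chunks(old: bytes, new: bytes) -> dict[int, bytes]:
    """Return only the chunks of `new` that differ from `old`."""
    old_h = chunk_hashes(old)
    changed = {}
    for i, h in enumerate(chunk_hashes(new)):
        if i >= len(old_h) or old_h[i] != h:
            changed[i] = new[i * CHUNK:(i + 1) * CHUNK]
    return changed

original = b"A" * 40960               # ten chunks
edited = b"A" * 36864 + b"B" * 4096   # only the last chunk was edited
delta = diff_chunks(original, edited)
print(len(delta))  # 1 -- one chunk out of ten goes back over the link
```

This mirrors the PowerPoint scenario Taube describes: a small edit yields a small delta, so the appliance accelerates the transfer by shrinking it.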
Application intelligence also helps the smart appliance understand payload patterns when the data is encrypted or otherwise distorted, according to Taube. The benefit is rapid identification of the payload, which leads to rapid access not only to the payload itself but to a whole view of the data.
Vendors that use this approach to application acceleration develop a library of application intelligence packed into separate software packages. The vendor does not have to keep deploying new hardware every time its customer adds another application. Instead, the vendor trains a software package in the language of the new application and uploads it to the existing acceleration appliances, Taube said.
“Applications that have fluency or a software blueprint will accelerate the best. All the rest will use standard transmission technology embedded in the appliance. It is the blueprint that adds to the box’s acceleration,” he explained.
Rather than rely on smart software in application boxes, Silver Peak Systems focuses on data-centric acceleration. This technology works on the data itself to make it smaller and quicker to transmit over networks. The benefit, according to the company's engineers, is longer-term flexibility, since the technique works regardless of which applications run over the link.
All too often, the wide area network (WAN) link is the weak link in data protection. Limited bandwidth, high latency, lost packets and out-of-order packets can all jeopardize strategic data replication and backup initiatives. This results in missed recovery point objectives and recovery time objectives (RPO/RTO), according to Jeff Aaron, director of product marketing for Silver Peak Systems.
As data volumes grow, and as the distance between data centers increases to protect business data from catastrophic disasters, there is increasing pressure being placed on the WAN. While WAN optimization is a decade-old process, engineering efforts now pursue tactical rather than strategic methodologies. This involves adjusting the amount of traffic through the WAN rather than tweaking the inherent limitations of the application itself, he explained.
“We found a 90 percent increase in speed through data tweaking. We are seeing a huge move to target operations that are at the mercy of the wide area network,” Aaron told TechNewsWorld.
For this method to work, customers must deploy appliances to both ends of the transmission link. The combination of appliances and network policies creates a seamless integration with the LAN router, he explained.
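The two-ended, data-centric setup can be sketched as a pair of caches, one per appliance: both sides remember the chunks they have seen, so repeated data crosses the WAN only once and is thereafter replaced by short references. The chunking and wire format below are simplifying assumptions, not Silver Peak's actual protocol.

```python
# A toy sketch of deduplicating acceleration with an appliance on each end:
# previously seen chunks cross the WAN as 32-byte hash references.
import hashlib

def send(data: bytes, sender_cache: set, chunk: int = 1024) -> list:
    """Replace chunks the far side already holds with hash references."""
    wire = []
    for i in range(0, len(data), chunk):
        piece = data[i:i + chunk]
        digest = hashlib.sha256(piece).digest()
        if digest in sender_cache:
            wire.append(("ref", digest))   # 32 bytes instead of 1024
        else:
            sender_cache.add(digest)
            wire.append(("raw", piece))
    return wire

def receive(wire: list, receiver_cache: dict) -> bytes:
    """Rebuild the stream from raw chunks and cached references."""
    out = b""
    for kind, value in wire:
        if kind == "raw":
            receiver_cache[hashlib.sha256(value).digest()] = value
            out += value
        else:
            out += receiver_cache[value]
    return out

s_cache, r_cache = set(), {}
first = send(b"backup-block" * 512, s_cache)   # mostly raw chunks
second = send(b"backup-block" * 512, s_cache)  # all chunks are now refs
assert receive(first, r_cache) == b"backup-block" * 512
assert receive(second, r_cache) == b"backup-block" * 512
```

Because replication and backup traffic is highly repetitive, the second and later transfers shrink dramatically, which is why the approach suits the RPO/RTO scenarios Aaron describes.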
Akamai offers its customers a third acceleration choice. Its system of dedicated networks allows customers to put their critical applications over the Internet. This approach gives them a lower transmission cost along with a global reach, according to Tejada.
The cost savings result from not having to deploy hardware boxes or smart software. Instead, customers access one of the nearby global server networks, subscribing to the networks as a service. There is no third-party hardware or software. All that is required is a simple domain name system (DNS) change that points the customer's address at Akamai's network, Tejada explained.
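The DNS change Tejada mentions can be pictured as a simple aliasing step: the customer's hostname is pointed at the provider's edge network, whose resolver then hands back a nearby server. Every hostname and address below is made up for illustration; a real resolver would choose a server by client location and load.

```python
# A toy illustration of DNS-level redirection into an overlay network:
# a CNAME-style alias sends the customer's traffic to an edge server pool.
cname = {"app.customer.example": "edge.provider.example"}
edge_servers = {"edge.provider.example": ["198.51.100.7", "198.51.100.8"]}

def resolve(host: str) -> str:
    # Follow the alias, then pick a server from the edge pool.
    alias = cname.get(host, host)
    return edge_servers[alias][0]

print(resolve("app.customer.example"))  # 198.51.100.7 -- traffic enters the overlay
```

Once the name resolves to an edge server, the provider's overlay, rather than the default Internet route, carries the connection the rest of the way.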
“Our system intercepts the Internet connection and sends the data transparently through Akamai’s servers that optimize the data through multiple conversion points,” Tejada said. “This is more than a sub conduit. It is more of an overlay on the Internet. Our system can pick the best route from numerous alternatives over the open Internet.”
The challenge in making this system work, Tejada explained, is figuring out how to tweak the Internet itself, since its architecture was never designed for performance.
“It does a good job for redundancy. The Internet is a best-effort architecture,” he said.