Enterprises have armed themselves with a bevy of management and monitoring tools. These products calculate bandwidth utilization, gauge system response time, and identify system and network bottlenecks. Yet amid all the information gleaned, for many firms a simple question often goes unanswered: How is our IT infrastructure performing?
A group of vendors, dubbed the Apdex Alliance, has developed a new metric to answer that very question. Performance monitoring is the process of ensuring that a company’s computing infrastructure adequately supports its applications. For instance, once a user logs onto a Web site, an application should deliver the appropriate content within a certain time.
Because enterprise infrastructures are quite complex, more than 100 vendors have developed products that rely on differing measurements to make such determinations.
Simplifying the Process
“Performance numbers presented to management should be easy to understand, but they should also be firmly grounded in a careful analysis of what is actually important to the organization,” said Eric Siegel, senior analyst at the Burton Group, a market research firm.
Many of the products available to date have not adequately correlated the reams of information collected with data that is important to an organization, so the Apdex Alliance has tried to simplify the reporting process. “There are so many metrics collected on performance that, in many instances, it gets to be a case of information overload,” said Peter Sevcik, president of market research firm NetForecast and executive director of the Apdex Alliance.
Rather than offer a wide-ranging collection of items, the Application Performance Index, or Apdex, provides IT administrators with a single metric and a simple evaluation method — a score between 0 and 1 — that ideally corresponds to the information that management desires.
A Complement, Not a Replacement
Apdex is meant to be an addition to, rather than a replacement of, a firm’s current monitoring tools. After collecting performance information, vendors — or users, in some cases — funnel the data into their reporting systems and generate a new report that features a single value between 0 and 1 that represents performance.
“The idea is to provide a CIO and other managers with a quick and easy way to see which applications need improvement and which are meeting corporate objectives,” NetForecast’s Sevcik told TechNewsWorld.
To accomplish this, the group divided its ratings into three sections. A “satisfied” rating means the user is productive and the system is meeting its target goal. A “tolerating” rating means the user notices that performance is lagging behind the goal but will still complete the process. A “frustrated” grade means that performance is unacceptable, so users may abandon the process.
These satisfaction measurements are set by each company; for instance, a firm may determine that online transactions for its order processing system need to be completed in five seconds. The company then establishes the values for the “tolerating” and “frustrated” ratings. During a set period — it could be an hour, a day, a week, or a month — the corporation collects performance information and generates an Apdex report.
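The three rating bands above can be sketched as a simple classifier. The published Apdex specification derives the “frustrated” cutoff as four times the target T, though, as noted, each firm sets its own values; the 5-second target and the function below are purely illustrative.

```python
def classify(response_time, t=5.0, f=None):
    """Classify one response time (seconds) into an Apdex rating band.

    t: the "satisfied" target; f: the "frustrated" threshold.
    The Apdex spec's default for f is 4 * t, used here when f is None.
    """
    if f is None:
        f = 4 * t
    if response_time <= t:
        return "satisfied"
    if response_time <= f:
        return "tolerating"
    return "frustrated"
```

With a 5-second target, a 3-second transaction rates “satisfied,” a 10-second one “tolerating,” and a 30-second one “frustrated.”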
Setting Appropriate Parameters
If all of the firm’s transaction responses were completed in less than five seconds, the Apdex score would be 1. If they all took one minute, the rating would be 0. In almost all cases, the score falls somewhere in between.
The company needs to determine what number it wants to target — say 0.85. If an application meets the objective, no changes are needed. If not, minor or major ones could be discussed and implemented.
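Assuming the published Apdex formula, in which each “tolerating” sample counts as half a “satisfied” one and “frustrated” samples count as zero, the score and a comparison against a 0.85 objective might be sketched as follows; the response times are hypothetical.

```python
def apdex_score(samples, t=5.0):
    """Compute an Apdex score for a list of response times (seconds).

    Per the Apdex formula: (satisfied + tolerating / 2) / total,
    where satisfied is <= t and tolerating is <= 4 * t.
    """
    satisfied = sum(1 for s in samples if s <= t)
    tolerating = sum(1 for s in samples if t < s <= 4 * t)
    return (satisfied + tolerating / 2) / len(samples)

# Hypothetical measurements collected over a reporting period.
times = [1.2, 3.8, 4.9, 7.5, 12.0, 26.0]
score = apdex_score(times)
verdict = "meets objective" if score >= 0.85 else "needs improvement"
print(f"Apdex: {score:.2f} ({verdict})")
```

Here three samples are satisfied, two are tolerating, and one is frustrated, yielding (3 + 1) / 6, which is about 0.67 and falls short of the 0.85 objective.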
The Apdex Alliance — which has more than a dozen members, including Akamai, Compuware, Expand and NetQoS — began working on its specification in May 2005 and completed it last fall. The group has also put a compliance test in place so users can be certain that specific products generate consistent performance metrics. Symphoniq is the first vendor to have its product certified, and the alliance expects a dozen vendors to complete the process by the end of the year.
Voice and Video on the Docket
The progress does not mean that the group’s work is complete. Apdex is designed for data applications, and companies run other types of applications on their networks. Voice applications rely on a series of items to measure performance, but more consistency and simplicity are needed. Performance monitoring tools are just beginning to emerge in the video market, so there is nothing like Apdex in that area.
While the new measuring stick is needed, it is not a panacea. “One challenge that companies face is figuring out what it is they want to measure,” notes J. Jeffrey Nudler, senior analyst at market research firm Enterprise Management Associates. Corporations have to determine which applications are the most important, what response times are reasonable, and what the impact of poor response time is on the firm.
To do that, the IT department and business units have to understand each other’s operations thoroughly — something that is more a wish than a reality in most firms.
Another challenge is the constantly changing nature of corporate applications. The number of users working with an application, the amount of time they spend accessing it, and the way they interact with it are dynamic. Performance monitoring tools provide only a static snapshot of how an application is operating at a particular moment, so a firm’s Apdex metrics often need to be updated.
Despite the limitations, the alliance’s work is expected to gain traction. “While users need to be aware that there is no quick and easy silver bullet when it comes to performance monitoring, the Apdex Alliance’s work is something that many corporations should find helpful,” concludes EMA’s Nudler.