Changes are afoot in many enterprises as more and more of them switch from traditional voice connections to Voice over IP (VoIP) services. Market research firm Gartner found that three out of every four PBX lines sold in 2005 were VoIP links; that number is expected to grow to more than four out of every five this year.
The new network option is gaining ground primarily because it tends to be cheaper than traditional lines. In addition, these systems are able to support new converged services, such as instant messaging and desktop multimedia conferencing.
While VoIP is proving to be enticing, it does have limitations. One issue is IP network reliability. “Call quality has been a major issue ever since VoIP networks were introduced,” noted Matthias Machowinski, an analyst with Infonetics Research.
The reason for that lies in IP network design. In traditional networks, voice connections are given a dedicated line so there are no interruptions. In IP networks, information moves from place to place based on which links are available. A call may have an open link at one moment, but then a large file transfer can usurp much of the available bandwidth.
With data applications, the impact of such fluctuations is slight because lost information can be retransmitted and reassembled on the receiving end. Voice connections are less forgiving: parts of a conversation may simply be lost. They are also sensitive to delay, which is common on IP links, because packets that arrive too late to be played out are discarded by the receiver.
Primary Means of Communication
These issues concern IT departments because phone calls remain executives’ primary means of communication, despite increasing reliance on e-mail, and voice call quality seems to be getting worse. “As more users move to VoIP networks, IP network limitations become clearer,” said Jeff Snyder, a research vice president at Gartner.
Solving such problems begins with determining how to measure voice call quality. Because there are so many network variables, data network quality has never relied on a single metric to gauge performance. In the telephony world, however, a single number is usually given to rate call quality. Traditionally, call quality testing has been subjective: a user picks up a telephone, listens to a call, and then rates its quality. The leading measurement standard has been the Mean Opinion Score (MOS) developed by the International Telecommunication Union (ITU).
Through the years, the ITU has tried to streamline the gathering of call quality data. The organization’s P.563 standard can be used to automate call quality monitoring without a reference signal. The group also developed the Perceptual Speech Quality Measure (PSQM) standard, P.861, which sends a reference signal through a network and then rates quality by comparing the received signal with the original using perceptual algorithms.
MOS scores range from 1 for unacceptable to 5 for excellent; suppliers typically try to generate scores that are in the 3.5 to 4.2 range, numbers IP networks often do not attain.
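One common way monitoring tools arrive at an automated MOS estimate is the ITU-T G.107 E-model, which first computes a transmission rating factor R from measured impairments and then maps R to a MOS value. As a minimal sketch, the standard R-to-MOS mapping (the surrounding measurement of R is omitted here) can be written as:

```python
def r_to_mos(r: float) -> float:
    """Map an E-model R-factor to an estimated MOS (ITU-T G.107 mapping).

    R ranges roughly from 0 (worst) to 100 (best); the resulting MOS is
    clamped to the 1.0-4.5 range the mapping produces.
    """
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    # Cubic mapping defined in G.107
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

# A toll-quality narrowband call with default impairments (R near 93)
# lands in the "good" band suppliers aim for:
print(round(r_to_mos(93.2), 2))  # → 4.41
```

In practice, the measured R-factor is reduced by delay, packet loss, and codec impairments, which is why congested IP networks often land below the 3.5 mark.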
The problems arising from declining call quality are just now becoming evident to many organizations as VoIP deployment becomes more widespread. “Companies had been working with small VoIP deployments, but many now are rolling these services out across the enterprise,” Infonetics Research’s Machowinski told TechNewsWorld.
Once they recognize the problems, they need tools to fix them. PBX products come with monitoring tools, but those tools are not geared toward call quality issues. “PBX management systems are designed to tell customers whether or not the equipment is functioning, not what is happening on the network,” said Gartner’s Snyder.
Vendors such as Apparent Networks, Brix Networks, Empirix, Integrated Research, Psytechnics, Qovia, Spirent and Telchemy have focused on filling this void. Their products include active and passive monitoring tools. The active products install thin clients on various endpoints — such as phones, gateways and call servers — and generate local readings on what is happening there.
This intrusive method allows tightly targeted testing of specific network links and elements because the injected test traffic provides a known baseline for comparison. Its weaknesses are that the simulated traffic consumes network bandwidth and that testing becomes difficult once calls cross from the enterprise into a carrier’s network.
The passive approach includes stationing agents at different points along the network and examining information as it travels back and forth. While this approach requires less bandwidth, it does not provide as much information about how individual components are functioning.
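A building block that passive monitors commonly compute from observed RTP streams is interarrival jitter, defined in RFC 3550 as a running, smoothed average of how much packet spacing varies in transit. A minimal sketch of that calculation (function and variable names are illustrative):

```python
def interarrival_jitter(send_times, recv_times):
    """RFC 3550 interarrival jitter estimate, in the same units as the inputs.

    send_times / recv_times are per-packet timestamps; any constant clock
    offset between sender and receiver cancels out in the difference D.
    """
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times, recv_times):
        transit = received - sent            # one-way transit + clock offset
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # |D(i-1, i)| per RFC 3550
            jitter += (d - jitter) / 16.0    # exponential smoothing (gain 1/16)
        prev_transit = transit
    return jitter

# Evenly spaced arrivals mean no jitter:
print(interarrival_jitter([0, 20, 40, 60], [5, 25, 45, 65]))  # → 0.0
```

A rising jitter value tells the monitor that packet spacing is fluctuating, which forces receivers to buffer longer or discard late packets, both of which degrade perceived call quality.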
Early in the Adoption Process
At the moment, few companies rely on these tools. “Currently, most companies are not aware that VoIP monitoring tools are available or what benefits they offer,” Snyder told TechNewsWorld. More vendor education is needed to address this issue.
Another issue is that voice quality testing and monitoring tools are expensive: prices start around $10,000 and can quickly run into six figures.
In most cases, users have been unwilling to make such investments, but that outlook is expected to change in the coming months. “Typically, management functions are the last item that is put in place as a technology matures, and that formula is holding true with VoIP,” concluded Machowinski.