Mainstream computer users soon will see a lot of new developments — thanks to the vision of haptics technology pioneers. Haptics technology, mainly developed to assist the blind when using computers, is bringing the sensation of touch to the Internet and the desktop, and enabling computer users at remote locations to “grasp” objects. Simply put, haptics is leading to a new stage in the computing experience, one that will move the traditional computer interface beyond simple sights and sounds.
Research and development in this field is ongoing at major universities across the United States, funded by the U.S. government’s National Science Foundation, and leading to projects that promise commercial potential in just a few years. For example, researchers at the State University of New York at Buffalo have been working on technology to transmit the sensation of touch over the Internet.
“We have added an important dimension to communication: touch sensations,” said Thenkurussi Kesavadas, director of the virtual reality laboratory at SUNY Buffalo. Kesavadas and his team have created technology that could enable users, whether blind or sighted, to remotely learn certain skills, such as sculpture or surgery.
“As far as we know, our technology is the only way a person can communicate with another person the sense of touch he feels when he does something,” said Kesavadas.
Researchers at SUNY Buffalo have completed experiments in which they were able to transmit, from one person to another over the Internet, the sensation of touching a hard or soft object. Users also have been able to discern the contour of particular shapes, such as circles or squares.
The scientists at SUNY Buffalo have called their technology “sympathetic haptics” because it enables one to “feel what another person feels,” said Kesavadas. The technology works by using a virtual reality data glove to capture the softness or hardness of a particular object and then communicate that information to another person, instantly.
The system conveys the exertion of force to the person receiving the data transmission. Transmitted data also can be captured and replayed later, according to Kesavadas.
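The article does not describe how the SUNY Buffalo system is implemented, but the capture-transmit-replay loop it outlines can be sketched in a few lines. The sample rate, message format, and every name below are illustrative assumptions, not details from the project.

```python
import json

def capture_samples(sensor_readings):
    """Timestamp raw force readings from a data glove (here, a plain list).

    Hypothetical 1 kHz sampling: haptic rendering needs millisecond-scale
    updates, so each reading gets a 1 ms-spaced timestamp.
    """
    dt = 0.001
    return [{"t": i * dt, "force_n": f} for i, f in enumerate(sensor_readings)]

def encode(samples):
    """Serialize the sample stream for transmission over the network."""
    return json.dumps(samples).encode("utf-8")

def decode(payload):
    """Recover the sample stream on the receiving side."""
    return json.loads(payload.decode("utf-8"))

def replay(samples, render):
    """Feed recorded samples to a force renderer in timestamp order,
    supporting the capture-and-replay-later use the article mentions."""
    for s in sorted(samples, key=lambda s: s["t"]):
        render(s["force_n"])

# Example: capture three force readings, "send" them, replay on the far side.
recorded = capture_samples([0.2, 0.5, 0.9])
received = decode(encode(recorded))
played = []
replay(received, played.append)
```

The point of the sketch is only the shape of the pipeline: the sender timestamps and serializes force data; the receiver can render it live or store the stream and replay it later.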
In addition to research projects, other applications are being developed in the haptics field. For example, medical researchers at Rutgers University in New Jersey last year filed a patent application for a new, PC-based virtual reality system that provides stroke patients with virtual hands.
Researchers working on the project hope the technology will help patients recover more quickly than they could with conventional physical therapy. The technology relies on two sensor-equipped gloves. Patients use the gloves to play with various on-screen graphics, such as virtual piano keyboards.
Researchers found a 140 percent improvement in range of motion of patients’ hands after therapy with the virtual keyboards. The research and experiments eventually could give rise to the possibility of an entirely new industry: remote rehabilitation.
“We’ve found that virtual reality alone can be used to improve the condition of chronic stroke patients,” said Grigore C. Burdea, director of the human-machine interface laboratory at Rutgers. “Patients completely immerse themselves in rehab and actually look forward to treatment; as a consequence, the results are fast and dramatic.”
There is also the possibility that haptics technology can increase the public’s appreciation of some art works. At the University of Southern California, researchers have developed technology that will let individuals “feel” what a sculpture feels like at an art exhibit — providing a service that many art patrons have wished for but been forbidden to experience by persnickety curators.
Curators fear, of course, that art objects will be degraded by repeated touching, and, as anyone who has seen a worn handrail or doorknob can attest, that fear is well founded. To allow museum visitors to “touch” art, USC researchers have created a haptics glove with a network of exoskeletal electronic tendons.
Connected to a PC by a USB cable, the glove lets users experience the shape and contour of a statue. The researchers already have conducted field experiments at the Fisher Gallery, an art museum on the USC campus.
The ultimate goal of the project is to allow art patrons to experience the scale and weight of an object without ever having to touch the real thing. That’s something that could make the Rodin Museum in Paris, for example, more accessible to the average person.
Researchers say that the force control algorithms used for rendering touch sensations require very high sampling rates and very low latency, usually just a few milliseconds. This means that to make the technology possible at all over the Internet, it likely will be necessary to distribute databases containing the force control algorithms so that local copies of the information are cached at various nodes of the network.
Doing so would enable faster access to the dynamic and static properties of the objects being manipulated — from shape and size to position and velocity. But adoption on that scale will require a major push on the consumer or corporate front — a much more powerful push than the relatively small market for force-feedback joysticks can muster.
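The caching idea above can be made concrete with a small sketch: the millisecond-rate force loop reads object properties (such as stiffness) from a node-local cache, and only a cache miss incurs a slow network fetch. The class, the penalty-based spring model, and all names here are assumptions for illustration, not details from any of the projects described.

```python
class HapticObjectCache:
    """Node-local cache of object properties, so the high-rate force loop
    never waits on the network after the first lookup."""

    def __init__(self, fetch_remote):
        self._fetch_remote = fetch_remote  # slow remote-database lookup
        self._local = {}                   # local copy cached at this node

    def properties(self, obj_id):
        if obj_id not in self._local:      # miss: one slow fetch, then cached
            self._local[obj_id] = self._fetch_remote(obj_id)
        return self._local[obj_id]

def spring_force(stiffness_n_per_m, penetration_m):
    """Simple penalty-based force rendering: F = k * x when the user's
    cursor penetrates the virtual surface, zero otherwise."""
    return stiffness_n_per_m * max(0.0, penetration_m)

# Example: a remote store holding one object; repeated reads stay local.
remote_calls = []
def fetch(obj_id):
    remote_calls.append(obj_id)            # count round-trips to the store
    return {"stiffness_n_per_m": 800.0}

cache = HapticObjectCache(fetch)
props = cache.properties("statue-42")
force = spring_force(props["stiffness_n_per_m"], 0.002)  # 2 mm penetration
cache.properties("statue-42")              # second read hits the cache
```

The design choice the sketch illustrates is the one the researchers point to: the force computation itself must run locally at kilohertz rates, so the network is only consulted to populate the cache, never inside the rendering loop.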
In coming years, however, researchers will be taking haptics to the next level, more tightly integrating the technology with related feedback types like sound and voice. Whether or not the technology will catch on remains to be seen. As of today, the limited amount of haptics technology entering the marketplace — from force-feedback mice to virtual-reality gloves — has met with only limited success.
Still, that limited success could quickly change into widespread adoption if haptics researchers can move their ideas into the marketplace.