Researchers at Carnegie Mellon University have unveiled a new haptic technology that enables people to manipulate three-dimensional objects on a computer while experiencing realistic touch sensations.
The technology, currently best suited for gaming and virtual training simulations, was developed by research professor Ralph Hollis and his team at The Robotics Institute at Carnegie Mellon University in Pittsburgh.
The Magnetic Levitation Haptic Interface is a bowl-shaped device, called a “flotor,” with a control stick attached and six coils inside that interact with magnets underneath. Current flowing through the coils generates forces against the magnets that levitate the flotor, and the control stick lets users interact directly with the computer graphics. Levitation eliminates the mechanical motors and linkages that normally drive haptic devices, giving users a cleaner tactile sensation.
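Hollis’ description implies a closed control loop: sense where the flotor is, then drive coil currents so the resulting Lorentz forces hold or push it as needed. A minimal one-axis sketch of such a loop, with purely illustrative function names and gains (this is not Butterfly Haptics’ actual control code):

```python
def pd_coil_current(position_mm, velocity_mm_s, target_mm=0.0,
                    kp=2.0, kd=0.05):
    """Proportional-derivative control for one levitation axis.

    Force on the flotor scales roughly with coil current (Lorentz
    force), so commanding a current from position error is enough to
    hold the flotor at a target point against gravity and user input.
    The gains kp and kd are made-up illustrative values.
    """
    error = target_mm - position_mm
    return kp * error - kd * velocity_mm_s
```

In the real device, six coils act together to control all six degrees of freedom; this single-axis sketch only illustrates the idea.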
“The elimination of mechanics entirely puts us in a different performance regime,” Hollis told TechNewsWorld. “The goal is to make the haptic experience approach transparency. One of the principal goals of haptic research is to make the equipment go away so the user experiences the virtual environment and simulation directly.”
The magnetic levitation allows the device to provide force feedback for a wide range of surfaces. The device, Hollis said, could be used in virtual dentistry training simulations, letting dentists feel the difference between hard tooth enamel and soft tissue as they work through their lessons.
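One common way to render that enamel-versus-tissue contrast is penalty-based haptic rendering: the deeper the virtual tool sinks into a surface, the harder the device pushes back, with a per-material stiffness constant. A hedged sketch follows; the stiffness numbers are invented for illustration, and the article does not say which rendering method Hollis’ device uses:

```python
def contact_force(penetration_mm, stiffness_n_per_mm):
    """Penalty method: restoring force proportional to penetration depth."""
    return max(0.0, penetration_mm) * stiffness_n_per_mm

# Illustrative stiffness values, not measured material properties.
ENAMEL = 50.0   # N/mm: feels rigid
TISSUE = 2.0    # N/mm: feels soft

# The same half-millimeter penetration produces very different feedback:
print(contact_force(0.5, ENAMEL))  # 25.0 N
print(contact_force(0.5, TISSUE))  # 1.0 N
```

The contrast between the two returned forces is what the user perceives as “hard” versus “soft” through the control stick.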
While Hollis’ group continues to refine its prototype, companies such as San Jose-based Immersion have worked to enhance existing products.
Computer gaming, with its force-feedback controllers, has been one of the most popular markets; however, the company’s fastest-growing segment is medical simulation training software. Immersion is also working with the automotive industry, creating in-car touch screens that enhance navigation and lighting controls, and with mobile carriers that need to display large amounts of information in a small area.
No matter what the industry, users have the same issue with digital displays: They need some reassurance that they’ve hit a button.
“The problem with going to touch screen is you’ve given up the knowledge that you have pressed a button, so you aren’t sure that you’ve hit anything,” Mike Levin, the vice president of IP strategy at Immersion, told TechNewsWorld. “Adding haptic feedback can improve the satisfaction for these devices.”
From Practice to Practical
For Hollis and his team at Carnegie Mellon, improving the tactile response in a virtual simulation solves the problem Levin described.
The technology’s only downside, Hollis said, is that the control stick and workspace are relatively small, making it impractical for simulations and virtual experiences that require people to move more than a hand.
Much as a mouse translates small hand movements into larger cursor motion on screen, he said, people who use the device tend to lose themselves in the graphics and tactile feedback without noticing the limits of the workspace.
“The principal downside to this technology is the limited range of motion,” Hollis said. “You can move in about a 20 millimeter sphere. However, you tend to focus on the graphics and you don’t really notice that you are moving in a very small area with your hands. You become immersed in the graphics.”
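A workspace of roughly a 20 mm sphere implies the software must keep the mapped tool position inside that boundary (or scale hand motion into it, mouse-style). A simple illustrative clamp, assuming a sphere centered at the origin; nothing here is taken from the actual device software:

```python
import math

RADIUS_MM = 10.0  # half of the ~20 mm sphere Hollis describes

def clamp_to_sphere(x, y, z, radius=RADIUS_MM):
    """Project a position that leaves the workspace back onto its surface."""
    r = math.sqrt(x * x + y * y + z * z)
    if r <= radius:
        return (x, y, z)        # already inside: unchanged
    s = radius / r              # shrink factor back to the boundary
    return (x * s, y * s, z * s)

print(clamp_to_sphere(3.0, 4.0, 0.0))    # inside: (3.0, 4.0, 0.0)
print(clamp_to_sphere(30.0, 40.0, 0.0))  # outside: (6.0, 8.0, 0.0)
```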
The interface is still in prototype development, but Hollis plans to take the technology commercial later this summer through his company, Butterfly Haptics, though he hasn’t yet announced how the technology will work in a commercial environment or who potential clients might be.