Disney Research has developed an algorithm that renders 3D textures and tactile features on a touchscreen, the company announced earlier this week, enabling users to “feel” the on-screen images they touch and see.
The technology works on still images as well as streaming videos, and it achieves its effect by altering the friction the user’s fingers encounter as they glide across the screen’s surface. In this way, it creates the perception of 3D objects and textures on a touch surface without having to physically move the surface.
“Our brain perceives the 3D bump on a surface mostly from information that it receives via skin stretching,” explained Ivan Poupyrev, one of the Disney researchers. “Therefore, if we can artificially stretch skin on a finger as it slides on the touchscreen, the brain will be fooled into thinking an actual physical bump is on a touchscreen even though the touch surface is completely smooth.”
Poupyrev and Ali Israr, the research lead on the project, demonstrated the technology at the ACM Symposium on User Interface Software and Technology this week in Scotland.
Disney Research did not respond to our request for further details.
More About The Technology
Disney’s algorithm simulates rich 3D geometric features such as bumps, ridges, edges, protrusions and texture on touchscreen surfaces.
It is based on the hypothesis that friction-sensitive mechanoreceptors in the skin of a human finger sense minute surface variations when the finger is dragged across an object. Modifying the friction forces between the fingertip and the object’s surface would therefore create the illusion of variations in the surface.
“Basically, the friction on the screen is changed depending on what is being presented,” Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld.
The perception of a 3D bump is created when local gradients of the virtual bump are mapped to lateral friction forces. In layman’s terms, people feel a 3D bump when the local slopes of a virtual shape are translated into friction forces on the touchscreen.
‘Effects on the Fly’
More precisely, the algorithm calculates the gradient of the virtual surface to be rendered and determines the dot product of the gradient of the virtual surface and the velocity of the sliding finger.
It then maps the dot product to a voltage using the psychophysical relationship between the voltage applied to the display and the subjective strength of the friction forces.
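The pipeline described above can be sketched in a few lines of Python. This is only an illustration of the idea, not Disney’s implementation: the function names, the gain `k`, and the friction-to-voltage curve are assumptions (the real curve would be fit from psychophysical user studies).

```python
import numpy as np

def friction_command(surface_height, finger_pos, finger_vel, k=1.0):
    """Illustrative rendering step: local surface gradient dotted
    with finger velocity gives the target lateral friction force.

    surface_height: 2D array of virtual surface heights (the bump map)
    finger_pos: (row, col) index of the finger on the screen
    finger_vel: (v_row, v_col) finger velocity in pixels/s
    """
    # Gradient of the virtual surface to be rendered
    g_row, g_col = np.gradient(surface_height)
    r, c = finger_pos
    grad = np.array([g_row[r, c], g_col[r, c]])

    # Dot product of surface gradient and sliding velocity:
    # moving "uphill" on the virtual bump raises friction,
    # moving "downhill" lowers it.
    return k * float(np.dot(grad, np.asarray(finger_vel, dtype=float)))

def voltage_for_friction(f, v_max=120.0, f_max=1.0):
    """Hypothetical psychophysical mapping from desired friction
    to electrode voltage. Electrovibration force grows roughly
    with the square of voltage, so we invert with a square root;
    the actual relationship is measured empirically."""
    f = min(max(f, 0.0), f_max)  # clamp to the displayable range
    return v_max * (f / f_max) ** 0.5
```

Run per touch sample: compute the friction target from the current bump map and finger velocity, then drive the display at the corresponding voltage. Because only a local gradient and a dot product are needed per sample, the approach is cheap enough to run in real time, which is what lets it follow dynamic visual content rather than replaying canned effects.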
“The traditional approach to tactile feedback is to have a library of canned effects that are played back whenever a particular interaction occurs,” Israr said. “This makes it difficult to create a tactile feedback for dynamic visual content, where the sizes and orientation of features constantly change.
“With our algorithm we do not have one or two effects, but a set of controls that make it possible to tune tactile effects to a specific visual artifact on the fly,” he added.
The algorithm is lightweight and can easily be implemented in real time, the researchers said.
The tactile technology could be used in education, medical applications and certain applications for designing products, Jim McGregor, principal analyst at Tirias Research, told TechNewsWorld.
“Probably the most compelling applications,” he added, “will be consumer applications and immersive gaming.”