
IBM: Computers Are Going to Start Making a Lot More Sense

IBM based its 2012 “5 in 5” list — five innovations that will change lives within five years — on technologies tied to the five human senses. The predictions include advances in cognitive systems that unlock the door to touch, sight, hearing, taste and smell.

Hendrik Hamann

Physical Analytics Research Manager Hendrik Hamann examines an array of wireless sensors used to detect environmental conditions such as temperature, humidity, gases and chemicals. In five years, technology advancements could enable sensors to analyze odors or the molecules in a person’s breath to help diagnose diseases. (Credit: Jon Simon/Feature Photo Service for IBM)

The list IBM makes each year doesn’t reflect what its engineers, scientists and developers want under the tree, but rather where they see technology making some of its biggest advances in the near term.

The picks reflect “a lot of things going on in the laboratory for the year, or in some cases several years,” IBM CTO of Telecom Research Paul Bloom told TechNewsWorld.

There are teams “doing a lot of work around video and multimedia analytics,” he said, that helped shape the list this year.

Cognitive Systems

Technologies for computing the five senses have their basis in cognitive systems that allow computers to learn, adapt, sense and form conclusions with less human input.

“This year we’re focusing on the next era of computing — cognitive computing,” said Bloom. “The reason we focus on the senses is because the senses stimulate action in the human. We’re not looking to replace the human with a machine. These are adjunctions to a brain-assist system.”

Developments occurring in computing today are allowing technologies using cognitive systems to progress at a faster rate.

“We see advances in supercomputing, in nanotechnology and in neuroscience that are coming together that allow us to really understand how the brain works,” Bloom said.

C’mon, Touch Me

Advances in haptic, infrared and pressure-sensitive technologies allow devices such as smartphone screens to replicate the feel of an object. Vibrations can emulate silky fabric, rough Velcro or other surfaces, for example.
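To give a rough sense of the mapping involved, here is a minimal sketch that translates simple texture properties into vibration parameters. The function name, parameters and numbers are illustrative assumptions, not IBM’s actual implementation.

```python
# Illustrative sketch only: map a rough texture descriptor to haptic
# vibration parameters. Names and numbers are assumptions, not IBM's method.

def vibration_pattern(roughness: float, stiffness: float) -> dict:
    """Translate simple texture properties (0.0-1.0) into a vibration profile.

    Rougher surfaces get lower-frequency, stronger pulses; smoother,
    less stiff surfaces get faster, gentler ones.
    """
    frequency_hz = 50 + (1.0 - roughness) * 250   # smooth silk buzzes fast
    amplitude = 0.2 + roughness * 0.8             # rough Velcro thumps hard
    pulse_ms = 5 + int(stiffness * 20)            # stiffer = longer pulses
    return {"frequency_hz": frequency_hz, "amplitude": amplitude, "pulse_ms": pulse_ms}

print(vibration_pattern(roughness=0.9, stiffness=0.6))  # e.g. Velcro
print(vibration_pattern(roughness=0.1, stiffness=0.2))  # e.g. silk
```

A real haptic engine would drive an actuator with far richer waveforms; the point is simply that surface properties can be encoded as vibration frequency, amplitude and pulse length.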

“Think about how e-commerce will change if one has the ability to reach out and touch something,” said Bloom.

IBM sees these technologies being applied not only in e-commerce, enabling consumers to feel the texture of a garment, but also in the medical field, allowing a doctor to assess a wound remotely.

The Eyes Have It

While touch tech will let users literally get a feel for virtual objects, advances in sight technologies will give computers superhuman vision. Computers see images, photographs and video as pixels, and they rely on tags to determine what is in an image. IBM and other companies are developing systems that let computers analyze images pixel by pixel.
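To make the tags-versus-pixels distinction concrete, here is a minimal sketch that derives features directly from pixel values rather than trusting human-supplied metadata. The feature choices are assumptions for illustration only, not how IBM’s systems work.

```python
# Minimal sketch: derive features from raw pixel values instead of relying
# on a human-supplied tag. Feature names are illustrative assumptions.

def describe_image(pixels: list[list[int]]) -> dict:
    """Compute crude pixel-level features for a grayscale image (0-255)."""
    flat = [p for row in pixels for p in row]
    brightness = sum(flat) / len(flat)
    # Edge density: fraction of horizontally adjacent pixels that differ sharply.
    edges = sum(
        1
        for row in pixels
        for a, b in zip(row, row[1:])
        if abs(a - b) > 40
    )
    comparisons = sum(len(row) - 1 for row in pixels)
    return {"mean_brightness": brightness, "edge_density": edges / comparisons}

# A tag says only what someone claimed the image shows; pixel-level features
# are computed from the content itself.
image = [[12, 15, 200, 210], [10, 14, 205, 215], [11, 13, 198, 220]]
print(describe_image(image))
```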

In medical applications, computers will be able to analyze images from MRIs, CT scans, X-rays and ultrasounds to identify tumors, for example. This technology could allow computers to detect problems that humans aren’t able to discern in standard images.

“A system will be able to understand and read an X-ray and be able to see things that a human can’t,” Bloom said. “A system like this could be more accurate than the capabilities of a human being.”

In addition to medical applications, the technology could find uses in creating smarter cities.

“Every city is starting to deploy cameras for crowd control and traffic,” Bloom said. “Think now if these cameras are attached to a system that understands what it’s seeing. Integrating all this information, it can really change the way cities get managed.”

Sound’s Good

Computers listening for sounds, vibrations, sound waves and even sound pressure will be able to detect and predict events in ways that humans can’t. Sensors along fault zones could provide early warnings of earthquakes, for example, allowing areas to be evacuated before actual seismic activity started.
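As a rough illustration of that kind of monitoring, the sketch below watches a stream of ground-motion readings and flags a sudden jump well above the recent baseline. The window size and threshold are arbitrary assumptions, not a real seismic-detection algorithm.

```python
# Hedged sketch: flag samples whose amplitude climbs far above the recent
# baseline. Window size and ratio are arbitrary assumptions.

from collections import deque

def monitor(readings, window=50, ratio=5.0):
    """Yield the sample indexes at which an alert would fire."""
    baseline = deque(maxlen=window)
    for i, amplitude in enumerate(readings):
        if len(baseline) == window:
            avg = sum(baseline) / window
            if avg > 0 and amplitude > ratio * avg:
                yield i  # possible early arrival: warn before stronger shaking
        baseline.append(abs(amplitude))

quiet = [0.01] * 100
shaking = [0.4, 0.5, 0.6]
print(list(monitor(quiet + shaking)))  # indexes where the signal jumps above background
```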

Sound sensors, combined with other sensors to measure vital signs, might give babies a computerized “voice.” Computers could decipher “baby talk” by listening to infants coo while observing physiological information such as heart rate, pulse and temperature. This information could determine whether a fussy baby is hungry, tired, in pain, or in need of attention.

“People have been trying to figure out when a baby babbles, what it’s saying,” noted Bloom. “If we can now collect information and correlate it, we will be able to translate that babble.”
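A toy example of the correlation Bloom describes: the rule-based sketch below combines a cry’s pitch and duration with vital signs to guess at a cause. The categories and thresholds are invented for illustration; a real system would learn such correlations from data rather than hard-coding them.

```python
# Illustrative only: a toy rule-based "translator" that correlates a cry's
# acoustic features with vital signs. Categories and thresholds are assumptions.

def interpret_cry(pitch_hz: float, duration_s: float,
                  heart_rate_bpm: int, temperature_c: float) -> str:
    if temperature_c >= 38.0:
        return "possible fever - check on the baby"
    if pitch_hz > 500 and heart_rate_bpm > 160:
        return "likely pain or distress"
    if duration_s > 20 and pitch_hz < 400:
        return "probably tired"
    if heart_rate_bpm < 140 and duration_s < 10:
        return "may just want attention"
    return "possibly hungry"

print(interpret_cry(pitch_hz=550, duration_s=8, heart_rate_bpm=170, temperature_c=36.9))
```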

The same technology could be applied to people with communication disabilities.

A Matter of Taste

A deeper understanding of taste, down to the molecular level, could help make foods more appealing, perhaps enabling people to get more enjoyment out of healthy foods.

“This is creating digital taste buds that will help us eat smarter,” said Bloom. “We will be able to provide ways of making broccoli more palatable.”

The technology uses algorithms to analyze the molecular structure behind flavors and craft new combinations, or to understand why one person likes a flavor that others do not. It could be used to add more appealing flavors to healthy foods.
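One hypothetical way such an algorithm might score combinations, assuming ingredients that share more flavor compounds pair well together; the compound lists below are placeholders, not real chemistry data.

```python
# Placeholder data: toy flavor-compound sets for a handful of ingredients.
FLAVOR_COMPOUNDS = {
    "broccoli": {"sulfur_notes", "green", "earthy"},
    "cheddar": {"sulfur_notes", "buttery", "nutty"},
    "almond": {"nutty", "sweet", "buttery"},
    "lemon": {"citrus", "green", "sweet"},
}

def pairing_score(a: str, b: str) -> float:
    """Jaccard overlap of flavor-compound sets: higher means a closer match."""
    x, y = FLAVOR_COMPOUNDS[a], FLAVOR_COMPOUNDS[b]
    return len(x & y) / len(x | y)

# Rank partners that might make broccoli more palatable.
partners = sorted(
    (p for p in FLAVOR_COMPOUNDS if p != "broccoli"),
    key=lambda p: pairing_score("broccoli", p),
    reverse=True,
)
print(partners)
```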

It could also be used to help people with diabetes — or those who follow strict diets for any reason — find their meals more appealing.

The Smell Test

Computers of the future might be armed with smell receptors to detect the presence of germs and illness, or replicate the smell of objects, such as produce, to bolster e-commerce.

IBM is working on technology that will detect the presence of antibiotic-resistant bacteria such as Methicillin-resistant Staphylococcus aureus, or MRSA, in a room.

In IBM’s labs, other scientists are working on technology for cellphones that will detect the early onset of the common cold or other health conditions from your breath as you talk on the phone.

“There is some work going on in the near term in this area, where a sensor can detect odors, biomarkers, and different molecules in someone’s breath,” Bloom explained. A person can speak into a phone, for example, and “the phone can say there’s an 80 percent chance of coming down with a cold.”
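As a purely illustrative sketch of how such a reading might be turned into a percentage, the snippet below combines made-up breath-biomarker values with a logistic function. The biomarker names and weights are assumptions, not a diagnostic model.

```python
# Hedged sketch: combine hypothetical breath-biomarker readings into a single
# probability with a logistic function. Names and weights are made up.

import math

WEIGHTS = {"nitric_oxide_ppb": 0.08, "acetone_ppm": 0.5, "isoprene_ppb": 0.01}
BIAS = -4.0

def cold_probability(sample: dict) -> float:
    score = BIAS + sum(WEIGHTS[k] * sample.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

reading = {"nitric_oxide_ppb": 45.0, "acetone_ppm": 1.2, "isoprene_ppb": 90.0}
print(f"{cold_probability(reading):.0%} chance of coming down with a cold")
```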

The Road Ahead

IBM’s 5 in 5 predictions are based on work going on in its labs and at other companies to develop cognitive computing systems and technologies that replicate and augment the five senses in new ways.

“What we’ve seen is the amount of processing capability and amount of storage space is so great, you need the next generation in computing to handle this in real time,” said Bloom. “As we look to where new technologies bring us, it allows us to take our limitations away and allows our imaginations to see what life will be like in five years.”

These advances are meant to enhance the work of humans, not replace it. Computers will be able to detect and analyze beyond human abilities.

“Beyond mimicry, the task is information synthesis and sensemaking, and even artificial sensation production, in an effort to retool human-machine interfaces to more naturally suit the ways people experience the world and consume and use information,” Alta Plana Corporation founder Seth Grimes, an industry analyst who covers sentiment and content analysis technologies, told TechNewsWorld.

The advances covered by cognitive computing have near-term uses, but they also point down the road to more ways computers can aid humans in their daily lives.

“What IBM is envisioning in the near term is advanced computers that could have these capabilities to provide systems to support human beings,” Institute for Global Futures CEO James Canton, Ph.D., told TechNewsWorld.

“It’s not just about systems that can imitate human sensation — it’s about the higher order of cognition that I think is the ultimate endgame for cognitive computing,” he pointed out.

This is the seventh year that IBM has published a list of five innovations that will change our lives in the next five years.
