
The Advent of the Superhumanly Intelligent UI

Fierce competition in the smartphone market is driving massive change in user interfaces (UIs). Mice and keyboards are so yesterday; touch, multitouch and gestures form the core of UIs today.

Now we have Google TV bringing apps to the TV set, and television, streaming video and the Internet are becoming ever more closely intertwined.

Where is this taking us? What might we see five years from now in terms of user interfaces?

Doin’ It Right in the Wrong Side of Town

User interfaces will go in a whole new direction, Peter Eckert, cofounder of Projekt202, told TechNewsWorld.

“Mobile devices will become the bridge between other devices,” Eckert said. “What if your mobile device becomes an extension of other devices? What if you can wear it?”

The problem right now, Eckert suggested, is that companies offering smart devices or multiple tools are designing each one to handle one facet of life.

“Look at the conglomerates of this world — Sony, Samsung or Honda,” Eckert said. “They offer almost any device you can imagine. Samsung has TVs, mobile phones, appliances and refrigerators, and yet they’re reinventing the UIs on them all over and over again and failing completely to look at what makes you go to the refrigerator in the first place. Why do you turn on your washer and dryer an hour earlier today?”

New UIs may be designed to deal with why people do what they do and how to help them do it, Eckert remarked.

“I want to find ways where I, as the end user, don’t have to worry about how to do things, and just need to focus on the doing,” Eckert said. “Right now, from device to device, I have to understand the conventions for each device I use.”

Working Together

Ultimately, we will get to the point posited by Vernor Vinge back in 1993 — a point at which computer networks and their users meld into a superhumanly intelligent entity and computer/human interfaces become so intimate that users may reasonably be considered superhuman, Eckert proposed.

That singularity is perhaps what Google CEO Eric Schmidt envisioned when he spoke recently about Google storing users’ memories and information, which they can call up at will.

The UI of the future will enable social interaction in real time, Harry Brignull, a user experience consultant at Madgex who blogs at 90 Percent of Everything, told TechNewsWorld.

“Even when people gather around a non-digital surface like a whiteboard, they interact socially and they use the whiteboard together at the same time,” Brignull pointed out. “Why don’t we have the technological equivalent to this yet?”

This would require simultaneous multi-user interaction along the lines of multiplayer gaming consoles, but taken further. While game consoles are great for gaming together, they won’t let users swap photos or set up activities on social networks, Brignull said.

The living room of the future will contain multiple screens ranging from mobile devices to tablet PCs to TVs and wall displays, all of which will interact with each other.

“You’ll be able to walk into someone’s home with your own devices, and it’ll be trivial to link them all up,” Brignull said. “I’ll be able to use my handset to flick photos up onto your TV screen with a single gesture. If I want to play you something from my music collection, I’ll be able to broadcast it to your home stereo in a couple of taps.”
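In protocol terms, that vision starts with devices being able to find each other on a shared network. The sketch below is purely hypothetical (the group address, port and message format are invented for illustration) and shows just one building block: a device announcing itself over UDP multicast so nearby screens can discover it. Real products would more likely rely on established discovery protocols such as mDNS/Bonjour or DLNA.

import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;
import java.net.SocketTimeoutException;
import java.nio.charset.StandardCharsets;

public class DeviceAnnouncer {
    private static final String GROUP = "230.0.0.1"; // illustrative multicast group
    private static final int PORT = 4446;            // illustrative port

    public static void main(String[] args) throws Exception {
        InetAddress group = InetAddress.getByName(GROUP);
        byte[] hello = "DEVICE phone-1 offers photos,music".getBytes(StandardCharsets.UTF_8);

        MulticastSocket socket = new MulticastSocket(PORT);
        socket.joinGroup(group);

        // Announce this device to anything listening on the same group.
        socket.send(new DatagramPacket(hello, hello.length, group, PORT));

        // Listen briefly for announcements from other devices in the room
        // (multicast loopback means this may include our own message).
        socket.setSoTimeout(2000);
        byte[] buffer = new byte[256];
        DatagramPacket incoming = new DatagramPacket(buffer, buffer.length);
        try {
            socket.receive(incoming);
            System.out.println("Heard: " + new String(
                    incoming.getData(), 0, incoming.getLength(), StandardCharsets.UTF_8));
        } catch (SocketTimeoutException e) {
            System.out.println("No other devices heard.");
        }

        socket.leaveGroup(group);
        socket.close();
    }
}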

Microsoft’s Surface and Kinect technologies are a step in the right direction, Brignull said.

Surface is a multitouch computer that responds to natural hand gestures and real-world objects. Kinect, formerly code-named “Project Natal,” is a controller-free input device for the Xbox 360 that uses a webcam-style sensor to let users control and interact with games through a natural user interface based on gestures, spoken commands or presented objects and images.
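To make that concrete, the following sketch shows the kind of classification step any gesture-driven UI depends on: deciding whether a trail of tracked 2D points amounts to a left or right swipe. It is an illustration only, not Surface or Kinect code, and the class name and thresholds are invented.

import java.util.List;

public class SwipeDetector {

    public enum Gesture { NONE, SWIPE_LEFT, SWIPE_RIGHT }

    // Minimum horizontal travel and maximum vertical drift, in pixels,
    // for a movement to count as a swipe; the values are illustrative.
    private static final double MIN_TRAVEL = 200.0;
    private static final double MAX_DRIFT = 80.0;

    // Classify a time-ordered trail of {x, y} samples from a touch or hand tracker.
    public static Gesture classify(List<double[]> trail) {
        if (trail.size() < 2) {
            return Gesture.NONE;
        }
        double[] first = trail.get(0);
        double[] last = trail.get(trail.size() - 1);
        double dx = last[0] - first[0];
        double dy = last[1] - first[1];

        // Require mostly horizontal motion of sufficient length.
        if (Math.abs(dx) >= MIN_TRAVEL && Math.abs(dy) <= MAX_DRIFT) {
            return dx > 0 ? Gesture.SWIPE_RIGHT : Gesture.SWIPE_LEFT;
        }
        return Gesture.NONE;
    }
}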

The Rise of the Machines

To fulfill the vision of new UIs enabling new methods of human/machine interaction, we will need hardware power. Lots of it.

“I expect the rise of intelligent interfaces shortly, through focusing on the extra processing power being provided by graphics chips to make interfaces more intelligent,” Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld. However, that intelligence will probably lag computer-based products by about a decade, he added.

“We have seen for some time that hardware acceleration has been the mantra for making UIs more animated, immersive and, certainly, engaging,” Al Hilwa, a research director at IDC, told TechNewsWorld. “At the same time, we’ve had a revolution in the smartness of mobile devices, which moved from pen to touch, entirely enabled by new hardware capabilities.”
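As one small, concrete example of what that acceleration looks like to an application developer: on Android, a view can be explicitly flagged to render through a GPU-backed layer while it animates. The snippet below is a minimal sketch, assuming an ordinary View is already in hand; the class and method names are invented for the example.

import android.view.View;

public class AccelerationHint {

    // Fade a view out on a GPU-backed hardware layer, then drop the layer
    // when the animation ends -- a common pattern for keeping view
    // animations smooth on hardware-accelerated displays.
    public static void fadeOutAccelerated(final View view) {
        view.setLayerType(View.LAYER_TYPE_HARDWARE, null);
        view.animate()
            .alpha(0f)
            .setDuration(300)
            .withEndAction(new Runnable() {
                @Override
                public void run() {
                    view.setLayerType(View.LAYER_TYPE_NONE, null);
                }
            });
    }
}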

New mobile devices run embedded operating systems and expose application programming interfaces (APIs) to hardware such as accelerometers, proximity sensors, gyroscopes, compasses and GPS radios. These capabilities, in turn, have driven changes to UIs, and consumer device UIs will continue to evolve.
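As an illustration of the kind of API involved, here is a minimal Android sketch (one platform among several) that registers for accelerometer updates through the platform's SensorManager; the activity and log-tag names are invented for the example.

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class TiltDemoActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor accelerometer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Deliver readings at a rate suitable for UI updates.
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Always unregister to avoid draining the battery in the background.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Acceleration along each axis, in m/s^2; a UI could use these
        // values to pan a map, scroll a list or detect a shake gesture.
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        Log.d("TiltDemo", "accel x=" + x + " y=" + y + " z=" + z);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}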

The Consumer Is King

As consumer mobile devices get more sophisticated and smarter, they’ll drive the evolution of enterprise devices and PCs.

“Mobile devices and the Web before them democratized computing and moved the center of gravity to the consumer,” Hilwa said. As consumers began bringing their mobile devices with advanced UIs into the workplace, they forced changes in enterprise UIs. “Now we have a situation where enterprise mobility has become complex and expensive, but also the UIs being used for corporate applications are outdated,” Hilwa said.

Enterprise UIs might end up going the route of consumer UIs: a streamlined core interface, with apps downloaded for additional functionality.

“Apple showed everyone you don’t have to make a phone that does everything; you just offer a phone that’s simple and easy to use and offer a platform so people can download the functionalities they want,” Projekt202’s Eckert said.
