
Machine Speak: Robot Baby Learns Words

It’s a cute little robot learning how to say “green” and “blue.” And as part of a major project undertaken by robotics researchers at the University of Hertfordshire, it’s also promising to transform perceptions of how robots — and humans — learn language.

DeeChee, which is built to look, act and learn like a 6-to-14-month-old child, is the subject of a report recently published in PLoS ONE by Caroline Lyon, Chrystopher L. Nehaniv and Joe Saunders, researchers with Hertfordshire’s Adaptive Systems Research Group.

DeeChee is built on the open source iCub platform, which anyone can use to create similar robots for a variety of research purposes, including language acquisition.

iCubs are designed to appear like human children, based on the notion that the appearance of a robot affects how people interact with it. Their design also draws on the “embodied cognition hypothesis,” meaning that the robot’s own perceptions and development depend, in part, on its embodiment in humanoid form.

Emerging Words

DeeChee was programmed with basic sounds and a learning algorithm, approximating the capabilities and hard-wiring of a human baby. It was then subjected to intensive language training and tests, and over time it appears to have learned some basic words.

“The advent of humanoid robots has enabled a new approach to investigating the acquisition of language, and we report on the development of robots able to acquire rudimentary linguistic skills,” the researchers noted.

“In our experiments some salient one-syllable word forms are learnt by a humanoid robot in real-time interactions with naive participants. Words emerge from random syllabic babble through a learning process based on a dialogue between the robot and the human participant, whose speech is perceived by the robot as a stream of phonemes,” they explained.

The robot has made significant progress in acquiring, learning and speaking words, the team reported.

“Word forms are usually produced by the robot after a few minutes of dialogue, employing a simple, real-time, frequency dependent mechanism,” they wrote. “This work shows the potential of human-robot interaction systems in studies of the dynamics of early language acquisition.”
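The paper's description points to a simple picture: the robot keeps running frequency counts of the syllables it hears from its human teacher and weights its own babble toward the most frequent ones, so that salient word forms gradually displace random syllables. The sketch below is an illustrative assumption of how such a frequency-dependent babbler might work, not the authors' implementation; the class name, the bias parameter and the pre-segmented syllable input are all inventions for the example (the real system operates on raw phoneme streams in real time).

```python
import random
from collections import Counter


class BabbleLearner:
    """Toy frequency-dependent babbler: syllables heard more often
    in the teacher's speech become more likely to be produced."""

    def __init__(self, seed_syllables, bias=3.0):
        # Start every syllable with a small uniform count so early
        # babble is effectively random.
        self.counts = Counter({s: 1.0 for s in seed_syllables})
        self.bias = bias  # how strongly frequency shapes production (assumed value)

    def hear(self, utterance):
        """Update counts from the teacher's utterance, treated here as an
        already-segmented list of syllables (a simplifying assumption)."""
        for syllable in utterance:
            self.counts[syllable] += 1.0

    def babble(self, length=4):
        """Produce an utterance, sampling syllables with probability
        proportional to count raised to the bias exponent."""
        syllables = list(self.counts)
        weights = [self.counts[s] ** self.bias for s in syllables]
        return " ".join(random.choices(syllables, weights=weights, k=length))


if __name__ == "__main__":
    learner = BabbleLearner(["ba", "da", "gu", "red", "green", "square"])
    print("before training:", learner.babble())
    # A teacher repeatedly naming shapes and colors, as in the study's sessions.
    for _ in range(20):
        learner.hear(["look", "a", "red", "square", "red", "square"])
    print("after training: ", learner.babble())
```

In a toy run like this, the repeated teacher words quickly come to dominate the learner's output, which loosely mirrors the researchers' observation that word forms emerge after only a few minutes of dialogue.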

This research might, in fact, reveal as much about humans as it does about robots, according to Chris Robson, chief scientist with Parametric Marketing.

“There’s always a tension in artificial intelligence and robotics about how much we as the programmers teach up front, and how much we leave up to learning,” Robson told TechNewsWorld.

“What’s interesting about this research is that they’ve gone back to the very foundations of language development,” he explained. “That’s interesting both from the point of view of artificial intelligence and robotics, and also from the perspective of understanding human development. Outside of robotics, this might help us to understand how we as humans learn — how much is preprogrammed and how much is learned. It’s a very brave and fascinating approach.”

Real Language?

This research is part of a contemporary movement that recognizes that language acquisition does not occur in a vacuum, but in the real world, with embodied creatures, whether they're humans or robots, said Yaki Dunietz, president of Ai Research.

“In the past, most research in automated language acquisition was done with computer software — there was no need for a physical counterpart to the logical learning machine,” Dunietz told TechNewsWorld.

“Robotics projects, on the other hand, focused on the physical aspects: vision, motor control, etc. The iTalk program merges the two together: It tries to teach a physical robot to use natural language,” he explained.

“It is unclear why iCub should do any better than a non-physical counterpart — i.e., a software program designed to engage in conversation with a human trainer, and learn from him to speak in a manner similar to language acquisition by infants,” noted Dunietz. “It will be interesting to see how a bot that also possesses a physical body learns to speak better than a bodiless one.”

Is it really language, however, that DeeChee is learning? Or is it just rote repetition of sounds?

“It is definitely learning some language,” said Dunietz.

“The question is how much language is being learned. If the robot learns to respond to a certain input with a certain output, it has already learned some language. But computers have learned some (human) language from the day they were conceived,” he pointed out.

“The question is, of course, whether it shall pass the Turing Test: Whether it can converse, in plain English, with a native-English speaker, so that this competent speaker will grade the conversation as indistinguishable from that of a human,” Dunietz concluded.

Although this study is regarded as a step forward in ongoing language acquisition and robotics research, much remains to be done in the field.

“It seems that some learning was achieved, although the authors discuss many limitations,” Paola Escudero, senior lecturer at the MARCS Institute and visiting professor at the brain and cognition research program at the University of Amsterdam, told TechNewsWorld.

“I think it is a good first step, but the authors need to incorporate more important empirical findings to their models for their robot to perform more like a real infant/child,” Escudero said.

The gold standard is the acquisition of natural language that reflects true intelligence, according to Dunietz, and research like this brings humans ever closer to that goal.

“When computers are able to speak in a natural language, in a way indistinguishable to humans, that would be the final stage of a new, non-biological artificial life-form,” he said. “If Alan Turing was right in placing ‘intelligence’ or ‘rationality’ in the ability to hold a human-like conversation, then we are a step away from these new creatures: robots, or as I prefer to call them, ‘virtual personalities.’”
