Robots are everywhere. But for them to be useful, they have to be programmed by people. Computer scientists are now looking for ways to teach robots how to teach themselves.
Lars Schillingmann, a computer scientist at CoR-Lab, a research institute for Cognition and Robotics at the University of Bielefeld, is playing with toy cups.
"Look, the green cup goes in the blue one, the yellow one goes in the green one and the red one goes in the yellow one."
Sitting at a table in a darkened room, he is surrounded by monitors displaying computer code and graphics.
As he patiently stacks the cups, a research robot stands in front of him.
iCub is the size of a small child, with a plastic head, big eyes and chubby cheeks.
The robot, complete with legs, arms and head, closely follows what Schillingmann is doing at the table. He repeats what he sees.
"The robot," says Schillingmann, "sees what I show him."
"When I show him a green cup, you see that there in the picture [on the computer screen], and how he evaluates the instruction."
Scientists working on the project want iCub to be able to learn how to combine and interpret, or understand, both visual and acoustic information as people express it.
"He listens for which words I emphasize. For example I can say to him, the blue cup. Then he knows, the blue cup is interesting. And if I say, the blue cup while showing it to him, then he knows he needs to combine these two bits of information."
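The combination Schillingmann describes, an emphasized word in speech reinforcing what the teacher is visibly showing, can be sketched in a few lines. This is a simplified illustration of the general idea, not CoR-Lab's actual system; the function name, scores and fusion rule are invented for the example.

```python
# Toy sketch of audio-visual cue combination: an object scores higher
# when the spoken emphasis and the visual demonstration agree.
# All names and numbers here are illustrative, not from the real iCub code.

def combine_cues(speech_emphasis, visual_salience):
    """Merge per-object scores from speech (which words were stressed)
    and vision (which object is being shown) into one combined score."""
    objects = set(speech_emphasis) | set(visual_salience)
    combined = {}
    for obj in objects:
        s = speech_emphasis.get(obj, 0.0)
        v = visual_salience.get(obj, 0.0)
        # An object supported by both channels beats one supported
        # by either channel alone.
        combined[obj] = s + v + s * v
    return combined

# The teacher says "the BLUE cup" while holding up the blue cup:
speech = {"blue cup": 0.8, "green cup": 0.1}
vision = {"blue cup": 0.7, "red cup": 0.2}

scores = combine_cues(speech, vision)
target = max(scores, key=scores.get)
print(target)
```

Because the blue cup is supported by both channels, it wins over the green cup (speech only) and the red cup (vision only), which is the point of fusing the two streams rather than relying on either alone.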
Since 2007, CoR-Lab computer scientists have been working with engineers and neuroscientists, psychologists and linguists to develop machines that can adapt to human behavior and are also able to learn from people.
With robots becoming more common in today's workplaces and households, it is important that they can interact with and adapt to the humans around them, rather than rigidly sticking to how they were programmed.
"Please state your command!" says Jenny, a household robot prototype at Bonn-Rhine-Sieg University, in a high-pitched voice as she makes her way to the dining table and cleans up.
"I'm moving to the dining table. The table is now clean," she says without making a fuss when her chore is complete.
While robot prototypes like Jenny can clean tables and even make pancakes, says Sven Behnke, a professor at the Institute for Computer Science in Bonn, such tasks really aren't that useful, not yet anyway.
Learning by doing
CoR-Lab in Bielefeld has taken a different approach and developed cognitive robots.
"If in future we want robots to behave in a flexible manner, without having to program them first, we'll still have to teach them things. And we can do that by programming them to recognize what we're saying and showing them," says CoR-Lab's Lars Schillingmann.
He says iCub should learn the way children learn from adults: by listening, imitating and trying things out.
That is why the structure and sequence of the cup experiment was based on cognitive science studies of learning behaviors in three-year-old children.
It is hoped that iCub will one day be able to do more than simply recognize the properties ascribed to a particular object or copy certain procedures.
Schillingmann says robots like iCub will have to be able to understand instructions and carry them out independently.
"So, we can really teach him how to act," says Schillingmann. "In future it will be just as easy to communicate with robots as it is with people."