
Teaching robots to make better decisions

Researchers at KTH Royal Institute of Technology are developing robots that learn new tasks on their own and can recognize that an object is a cup even though they have never seen anyone drink from it. The aim is for robots of the future to be able to perform tasks as complicated as those humans can manage.

Project Grant 2014

Interactive Physical Systems: Moduli Spaces, Inference and Control

Principal investigator:
Danica Kragic Jensfelt, Professor of Computer Science

Co-investigators:
Dimos Dimarogonas
Aaron Bobick
Florian Pokorny

Institution:
KTH Royal Institute of Technology

Grant in SEK:
SEK 18.6 million over five years

A robot is standing in front of the cooker making a meal. It needs to stir the stew, but cannot find a ladle. How is it to understand that it can just as well use a spoon, but not a pen?

This is the kind of question that Danica Kragic Jensfelt and her colleagues at KTH are working on in a project being funded by the Knut and Alice Wallenberg Foundation.

“We humans use our senses to understand our surroundings – we look at and feel the material the spoon is made of to decide whether it will be good for stirring with. Likewise, our aim is to enable robots to combine information from their sensors so they can make more intelligent decisions and interact better with the world around them,” Danica explains.

From industrial robots to smart home help

Robots have played an important part in industry for many years. For instance, they are used to assemble, weld and lift components. They can operate in hazardous working environments and perform tasks with great precision. More recently, robot technology has entered our homes. Nowadays there are self-propelled robot lawnmowers and vacuum cleaners. But robot characteristics and behavior largely remain limited to simple movements and pre-programmed tasks.

Danica and her fellow researchers want robots to interact with their surroundings in a more advanced way, and perform more complicated tasks. These include moving around in our homes to fetch things, cooking, helping in toilet use, monitoring people who need assistance, and raising the alarm in case of danger.

To do these things a robot must be able to adapt to its surroundings and cope with widely varying situations – far from the static environment of a conveyor belt in a factory. The researchers at KTH are therefore trying to get robots to learn new tasks on their own, with the help of information they already possess and information they receive from the people around them.

“It is not unlike how children learn. They collect information in various ways, see what their parents do, try it themselves, and then try again. Studying the interaction between different sensors is a fairly new area of robotics; the focus has hitherto been on information from individual sensors,” Danica explains.

Similarities make hard things easy

Danica lifts down a half-meter-long humanoid robot from a shelf. The project is based on robots that can pick up and move objects using rudimentary hands fitted with contact sensors. They can see with the help of cameras, hear sounds via microphones, and determine the weight of an object. But the basic research is not dependent on the form of the robot itself, and in the lab robots with more industrial functions are trained to grip and move bottles of detergent, cups and balloons.

A key element of the research is to develop ways for robots to obtain and act on information. Mathematical methods are used to develop new theories and new ways of organizing all the data generated by the robot’s sensors – data in the form of images, sounds or anything else the robot perceives – so the robot can use them to make decisions.

“We are trying to find relationships between various objects so that the robot can infer things about objects it has not seen before – it can generalize. Even if it has not seen a certain cup in use, it will be able to say it is a cup since it resembles another cup it has seen before – perhaps in terms of materials, weight and shape,” Danica says.

A simple example is using co-ordinate systems of various kinds to describe an object geometrically in order to determine the ways in which it resembles other objects.
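As a rough illustration of this idea, a hedged sketch is given below: objects are described by simple geometric feature vectors (the dimensions, weights and object names are purely illustrative, not data from the project), and cosine similarity is used to judge whether an unseen object resembles a known cup more than it resembles a pen.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative descriptors: [height_cm, width_cm, depth_cm, weight_g]
known_cup  = [9.0, 8.0, 8.0, 250.0]
new_object = [10.0, 9.0, 9.0, 300.0]  # an unseen, cup-like object
pen        = [14.0, 1.0, 1.0, 10.0]

# The unseen object's descriptor points in nearly the same direction as the
# known cup's, while the pen's does not - so a similarity threshold could let
# the robot label the new object "cup" without ever having seen it in use.
print(cosine_similarity(known_cup, new_object))  # close to 1.0
print(cosine_similarity(known_cup, pen))         # much lower
```

Real systems would use far richer descriptors (point clouds, material and contact-sensor readings), but the principle is the same: similar objects end up close together in feature space.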

Picking up a book or a cup with equal ease

The project involves collaboration between researchers from various disciplines: mathematics, robotics, control engineering, computer science and machine learning. This is needed to go the whole way from mathematical models, via computer programs, to testing to see that the robot behaves as the researchers want it to.

So far they have shown that a robot that knows how to hold a cup can use that knowledge to pick up a book, an object that is geometrically very different.

“With the theory we have begun to develop, the robot can generalize between different objects, different grips, and to an extent different materials,” Danica says.
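One very simplified way to picture this kind of grip transfer is a nearest-neighbour lookup: pick the grip that worked for the most similar known object. The sketch below assumes made-up feature vectors and grip names purely for illustration; it is not the project's actual method.

```python
import math

# Known objects: illustrative features [width_cm, height_cm, weight_g]
# mapped to a grip that previously worked for that object.
known = {
    "cup":  ([8.0, 9.0, 250.0], "wrap-around grip"),
    "book": ([15.0, 21.0, 400.0], "flat pinch grip"),
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def choose_grip(features):
    """Reuse the grip of the most similar known object."""
    _, (_, grip) = min(known.items(),
                       key=lambda kv: distance(features, kv[1][0]))
    return grip

# A thin notebook is geometrically closer to the book than to the cup,
# so the robot reuses the grip it learned on the book.
print(choose_grip([14.0, 20.0, 350.0]))  # -> flat pinch grip
```

The actual research replaces this toy lookup with mathematical theory that generalizes across grips and materials as well as shapes, but the intuition carries over: knowledge gained on one object is reused on its neighbours in feature space.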

The ability to generalize is a hot topic in robotics, and is crucial if robots are to become more independent.

The research project has been awarded funding for five years. During that period Danica’s aims include identifying the type of mathematics that makes learning most effective, and showing that a robot can adapt the force it uses to pick up an unknown, fragile object without breaking it. Another objective is for a robot to be able to say what the similarities are between apples and tomatoes.

“Is it just the shape, or is there anything else that the robot can measure? It is essential that we answer fairly straightforward, focused questions. This will make it easier for the robot to make more intelligent decisions,” Danica concludes.

Text Sara Nilsson
Translation Maxwell Arding
Photo Magnus Bergström

