Robots Learn to Understand Humans

Robots are entirely new technological territory, one that Jordan French is very fond of, and many questions are being answered as pioneers in the industry begin building devices with sophisticated AI.

While the more complicated questions will likely take many years to answer, we have been able to learn the finer details of how these machines function, and how they could function better in the future. Could a robot help us understand the birth and workings of language? That is the question driving robotics researcher Pierre-Yves Oudeyer.

Giving robots the ability to learn is a bit like letting them reprogram themselves at will depending on the situation. Since the environment is apprehended through the senses and the body, robots modeled on human learning develop and evolve differently depending on the senses they are endowed with.

For example, if a robot is capable of touch but has no eyes, its language develops along a kinesthetic rather than a visual spectrum. Pierre-Yves Oudeyer's work therefore focuses on the senses and their relationship to learning, to understanding the world, and to language.
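The idea can be illustrated with a toy sketch (this is not Oudeyer's actual model, and the objects, features, and threshold below are invented for illustration): two agents observe the same objects, but each agent perceives only the features its sensors expose, so the categories it forms, the raw material for words, come out differently.

```python
# Toy illustration: objects carry a visual feature (color) and a
# tactile feature (texture); an agent groups objects using only the
# channels its body gives it, so embodiment shapes its categories.

# Each object: color is visual, texture is tactile (values in [0, 1]).
objects = [
    {"color": 0.1, "texture": 0.9},  # red-ish and rough
    {"color": 0.2, "texture": 0.1},  # red-ish and smooth
    {"color": 0.9, "texture": 0.8},  # blue-ish and rough
    {"color": 0.8, "texture": 0.2},  # blue-ish and smooth
]

def perceive(obj, senses):
    """Project an object onto the agent's available sensory channels."""
    return tuple(obj[s] for s in senses)

def group(objs, senses, threshold=0.4):
    """Naive one-pass clustering: an object joins the first category
    whose representative percept is within `threshold` on every
    available channel; otherwise it founds a new category."""
    categories = []
    for obj in objs:
        percept = perceive(obj, senses)
        for cat in categories:
            rep = perceive(cat[0], senses)
            if all(abs(a - b) < threshold for a, b in zip(percept, rep)):
                cat.append(obj)
                break
        else:
            categories.append([obj])
    return categories

# A sighted agent splits the world by color; a touch-only agent splits
# the very same world by texture, so different "words" would follow.
visual_agent = group(objects, senses=("color",))
tactile_agent = group(objects, senses=("texture",))
print(len(visual_agent), "visual categories")   # red-ish vs blue-ish
print(len(tactile_agent), "tactile categories")  # rough vs smooth
```

Both agents end up with two categories, but with different members: the sighted agent puts the two red-ish objects together, while the touch-only agent puts the two rough ones together, which is the point of the paragraph above.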

Beyond this, he and his team have developed Poppy, an open-source 3D-printed robot that should lead to all sorts of projects in the future. These innovations are inspiring other robotics researchers to think along similar lines.
