The study of visual behaviors for navigation

From Insects to Robots

A great variety of living systems share the same environment and make use of it. They walk, crawl, fly, swim; they orient themselves, remember, organize, communicate, survive. They can feel, smell, taste, hear, and see. Humans can also "invent" by assembling "new things" that are not essential for survival: harmonies (music, paintings, dance), theories, machines. Disregarding this unique feature (call it creativity or fantasy), humans are not much different from other animals (and in some respects even less complex).

Fortunately, the goal of robotic science is not to "invent" creative machines but to study and build machines that can survive in a physical environment while doing, if possible, useful things. Robots are not (yet) required to invent, but to pursue a well-defined goal and to cope with known but unpredictable events ("I know I may find an obstacle and I know how to cope with it; I do not know when and where it will appear"). This is very much what all animals do, however evolved and complex, if one considers survival (as individuals and as a species) to be their sole goal. This commonality of intent and environment has produced, over the course of evolution, common behaviors that must rest on similar computational solutions, in spite of the enormous variety of engineering solutions found in natural systems.

The research described here compares some of the engineering solutions found in the human and insect visual systems and proposes solutions for the behavioral control of artificial systems, with particular reference to motion control for navigation. Some activities have been explicitly inspired by studies on the behavior of honeybees, while others have been implemented with the primary goal of simplifying as much as possible the solution of behavioral problems in a navigation setting. Three behaviors have been investigated so far:

With these experiments we intend to demonstrate that, in spite of the differences between human and insect visual systems, the computational solutions adopted have very strong commonalities and can be efficiently implemented in artificial eyes. The main computational tool used in these experiments is optical flow.
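To make the role of optical flow concrete, a minimal sketch follows. This is an illustration of the classic Lucas-Kanade least-squares flow estimate written in NumPy, not the implementation used in these experiments; the function name and the synthetic test pattern are assumptions for demonstration only:

```python
import numpy as np

def lucas_kanade_flow(img1, img2):
    """Estimate a single global flow vector (u, v) between two frames
    by least-squares on the brightness-constancy constraint
    Ix*u + Iy*v + It = 0 (Lucas-Kanade, applied over the whole image)."""
    # Spatial gradients (central differences) and temporal gradient.
    Ix = np.gradient(img1, axis=1)
    Iy = np.gradient(img1, axis=0)
    It = img2 - img1
    # Normal equations of the least-squares problem.
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)

# Synthetic check: a smooth pattern translated by a small subpixel shift.
x, y = np.meshgrid(np.arange(64, dtype=float), np.arange(64, dtype=float))
frame1 = np.sin(0.3 * x) + np.cos(0.2 * y)
shift = 0.25  # true horizontal displacement in pixels
frame2 = np.sin(0.3 * (x - shift)) + np.cos(0.2 * y)
u, v = lucas_kanade_flow(frame1, frame2)  # u should be close to 0.25, v close to 0
```

In a navigation context the same computation is applied over local windows rather than the whole image, yielding a flow field whose structure (e.g. left-right flow balance, expansion) can drive behaviors such as corridor centering or obstacle avoidance.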


The Team: José Santos-Victor, Giulio Sandini
