Insect-inspired robotics

Building autonomous mobile robots necessarily involves solving tasks such as collision avoidance, gaze control, attitude control, navigation, and decision making in cluttered environments. Many of these tasks are either not yet satisfactorily solved on mobile robots or, at least, are not usually solved on the basis of optic-flow information.

Transferring the visual control mechanisms of insects to mobile robots is a promising approach, also from an engineering perspective. Our models of the computational mechanisms underlying visually guided behaviour in blowflies and honeybees are computationally lightweight compared with classical robot vision approaches. They can be implemented directly as control systems on mobile robot platforms and complement the robots' control framework for autonomous operation.

Insects actively shape the input to their visual motion detection system through a saccadic movement and gaze strategy. Applying insect mechanisms to robot control therefore also involves transferring these movement strategies to the robotic platform. We implement and test the models in closed-loop simulation with computer-generated images as well as on mobile robots carrying cameras that provide panoramic images.

Insect visual systems dynamically adapt to the properties of the stimulus. We investigate which of these adaptive mechanisms are useful when transferred to technical systems. Real-world tests are performed on a range of robotic platforms, such as a high-velocity robotic gantry carrying an artificial visual system, a wheeled platform, and a six-legged insect-inspired robotic platform (in cooperation with the other CITEC groups).
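The core of such lightweight motion-vision models is the correlation-type elementary motion detector (the Hassenstein-Reichardt correlator) that is widely used to describe motion detection in flies. A minimal sketch, assuming discretely sampled photoreceptor signals and a first-order low-pass filter as the delay stage (the function name and parameter values are illustrative, not taken from the group's implementation):

```python
import numpy as np

def emd_response(left, right, dt=1e-3, tau=0.05):
    """Correlation-type (Hassenstein-Reichardt) elementary motion detector.

    left, right: luminance signals of two neighbouring photoreceptors,
    1-D arrays sampled at interval dt. Each signal is delayed by a
    first-order low-pass filter (time constant tau) and multiplied with
    the undelayed signal of the neighbouring input; subtracting the two
    mirror-symmetric half-detectors yields a direction-selective output.
    """
    alpha = dt / (tau + dt)  # coefficient of the first-order low-pass
    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y
    # positive output for motion from the "left" to the "right" input
    return lowpass(left) * right - lowpass(right) * left
```

For a grating drifting from the left input towards the right input, the time-averaged output is positive; for the opposite direction it is negative, which is the direction selectivity exploited by the control models.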

The application of our models as control modules for mobile robots is not only interesting for robotics, but also serves as an important validation test of our experimentally established hypotheses.


Research issues currently under investigation include:

  • Implementation of a panoramic visual system with peripheral information processing and elementary motion detection
  • Analysis of the significance of adaptive processes for the performance of the robot
  • Implementation of insect-inspired collision avoidance algorithms based on optic flow information
  • Implementation of insect-inspired visual navigation mechanisms
  • Testing the performance of these modules under well-defined laboratory conditions as well as under complex outdoor conditions across a wide range of natural light levels
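A classic insect-inspired heuristic behind optic-flow-based collision avoidance is flow balancing: because nearby objects generate large translational optic flow, the robot turns away from the side of the visual field with the stronger flow. A minimal sketch under that assumption (the function name, field-of-view split, and the normalised turn command are illustrative, not the group's actual controller):

```python
import numpy as np

def steering_from_flow(flow_magnitudes):
    """Flow-balancing steering from a horizontal panorama of flow magnitudes.

    flow_magnitudes: 1-D array of optic-flow magnitudes sampled along the
    horizontal panorama, ordered from far left to far right of the current
    heading. Returns a turn command in [-1, 1]: negative = turn left,
    positive = turn right, i.e. away from the side with the larger flow.
    """
    n = len(flow_magnitudes) // 2
    left = np.mean(flow_magnitudes[:n])    # average flow, left hemifield
    right = np.mean(flow_magnitudes[-n:])  # average flow, right hemifield
    total = left + right
    if total < 1e-9:
        return 0.0  # no flow, no evidence about obstacles: keep heading
    return (left - right) / total
```

In a corridor this rule also reproduces the centring behaviour reported for honeybees, since equal flow on both sides yields a zero turn command.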