Overall research goals

Our overall research goal is to elucidate the computational principles that allow insects, such as flies and bees, to autonomously generate their virtuosic visually guided behaviour and to adapt this behaviour to the demands of complex, cluttered, and often unpredictable environments.

Autonomous behaviour implies a range of fundamental capabilities that may well generalize across species and are relevant even for autonomous artificial systems: animals should be able to stabilize an intended course against external disturbances as well as internal asymmetries. They should be able to detect behaviourally relevant objects that may either be approached or need to be avoided, depending on the behavioural context. In both cases, animals need to identify objects in the environment as behaviourally relevant, to make decisions between several behavioural alternatives, and to perform appropriate actions (Visually guided orientation behaviour in complex environments). Moreover, in many contexts it may be highly relevant to find a goal or to return to behaviourally relevant locations, often over large distances, such as the animal’s nest (Spatial learning and navigation). Solving such tasks requires the animal to gain, process, and potentially learn and retrieve specific information about its environment and, especially, about its spatial layout.

To accomplish their extraordinary performance in spatial vision tasks, flies and bees have been shown to shape the dynamics of the image flow on their eyes (“optic flow”) through their characteristic behavioural actions. The neural processing of information about the spatial layout of the environment is greatly facilitated in this way, especially because, during flight, the rotational and translational optic-flow components are largely segregated by a saccadic flight and gaze strategy. Because spatial information is contained only in the translational optic-flow component, this active vision strategy enables the nervous system to solve apparently complex spatial vision tasks in an efficient and parsimonious way. Hence, insects such as flies and bees appear to acquire at least part of their strength as autonomous systems through active interactions with their environment, rather than by simply processing passively gained information about the world. Here, adaptive processes of a wide range of complexity play a key role. They operate on timescales from milliseconds to hours and comprise adaptive changes in the sensory system (Adaptive mechanisms underlying visual motion processing) as well as genuine learning processes (Spatial learning and local navigation), depending on the behavioural task. In this way, even animals with tiny brains, such as insects, perform extraordinarily well by making optimal use of the closed action–perception loop.
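
The standard spherical motion-field equations make this point concrete. The sketch below is our own illustration, not the group’s model code; the function name, viewing direction, and velocity values are assumptions chosen for demonstration. It computes the optic-flow vector for a single viewing direction under pure rotation and pure translation, showing that the rotational term is independent of object distance while the translational term scales with nearness (the inverse of distance).

```python
import numpy as np

def optic_flow(d, D, v, omega):
    """Spherical motion field: image motion of a point at distance D along
    unit viewing direction d, for an observer translating with velocity v
    and rotating with angular velocity omega.

    The rotational term carries no distance dependence; the translational
    term scales with nearness 1/D, which is why spatial information is
    confined to the translational optic-flow component.
    """
    d = d / np.linalg.norm(d)
    rotational = -np.cross(omega, d)              # independent of D
    translational = -(v - np.dot(v, d) * d) / D   # scales with 1/D
    return rotational + translational

d = np.array([0.0, 1.0, 0.0])      # viewing direction: sideways (assumed)
v = np.array([1.0, 0.0, 0.0])      # forward translation, 1 m/s (assumed)
omega = np.array([0.0, 0.0, 1.0])  # yaw rotation, 1 rad/s (assumed)

for D in (0.5, 1.0, 2.0):          # object distances in metres
    print(f"D = {D} m: translation -> {optic_flow(d, D, v, np.zeros(3))}, "
          f"rotation -> {optic_flow(d, D, np.zeros(3), omega)}")
```

Running this for several distances reproduces the key asymmetry: the rotational flow vector is identical for all distances, whereas the translational flow vector shrinks as the object moves farther away.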

Model simulations, which are constituent elements of most of our projects, and robotic implementations, usually conducted in cooperation with other CITEC groups, show that these smart biological mechanisms of motion computation and visually guided orientation behaviour may also be helpful in the design of technical systems (Insect-inspired robotics).


Approaches to our research goals

To achieve our research goals, we apply quantitative behavioural approaches, such as high-speed cinematography of unrestrained behaviour or walking simulators in virtual-reality environments. Moreover, a variety of electrophysiological approaches are employed to resolve the underlying neuronal mechanisms. For visual stimulation we use artificial stimuli, but also image sequences that reflect what animals have seen during unrestrained locomotion in different behavioural situations (‘ego-perspective movies’). This enables us to interpret the computational properties of neurons and neuronal networks directly in a behavioural context. Our experimental analysis is complemented by computational modelling and by implementing biological principles of information processing in robots.
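
As a simple illustration of how such behavioural recordings relate to the saccadic gaze strategy described above, the sketch below segments a tracked yaw-orientation trace into saccades and intersaccadic intervals by thresholding angular velocity. This is a hypothetical example: the function name, the 200 deg/s threshold, and the 500 Hz sampling rate are our assumptions, not the group’s actual analysis pipeline.

```python
import numpy as np

def detect_saccades(yaw_deg, fs, threshold_deg_s=200.0):
    """Return a boolean mask marking saccadic samples.

    yaw_deg : tracked yaw orientation per frame, in degrees
    fs      : sampling rate of the tracking data, in Hz
    """
    yaw = np.unwrap(np.deg2rad(yaw_deg))          # remove 360-degree wraps
    ang_vel = np.rad2deg(np.gradient(yaw) * fs)   # angular velocity, deg/s
    return np.abs(ang_vel) > threshold_deg_s

fs = 500.0                              # e.g. a high-speed camera at 500 Hz
t = np.arange(0, 1, 1 / fs)
yaw = 5.0 * np.sin(2 * np.pi * t)       # slow intersaccadic drift...
yaw[200:215] += np.linspace(0.0, 40.0, 15)   # ...plus one fast 40-deg turn
yaw[215:] += 40.0                       # keep orientation after the turn

mask = detect_saccades(yaw, fs)
print(f"{mask.sum()} of {mask.size} samples classified as saccadic")
```

Segmenting trajectories in this way is what allows the intersaccadic intervals, in which optic flow is dominated by translation, to be isolated for further analysis or replayed as ego-perspective stimuli.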