Straw Group Research

Neural circuits for vision

Circuit neuroscience has been revolutionized in the last few years by the use of genetics to express proteins such as channelrhodopsin or GCaMP in molecularly defined neuronal classes. Concurrently, a recent trend in cognitive science is to model the mind within a quantitative framework in which the purpose of perception, decision-making, and learning is understood in terms of Bayesian inference. Visually guided behavior of the fruit fly is an ideal experimental system for linking these approaches to establish a mechanistic understanding of neuronal circuit function as well as the computational purpose of behavior. This is the focus of our work.
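As a minimal sketch of this Bayesian framing (the Gaussian prior, observation, and noise values below are illustrative, not experimental estimates), a posterior estimate of a stimulus property is a precision-weighted combination of prior belief and noisy sensory evidence:

```python
# Minimal worked example of Bayesian inference over a stimulus property.
# All numbers are arbitrary and purely illustrative.
def gaussian_posterior(prior_mean, prior_var, obs, obs_var):
    """Combine a Gaussian prior with a Gaussian likelihood: the posterior mean
    is a precision-weighted average of the prior mean and the observation."""
    w_prior = 1.0 / prior_var
    w_obs = 1.0 / obs_var
    post_mean = (w_prior * prior_mean + w_obs * obs) / (w_prior + w_obs)
    post_var = 1.0 / (w_prior + w_obs)
    return post_mean, post_var

# Prior: an object is probably straight ahead (0 deg). Observation: 10 deg,
# with less visual noise than the prior is broad, so the posterior sits
# closer to the observation.
print(gaussian_posterior(prior_mean=0.0, prior_var=4.0, obs=10.0, obs_var=1.0))
# -> (8.0, 0.8)
```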

Engineering tools for quantitative behavior

Figure 1

A major challenge in modern neuroscience is to understand how behavior emerges from the interplay of multiple physiological processes and an animal's environmental surroundings. Because the circumstances under which animals evolve differ greatly from most laboratory experimental paradigms, the results of many behavioral assays are hard to relate to behaviors shaped by natural selection, which in turn makes it hard to reason about the purpose of a particular behavior. We address this problem by creating novel experimental devices that allow laboratory-level rigor in experiments performed under naturalistic conditions.

One set of our technical innovations is a virtual reality system capable of simulating arbitrary visual scenes. Using live animal tracking, we place a tethered or freely moving animal in a virtual environment that is physically rendered on multiple computer displays. We make use of recent advances in computer graphics technology, such as GPU shaders, to achieve real-time luminance blending across multiple projection paths and onto arbitrarily shaped projection surfaces.
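The core of the luminance-blending step can be sketched as follows. In practice this computation runs per pixel in a GPU fragment shader; the NumPy version below, including the coverage-ramp shapes and function names, is an illustrative assumption rather than our actual implementation:

```python
# Sketch: per-projector blend weights that keep combined luminance uniform
# where the images from overlapping projectors meet on the display surface.
import numpy as np

def blend_weights(coverage_maps):
    """Given one coverage map per projector (values in [0, 1] describing how
    strongly each projector reaches each point of the surface), return weight
    maps that sum to 1 wherever coverage overlaps, so overlapping regions are
    not rendered brighter than single-projector regions."""
    coverage = np.stack([np.clip(c, 0.0, 1.0) for c in coverage_maps], axis=0)
    total = coverage.sum(axis=0)
    safe_total = np.where(total > 0.0, total, 1.0)  # avoid divide-by-zero off-surface
    return coverage / safe_total

# Usage: two projectors whose coverage ramps overlap in the middle of a 1D surface.
x = np.linspace(0.0, 1.0, 9)
proj_a = np.clip((0.66 - x) / 0.33, 0.0, 1.0)   # fades out toward the right
proj_b = np.clip((x - 0.33) / 0.33, 0.0, 1.0)   # fades in from the left
w_a, w_b = blend_weights([proj_a, proj_b])
assert np.allclose((w_a + w_b)[proj_a + proj_b > 0.0], 1.0)
```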

Mapping neural circuits for visual locomotor guidance in Drosophila

Figure 2

In one set of related ongoing projects, we are performing genetic manipulations to silence small numbers of neurons and measuring behavioral deficits in response to precisely defined visual stimuli. Recent genetic techniques provide specificity down to single cell types, and we use a variety of effector molecules that kill cells, hyperpolarize them, or block chemical synaptic transmission. A completely automated free-flight virtual reality arena allows us to perform visual psychophysics experiments in high throughput, and a tethered flight apparatus enables us to measure and stimulate with high spatial and temporal resolution. As this setup is similar to the type required for electro- and opto-physiology, it permits us to make direct predictions for such work. The basic idea of these experiments is simple: eliminate a group of cells and measure the associated behavioral deficits.
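The analysis logic of such an experiment can be sketched as a comparison of a behavioral readout between flies whose target neurons were silenced and genetic controls. The simulated turning gains, sample sizes, and permutation test below are illustrative assumptions, not our data or analysis pipeline:

```python
# Sketch of the silencing-experiment logic: compare a behavioral metric
# (here, turning gain to a rotating wide-field stimulus) between silenced
# flies and genetic controls. All values are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(control, silenced, n_perm=10_000):
    """Two-sided permutation test on the difference of group means."""
    observed = silenced.mean() - control.mean()
    pooled = np.concatenate([control, silenced])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[len(control):].mean() - pooled[:len(control)].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, count / n_perm

# Hypothetical per-fly turning gains for each genotype.
control  = rng.normal(loc=1.0, scale=0.2, size=30)   # genetic control flies
silenced = rng.normal(loc=0.6, scale=0.2, size=30)   # target neurons silenced
effect, p = permutation_test(control, silenced)
print(f"mean deficit = {effect:+.2f}, p = {p:.4f}")
```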

How flies resolve conflicting visual information – cells and models

Several fundamental visual behavioral modules, such as wide-field rotational stabilization and object approach, shape a fly's trajectory as it flies through its habitat. How do these modules interact? From basic principles it is clear that an organism should respond to a single stimulus in situations in which attempting to respond to both simultaneously would be counterproductive. We are now dissecting the algorithmic and neural basis of such situations and of the higher-level visual behaviors that necessarily arise as a fly travels through a more realistic natural environment. The neural substrate of these computations is being mapped with the strategy described above. We are interested not only in the relevant circuits, but also in the conceptual and algorithmic components of the problem.
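Two simple candidate rules for combining such modules illustrate the algorithmic question. The steering commands and gains below are illustrative assumptions, not a fitted model of fly behavior:

```python
# Sketch contrasting two candidate rules for turning conflicting module
# outputs (stabilization vs. object approach) into one steering command.
def steering_summation(stab_cmd, obj_cmd, w_stab=1.0, w_obj=1.0):
    """Linear superposition: both modules always contribute to steering."""
    return w_stab * stab_cmd + w_obj * obj_cmd

def steering_selection(stab_cmd, obj_cmd):
    """Winner-take-all: only the module with the stronger drive steers,
    as expected when responding to both would be counterproductive."""
    return stab_cmd if abs(stab_cmd) >= abs(obj_cmd) else obj_cmd

# Conflicting demands: wide-field motion calls for a rightward corrective
# turn while an attractive object sits to the left (signed turning commands).
stab_cmd, obj_cmd = +0.8, -0.5
print(steering_summation(stab_cmd, obj_cmd))   # 0.3 -> compromise heading
print(steering_selection(stab_cmd, obj_cmd))   # 0.8 -> stabilization wins
```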