Representation and integration of diverse visual features in circuits and behavior

PROJECT SUMMARY

Visual perception and visually guided behavior rely on the brain's ability to combine information about multiple aspects of a visual scene. In all sighted animals, including humans, the visual system uses parallel processing strategies to represent different features of a scene in distinct visual pathways or populations of neurons. How, then, does the brain combine information about these different aspects of the visual world into a unified percept, and how do animals guide behavior using multiple visual cues that are encoded by separate pathways? The research performed under this award will address these questions using the visual system of the fruit fly Drosophila as a model, which allows precise, genetically defined neural populations representing visual cues to be identified and targeted. This work will: (1) determine how diverse visual features are jointly represented by populations of visual projection neurons that carry information from the visual system to the rest of the brain; (2) determine how interactions among these visual projection pathways shape the visual information they encode; and (3) determine how different features of a scene are combined to guide behavior. This work will advance the core mission of the National Eye Institute by contributing to the understanding of visual function. The research will draw on a wide array of modern approaches in neuroscience, including functional imaging of brain activity, genetic manipulations, computational and statistical modeling, and behavioral measurements. In addition to the research described above, this award will support the awardee's transition to an independent research career at a research-intensive university and further support the establishment of their visual neuroscience laboratory.