An important theme in systems neuroscience is to understand the fundamental principles by which brain cells work together to give rise to behavior. Neuroscience experiments often require high-speed in vivo neural recordings, as neural computations occur on short timescales and are distributed over large tissue
volumes. Optical imaging in awake animals expressing genetically encoded calcium, voltage, or chemical indicators has become an essential tool in neuroscience. A longstanding quest has been to develop imaging tools that can record large-scale volumetric neural activity in behaving animals. Of particular
interest is to investigate the functional dynamics of astrocytes. Accumulating evidence has shown that
astrocytes, the most numerous glial cells, play active roles in many neurological diseases and neuropsychiatric disorders. Yet the patterns of astrocytic activity, and their relationships with behavior in health and disease, remain largely unclear. In particular, astrocyte imaging in the brains of freely behaving animals has not yet been explored, owing to the lack of a miniaturized high-speed volumetric imaging system.
Recent progress in light-sheet fluorescence microscopy (LSFM) has shed light on this topic. At present,
however, LSFMs are not suitable for high-speed volumetric imaging in freely behaving animals due to several key limitations, including the physical constraints of LSFM instruments and light scattering and absorption in deep tissue. The goal of this proposal is to address these technical barriers and transform LSFM into a novel tool for astrocyte imaging in freely behaving animals. To achieve this goal, we envision a transformative machine learning-augmented miniature adaptive optics light-sheet microscopy platform, termed miniature Selective Plane Illumination Microscopy, built on the following ideas. First, we will
leverage a fiber bundle-coupled system design to address the physical constraints of LSFM in order to image
in freely behaving mice. We will engineer a miniature planar illumination scan head and use a coherent fiber
bundle to couple the scan head to the main optical system, which will be integrated into a compact form so that it can be carried by a motorized rotary stage to accommodate the animal's movement. Second, we will develop a
hybrid adaptive optics (AO)-deep learning approach to enhance imaging performance in deep tissue. The AO module will build on recent progress in direct wavefront sensing with an extended-source wavefront sensor, while a Monte Carlo-based tissue optics model, informed by detailed neuroanatomy, will generate realistic simulated image datasets for deep learning-based image processing.
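To make the Monte Carlo tissue optics idea concrete, below is a minimal sketch of the photon random-walk kernel such a model builds on: photons take exponentially distributed free paths, lose weight to absorption, and redirect via Henyey-Greenstein scattering. All optical parameters here (mu_a, mu_s, anisotropy g, slab thickness) are illustrative placeholders, not measured brain-tissue values or parameters from this proposal.

```python
# Minimal Monte Carlo photon random walk in a scattering slab.
# Parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

MU_A = 0.2    # absorption coefficient (1/mm), assumed
MU_S = 10.0   # scattering coefficient (1/mm), assumed
G = 0.9       # Henyey-Greenstein anisotropy, assumed
MU_T = MU_A + MU_S
ALBEDO = MU_S / MU_T

def hg_cos_theta():
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    u = rng.random()
    frac = (1 - G * G) / (1 - G + 2 * G * u)
    return (1 + G * G - frac * frac) / (2 * G)

def scatter(d, ct, phi):
    """Rotate direction d by polar angle theta (cos = ct), azimuth phi."""
    st = np.sqrt(max(0.0, 1 - ct * ct))
    ux, uy, uz = d
    if abs(uz) > 0.99999:                      # nearly parallel to z-axis
        return np.array([st * np.cos(phi), st * np.sin(phi),
                         ct * np.sign(uz)])
    den = np.sqrt(1 - uz * uz)
    return np.array([
        st * (ux * uz * np.cos(phi) - uy * np.sin(phi)) / den + ux * ct,
        st * (uy * uz * np.cos(phi) + ux * np.sin(phi)) / den + uy * ct,
        -st * np.cos(phi) * den + uz * ct,
    ])

def max_penetration(slab=2.0):
    """Walk one photon until it exits the slab or its weight dies;
    return the deepest z (mm) it reached."""
    pos = np.zeros(3)
    d = np.array([0.0, 0.0, 1.0])              # launched along +z
    weight, zmax = 1.0, 0.0
    while weight > 1e-4 and 0.0 <= pos[2] < slab:
        pos = pos - np.log(rng.random()) / MU_T * d   # free path step
        zmax = max(zmax, pos[2])
        weight *= ALBEDO                        # partial absorption
        d = scatter(d, hg_cos_theta(), 2 * np.pi * rng.random())
    return zmax

depths = [max_penetration() for _ in range(2000)]
print(f"mean max penetration: {np.mean(depths):.2f} mm")
```

A full model would add layered tissue geometry, refractive-index boundaries, and fluorescence emission, and would pair each simulated image with its scattering-free ground truth to supervise the network.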
Finally, we will validate system performance in widely used neuroscience experiments, to ensure that the system can be readily adopted by the neuroscience community.