In natural vision, objects change appearance over time as they translate, rotate, become occluded or undergo
complex transformations, e.g., during biological motion. In these dynamic environments, the visual cortex
integrates information over multiple spatial and temporal scales to compute motion trajectories and represent
the shape of objects. To understand how form and motion percepts are derived from such dynamic visual
input, we will investigate how neuronal responses in area V4—an intermediate stage along the ventral visual
pathway—are shaped by the spatiotemporal integration of time-varying visual stimuli. We will test the
hypothesis that the spatiotemporal characteristics of V4 neurons are suited for tracking dynamic objects:
specifically, that V4 motion signals arise from object tracking over longer spatiotemporal windows than those in
comparable dorsal-stream areas, and that V4 signals reflect form transformations at the object level rather than at
the level of the local retinal image. We will leverage the percept of long-range apparent motion to probe the role of V4
neurons in motion perception (Aim 1). When a stimulus intermittently skips across the visual field with large
spatial and temporal steps, it induces a strong illusory motion percept, but neurons in V1 and in area MT of the dorsal
visual stream are strikingly insensitive to the direction of the perceived motion. Psychophysical studies have
argued that long-range apparent motion relies not on the dorsal stream but on higher-order object-tracking
processes with large spatiotemporal windows in the ventral visual stream. We will conduct the first
neurophysiological investigations in the awake monkey to ascertain the role of V4 in the perception of long-
range apparent motion. Next (Aim 2), we will use dynamic stimuli that rotate in the fronto-parallel plane and that
translate and rotate in depth to determine whether V4 neurons encode other common dynamic object
transformations (beyond long-range translation), and whether this encoding is based on a sequence of static
poses, as in inferotemporal (IT) cortex, or on dynamic transformations. Finally, we will examine the encoding
and perception of partially occluded dynamic objects (Aim 3). When an occluded object moves, different parts
of the object are revealed over time, and integration across time and across multiple neuronal receptive fields is
required to build a representation of the whole object. As animals discriminate moving occluded objects, we will
record from 50-100 neurons simultaneously with high-density Neuropixels probes. We will use single-trial population decoding
methods to determine how dynamic stimulus information is integrated across the V4 network to extract object
shape and motion trajectory, and how V4 contributes to psychophysical behavior. We anticipate that our
results will reveal an important role for V4 in the processing of dynamic stimuli, complementary to the roles
of MT and IT cortex, and will establish the level of internal visual representation at which V4 operates. Our studies will
provide a deeper understanding of the neuronal basis of global motion perception and the tracking of dynamic
objects—processes that are impaired in aging populations, especially those with Alzheimer’s disease.