PROJECT SUMMARY
Brain-Computer Interfaces (BCIs) have achieved remarkable progress over the last decade, including the direct
control of sophisticated anthropomorphic robotic arms and the incorporation of tactile feedback. However, the
dexterity of current brain-controlled prosthetic limbs is limited in two important ways. First, most neuroprosthetic
control involves decoding kinematics from the responses of neurons in primary motor cortex (M1). While this
approach has been successful for controlling the proximal arm (shoulder and elbow) to place and orient the
hand, it is fundamentally inadequate for hand control and interactions with objects, which require not only
orienting the wrist and shaping the digits but also applying appropriate forces. This problem is complicated by
the fact that force and kinematic signals as well as hand and arm signals are all intermingled in the neural
population activity in M1. Furthermore, hand and arm representations of force and kinematics seem to depend
on the task, as evidenced by the fact that decoders developed for one task fail to generalize to another. Second,
tactile feedback is critical to manual behavior as evidenced by the severe deficits that result from deafferentation.
Achieving dexterous control of a prosthetic arm thus also requires restoration of tactile feedback. One promising
approach is intracortical microstimulation (ICMS) of somatosensory cortex (S1), which evokes vivid tactile
percepts experienced on the (otherwise insensate) hand. There is a growing consensus that mimicking
naturalistic patterns of neuronal activation will lead to more natural tactile percepts and more dexterous hand
use. However, the neural basis of touch has been studied almost exclusively with stimuli passively presented to
the unmoving hand, which precludes any understanding of how motor behavior shapes S1 responses and
hinders the development of biomimetic encoding algorithms.
To fill these gaps, we will have non-human primates (NHPs) perform prehensile behaviors in which we systematically vary hand and
arm kinematics and forces, and measure the time-varying postures of the entire limb and the forces exerted on
objects, including contact forces at each digit. We seek to characterize (1) signals in M1 relating to kinematics
and forces exerted by the arm and hand; (2) signals in S1 relating to active interactions with objects; and (3)
signals transferred between M1 and S1. We propose to apply well-established encoding and decoding
techniques, as well as a novel dynamical systems analysis, to investigate the relationship between neural
responses and movement parameters. The resulting insights into the neural mechanisms of prehension will lead to
(1) the development of decoders of intended limb state from M1 responses that capture both kinematic and
force control and generalize across behavioral tasks; and (2) biomimetic sensory encoding algorithms informed
by an understanding of active touch representations in S1. The research team is uniquely poised to test the resulting
decoders and sensory encoding algorithms in human BCI participants at both sites, as part of an ongoing
NIH-funded clinical trial.