PROJECT SUMMARY
Studying how human brain activity gives rise to mental states can reveal the neural mechanisms of emotional
functioning and provide novel neural-physiological markers to enable personalized therapies for diverse mental
disorders. For brain monitoring alone, intracranial EEG (iEEG) can measure multi-region multiday brain activity
with high temporal resolution. However, the above goals hinge on the ability to monitor brain and behavior
simultaneously, which remains immensely difficult for mental states due to challenges on the physiology, behavior,
machine learning, and ethics fronts. First, current wearables cannot monitor physiology across multiple modalities
(e.g., both electrodermal activity and cortisol); in particular, no demonstrated wearable measures cortisol.
Second, behavioral monitoring during intracranial recordings is largely limited to self-reports, which are sparse.
Also, while social processes are a major trans-diagnostic domain of emotional functioning in NIMH’s RDoC
framework and are adversely affected in diverse mental disorders, they are largely absent from current brain-behavior
monitoring, which does not afford systematic, scalable measurement of mental states during social interactions.
Third, modeling concurrent neural-physiological-behavioral data poses a machine learning challenge: it involves
many modalities, nonlinearity, and intermixed behaviorally relevant and irrelevant dynamics that must be
dissociated. Finally, such monitoring raises ethical issues. We have assembled an interdisciplinary team of
engineers, psychiatrists and behavioral scientists, computer scientists, neurosurgeons, neuroscientists, and
neuroethicists to address these challenges.
We will develop novel software and hardware tools to enable multimodal neural-physiological-behavioral
sensing and machine learning for mental states within social processes and beyond. The R61 phase, in years 1-
4, will develop and validate the tools in healthy subjects (Aims 1, 2) and in epilepsy patients with already-implanted
iEEG electrodes that cover many regions related to mental states (Aim 3). In the R61, we will develop i) an integrated
wearable skin-like sensor for multimodal physiological, biomechanical, and cortisol sensing; ii) conversational
virtual humans to evoke naturalistic social processes and enable emotion recognition using multimodal audio-
visual-language modalities; and iii) a nonlinear, multimodal, brain-behavior modeling, learning, and inference
framework for mental states. We will also study the ethics of multimodal data collection, mental privacy, and self-
trust. Once the R61 tools are validated, we will combine them with intracranial brain activity in epilepsy patients
in the R33 phase in year 5 to learn multimodal biomarkers of mental states. Our approach spans multiple RDoC systems
including Negative Valence, Arousal and Regulatory Systems, and Social Processes. It enables several levels
of analysis, including Circuits, Physiology, Behavior, and Self-Report. These systems are implicated in diverse
disorders such as anxiety and depression. Thus, our multimodal, convergent, and integrated approach will likely
enable unique brain-behavior insights into human emotional functioning applicable to broad domains of mental health.