Project Summary
Disorders of attention, social, and communicative functioning have become a significant public health
concern, yet we lack a systematic database characterizing the typical development of the basic building blocks that
support optimal outcomes. We developed the first two individual difference measures assessing these building
blocks for infants and children. We now propose to implement the measures in 13 research labs to collectively
build a well-planned, large-scale, shared database, mine the database to advance knowledge and theory in
developmental science, and, at the same time, forge a new model for collaborative research.
Faces, voices, and infant-directed speech are highly salient to typically developing infants and scaffold
language, social, and cognitive development. In contrast, children with autism show social-orienting
impairments and deficits in social-communicative functioning. These capabilities depend on early multisensory
attention skills (integrating and attending to information across the senses), particularly in dynamically changing
faces, voices, and audiovisual speech. Without measures to assess individual differences in these fundamental
skills, the pathways by which they affect later language, social, and cognitive development remain poorly
understood. Our new protocols assess individual differences in attention maintenance, disengagement, and
speed and accuracy of intersensory processing for audiovisual social and nonsocial events in preverbal
children. Preliminary findings from my current R01 reveal exciting relations between these multisensory skills
and language and social outcomes in 3- to 36-month-olds. The present proposal builds on this foundation. The
Multisensory Data Network brings together 13 experts in developmental science. We will (1) integrate our new
protocols into each of their research labs; (2) implement an overall data collection plan, with each lab testing
participants using both protocols along with standardized language and social outcome measures, creating a
shared database of more than 1,600 3- to 60-month-old children; (3) upload data to Databrary, an online
data-sharing library; (4) guide and standardize data collection across sites and create a multisite aggregate
dataset for easy sharing across investigators; (5) capitalizing on the advantages of large datasets, derive the
first preliminary norms for multisensory attention skills across 3 to 60 months and, using cutting-edge
SEM-based analyses, develop models characterizing developmental cascades from multisensory attention
skills to the more complex language and social capabilities that rely on this foundation; and (6) develop
portable protocols, expanding potential applications to classroom and home settings. This
project will provide the first dataset of multisensory attention skills and their effects on later outcomes across
the first 5 years of life, with implications for identifying atypical trajectories and infants at risk for delays and
for guiding interventions. This collaborative model will catalyze new research directions; it is more
time-efficient and cost-effective, and will generate a larger, more diverse dataset than would be possible in
individual research labs.