PROJECT SUMMARY
Coronary artery disease (CAD) is the leading cause of death and disability in the US and globally. The epidemic
of obesity, diabetes, and cardiometabolic disease is changing the nature of CAD, with diffuse and microvascular
disease emerging as key drivers of adverse outcomes. Radionuclide myocardial perfusion imaging is the most
widely used modality for CAD assessment and is still performed primarily with SPECT. However, SPECT evaluates
only relative perfusion and is therefore inherently insensitive to diffuse or microvascular disease. PET, with
its unique ability to accurately quantify absolute myocardial blood flow, allows robust detection of obstructive
CAD, diffuse atherosclerosis, balanced ischemia, and coronary microvascular dysfunction. Cardiac PET is also
always acquired with an accompanying chest CT for attenuation correction. However, this modality requires a
high level of on-site technical expertise to fully realize its broad capabilities.
We have applied highly efficient, image-based artificial intelligence (AI) approaches extensively to SPECT and
CT, demonstrating improved diagnostic accuracy and risk stratification. These tools can be harnessed to
enhance the utility of cardiac PET/CT. We propose to efficiently translate the latest AI advances and our recent
SPECT developments to fully automate cardiac PET/CT analysis, including novel tools for quality control, high-
performance image segmentation, new quantitative variables, and direct outcome prediction from images, using
PET/CT data from multiple centers.
The overall aim is to develop practical AI algorithms for comprehensive cardiac PET/CT analysis
and to validate them in a multi-center setting. For this work, we propose the following 3 specific aims: (1) To
develop and test automated end-to-end PET quantification, (2) To develop and test automated end-to-end chest
CT quantification, (3) To develop and validate explainable AI models for enhanced patient assessment from
images and clinical data, employing the latest advances in survival analysis, supervised and unsupervised learning,
and knowledge transfer.
This research will result in personalized tools that improve the accuracy of patient assessment by PET/CT
beyond what is possible with the current practice of subjective interpretation and mental integration of diverse
data. Explainable methods that combine imaging and clinical data will make AI conclusions more tangible,
enabling clinical adoption of this technology. The new tools can dramatically simplify PET/CT protocols, reduce
subjectivity, reduce the burden on physicians, and maximize the information derived from multimodal scans.
They will fit directly into existing workflows, facilitating deployment in diverse clinical settings. The new AI
methods for image analysis and explainable integration of multimodality data will generalize to other diseases
and problems in biomedical imaging.