Project Summary
Although close monitoring and dynamic assessment of patient acuity are key aspects of ICU care, both are
limited by the time constraints imposed on healthcare providers. Currently, dynamic and precise assessment of
a patient's acuity in the ICU relies almost exclusively on physicians' clinical judgment and vigilance. Furthermore,
important visual assessment details, such as facial expressions, posture, and mobility, are captured
sporadically by overburdened nurses or are not captured at all. Yet these visual assessment details are
associated with critical indices such as physical function, pain, and subsequent clinical deterioration. The PIs'
long-term goal is to sense, quantify, and communicate a patient's clinical condition autonomously and
precisely. The overall objective of this application is to develop novel tools for sensing, quantifying,
and communicating any patient's condition in an autonomous, precise, and interpretable manner. The central
hypothesis is that deep learning models will outperform existing clinical acuity scores by predicting acuity in
a dynamic, precise, and interpretable manner, using autonomous assessment of pain, emotional distress, and
physical function together with clinical and physiologic data. This hypothesis is formulated from
preliminary data and is well grounded in the clinical care literature. The rationale is that autonomous and precise
patient quantification can enhance clinical workflow and enable early intervention. The overall objective will
be achieved by pursuing three specific aims. (1) Develop and validate an interpretable deep learning
algorithm for precise and dynamic prediction of a patient's clinical status, to determine whether it is more
accurate in predicting daily care-transition outcomes while providing interpretable information to physicians. (2)
Develop a pervasive sensing system for autonomous visual assessment of critically ill patients, to determine
whether it provides visual assessment of a patient as accurately as human experts and whether it enriches acuity
prediction when combined with clinical data. (3) Implement and evaluate an intelligent platform for real-
time integration of autonomous visual assessment and acuity prediction into the clinical workflow, to determine
its accuracy in real-time prospective evaluation and to assess physicians' risk perception and satisfaction. The
approach is innovative because it represents the first attempt to (1) dynamically and precisely predict patient
trajectory, (2) autonomously perform visual assessment in the ICU, and (3) implement an artificial intelligence
platform in real time within the clinical workflow. The proposed research is significant because it addresses several
key problems and critical barriers in critical care, including (1) the lack of precise, real-time prediction of clinical
trajectory, (2) manual, repetitive ICU assessments, and (3) uncaptured aspects of the patient's condition.
Ultimately, the results are expected to improve patient outcomes and to reduce hospitalization costs and
lifelong complications.