ABSTRACT
This project proposes to revolutionize image-guided surgery (IGS) technology by tracking surgical modifications
in real time. IGS has profoundly changed modern endoscopic surgical techniques by allowing surgeons to
visualize pathologies that are not visible by endoscopy and to direct surgical instruments through the patient's
anatomy. However, current IGS is a static mechanism: it can neither dynamically track surgery nor update
its CT or MRI reference images in real time. The reference images easily deviate from the actual anatomy
and pathology during surgery, and surgeons must rely on memory to ensure the correctness and completeness of
the resection. This problem is more pronounced when the surgery covers a large area or involves soft tissue. In endoscopic
sinus and skull base surgery (ESSBS), for example, the surgery involves the wide sinus and skull base area,
which is a honeycomb-shaped structure surrounded by dense blood vessels and nerves. Driven by the need to
ensure patient safety during the operation, surgeons are reluctant to attempt more effective but riskier resections.
As a result, ESSBS currently has an accepted revision rate of up to 28%. With real-time tracking and guidance,
surgical precision would increase dramatically, improving surgical outcomes and substantially reducing the rate
of required revision surgery.
This design approach will focus on the ability to track modifications and update images in real time through
system optimization and advanced algorithms. The project will use a novel deep causal learning (DCL)
framework to break the bottleneck caused by unpredictable adverse factors in operating
rooms. The framework will use invertible nets to learn from data and clinical knowledge and to identify, isolate,
and compensate for the adverse factors, leading to a realistic solution to this problem. A research team of
surgeons and computer scientists has been assembled to identify the limitations and challenges and to formulate
a research plan that redesigns the utility of IGS information to track surgery in real time. The team will draw on
years of previous collaboration to verify the feasibility of motion-based modification tracking. More
specifically, this project will use motion and endoscopic videos (Aims 1 and 2) to independently track surgical
modifications. Both methods are based on the DCL framework, and the tracking results will include not only
accuracy and estimated confidence but also the impacts of adverse factors. Moreover, the project
will fuse the two methods and verify clinical usability in the operating room environment (Aim 3).
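To make the invertible-net mechanism concrete, the sketch below is a minimal, hypothetical illustration in Python
(PyTorch), not the project's actual model: it implements an affine coupling layer, a standard building block of
invertible networks. The closed-form inverse is what would allow components attributed to adverse factors, once
mapped to separate latent coordinates, to be identified, isolated, and compensated for by inverting back to the
signal space. All names, dimensions, and the architecture here are assumptions for illustration only.

    # Hypothetical sketch of an invertible building block (affine coupling layer).
    # Forward: y1 = x1; y2 = x2 * exp(s(x1)) + t(x1). Invertible in closed form.
    import torch
    import torch.nn as nn

    class AffineCoupling(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.half = dim // 2
            # Small network that predicts scale s and shift t from the first half.
            self.net = nn.Sequential(nn.Linear(self.half, 64), nn.ReLU(),
                                     nn.Linear(64, 2 * (dim - self.half)))

        def forward(self, x):
            x1, x2 = x[:, :self.half], x[:, self.half:]
            s, t = self.net(x1).chunk(2, dim=1)
            return torch.cat([x1, x2 * torch.exp(s) + t], dim=1)

        def inverse(self, y):
            # Exact inverse: recover x2 by undoing the shift and scale.
            y1, y2 = y[:, :self.half], y[:, self.half:]
            s, t = self.net(y1).chunk(2, dim=1)
            return torch.cat([y1, (y2 - t) * torch.exp(-s)], dim=1)

    # Round-trip check: inverse(forward(x)) recovers x up to float error.
    layer = AffineCoupling(dim=6)
    x = torch.randn(4, 6)
    assert torch.allclose(layer.inverse(layer(x)), x, atol=1e-5)

The design property that matters is exact invertibility: because the mapping loses no information, any latent
component attributed to an adverse factor can be removed in latent space without corrupting the underlying
tracking signal.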
This project will achieve reliable real-time updates of IGS in real clinical environments and directly improve
the reliability of ESSBS surgical navigation. The project will pioneer the tracking of surgical modifications when
both deformation and modification are present, and it will extend the application of IGS to a broader array of
endoscopic surgeries. The project will also explore the use of causal learning to solve medical modeling problems,
paving the way for using current and prior medical knowledge to formulate critical new learning of IGS procedures.