Learning to Estimate Palpation Forces in Robotic Surgery from Visual-Inertial Data

Young Eun Lee, Haliza Mat Husin, Maria Paola Forte, Seong Whan Lee, Katherine J. Kuchenbecker

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Surgeons cannot directly touch the patient's tissue in robot-assisted minimally invasive procedures. Instead, they must palpate using instruments inserted into the body through trocars. This way of operating largely prevents surgeons from using haptic cues to localize visually undetectable structures such as tumors and blood vessels, motivating research on direct and indirect force sensing. We propose an indirect force-sensing method that combines monocular images of the operating field with measurements from IMUs attached externally to the instrument shafts. Our method is thus suitable for various robotic surgery systems as well as laparoscopic surgery. We collected a new dataset using a da Vinci Si robot, a force sensor, and four different phantom tissue samples. The dataset includes 230 one-minute-long recordings of repeated bimanual palpation tasks performed by four lay operators. We evaluated several network architectures and investigated the role of the network inputs. Using the DenseNet vision model and including inertial data best predicted palpation forces (lowest average root-mean-square error and highest average coefficient of determination). Ablation studies revealed that video frames carry significantly more information than inertial signals. Finally, we demonstrated the model's ability to generalize to unseen tissue and predict shear contact forces.
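The abstract reports model quality as average root-mean-square error (RMSE) and the coefficient of determination (R²). As a minimal illustration of how those two metrics are computed, here is a self-contained Python sketch; the force traces below are hypothetical values, not data from the paper's dataset.

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between measured and predicted forces."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical palpation-force samples in newtons (illustrative only).
true_forces = [0.0, 1.2, 2.5, 3.1, 2.0]
pred_forces = [0.1, 1.0, 2.6, 3.0, 2.2]

print("RMSE:", rmse(true_forces, pred_forces))
print("R^2:", r_squared(true_forces, pred_forces))
```

Lower RMSE and higher R² (closer to 1) both indicate predictions that track the measured contact forces more closely, which is the sense in which the paper's DenseNet-plus-inertial configuration performed best.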

Original language: English
Pages (from-to): 496-506
Number of pages: 11
Journal: IEEE Transactions on Medical Robotics and Bionics
Volume: 5
Issue number: 3
DOIs
Publication status: Published - 2023 Aug 1

Bibliographical note

Funding Information:
This work was supported by the Max Planck Society and three Institute of Information and Communications Technology Planning and Evaluation (IITP) Grants funded by the Korean Government through the Global Internship Program for Developing Human Resources in Artificial Intelligence under Grant 2019-0-01605, through the Artificial Intelligence Graduate School Program, Korea University under Grant 2019-0-00079, and through the Artificial Intelligence Innovation Hub under Grant 2021-0-02068.

Publisher Copyright:
© 2018 IEEE.

Keywords

  • deep learning
  • force estimation
  • indirect force sensing
  • robot-assisted minimally invasive surgery
  • visual-inertial input

ASJC Scopus subject areas

  • Biomedical Engineering
  • Human-Computer Interaction
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence
