Precise highlighting of the pancreas by semantic segmentation during robot-assisted gastrectomy: visual assistance with artificial intelligence for surgeons

Background

A postoperative pancreatic fistula (POPF) is a critical complication of radical gastrectomy for gastric cancer, arising mainly because surgeons occasionally misidentify the pancreas and surrounding fat during lymphadenectomy. This study therefore aimed to develop an artificial intelligence (AI) system that identifies and highlights the pancreas during robot-assisted gastrectomy.

Methods

A pancreas recognition algorithm was developed using HRNet, with 926 training images and 232 validation images extracted from 62 scenes of robot-assisted gastrectomy videos. For the quantitative evaluation, precision, recall, intersection over union (IoU), and the Dice coefficient were calculated by comparing the AI-inferred segmentations with the surgeons' ground-truth annotations on 80 test images. For the qualitative evaluation, 10 surgeons answered two questions addressing sensitivity and similarity to assess clinical usefulness.
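The evaluation code is not part of this abstract; the sketch below only illustrates how such pixel-wise segmentation metrics are conventionally computed from a pair of binary masks (AI prediction versus surgeon-annotated ground truth). The function name, toy masks, and NumPy implementation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7):
    """Compute precision, recall, IoU, and Dice for binary masks.

    `pred` and `gt` are boolean arrays of the same shape, where True marks
    pixels labeled as pancreas (AI inference and surgeon annotation,
    respectively). Names and shapes are illustrative only.
    """
    pred = pred.astype(bool)
    gt = gt.astype(bool)

    tp = np.logical_and(pred, gt).sum()       # true-positive pixels
    fp = np.logical_and(pred, ~gt).sum()      # false-positive pixels
    fn = np.logical_and(~pred, gt).sum()      # false-negative pixels

    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    iou = tp / (tp + fp + fn + eps)           # intersection over union
    dice = 2 * tp / (2 * tp + fp + fn + eps)  # Dice coefficient

    return precision, recall, iou, dice

# Toy 4x4 example (illustrative values, not study data)
pred_mask = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0]], dtype=bool)
gt_mask = np.array([[1, 1, 1, 0],
                    [1, 1, 1, 0],
                    [0, 0, 0, 0],
                    [0, 0, 0, 0]], dtype=bool)
print(segmentation_metrics(pred_mask, gt_mask))
```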

Results

The precision, recall, IoU, and Dice coefficient were 0.70, 0.59, 0.46, and 0.61, respectively. Regarding sensitivity, the average score for pancreas recognition by the AI was 4.18 out of 5 points (1 = lowest recognition [less than 50%]; 5 = highest recognition [more than 90%]). Regarding similarity, the surgeons correctly distinguished only 54% of the AI-inferred images from the ground-truth images.

Conclusions

Our surgical AI system highlighted the pancreas during robot-assisted gastrectomy with a precision that surgeons found convincing. This technology may prevent surgeons from misrecognizing the pancreas, thereby reducing the incidence of POPFs.
