Tumors of the small bowel (SB) represent up to 6% of all tumors of the gastrointestinal tract (1). An endoscopic evaluation of this segment of the digestive tract has historically been technically difficult. The introduction of capsule endoscopy (CE) has simplified the investigation of patients with suspected SB disease, and it plays an active role in the workup of patients with clinical or imaging suspicion of SB tumors or in the monitoring of patients with inherited polyposis syndromes (2,3). Nevertheless, CE does not allow the performance of tissue sampling, resection of lesions, or the application of endoscopic therapy. Device-assisted enteroscopy (DAE) plays a pivotal role in the endoscopic investigation of SB disease, particularly after a positive CE (2,4).
Small bowel protruding lesions comprise a wide, pleomorphic range of entities. Most SB tumors are malignant, including adenocarcinoma, neuroendocrine tumors, and lymphomas (5). Moreover, small patient series indicate that almost half of patients with malignant SB tumors present with metastatic or unresectable disease, which underscores the importance of an early diagnosis (6).
Over the past years, several studies have examined the potential of artificial intelligence (AI) algorithms for the automatic interpretation of endoscopic images, and regulatory agencies have recently approved the first such system for use in clinical practice (7,8). Convolutional neural networks (CNNs) are a class of deep learning algorithms tailored for image analysis. To date, most studies have focused on the development of AI solutions for colonoscopy, esophagogastroduodenoscopy, and CE (7,9,10). By contrast, the use of AI to assist DAE examinations has been scarcely evaluated; indeed, a recent review of the published data on AI-based approaches applied to SB disease reported that data on the use of AI in enteroscopy are lacking (11). The aim of this proof-of-concept study was to develop a CNN-based algorithm for the automatic detection of protruding lesions in enteroscopy images.
METHODS
We retrospectively reviewed images from enteroscopy examinations of patients (n = 72) undergoing DAE (either single-balloon or double-balloon enteroscopy) between January 2020 and May 2021 at a single tertiary center (São João University Hospital, Porto, Portugal). A total of 7,925 images of the SB were ultimately extracted after revision and consensus by 2 gastroenterologists with expertise in DAE (P.A. and H.C.). Any image lacking consensus was discarded and was not used in the development of the CNN. The study was approved by the ethics committee of São João University Hospital (No. CE 188/2021).
All procedures were performed using 2 DAE systems: the double-balloon enteroscopy system Fujifilm EN-580T (n = 49) and the single-balloon enteroscopy system Olympus EVIS EXERA II SIF-Q180 (n = 23). Each extracted frame was analyzed for the presence of enteric protruding lesions, defined as any elevation of tissue above the gastrointestinal epithelium, including epithelial tumors and subepithelial lesions.
A CNN for automatic identification of enteric protruding lesions was developed. From the collected pool of images, 2,535 had evidence of protruding lesions and 5,390 showed normal mucosa. This pool of images was split into training (80%) and validation (20%) image data sets. The CNN was created using the Xception model with its weights pretrained on ImageNet. We used the TensorFlow 2.3 and Keras libraries to prepare the data and run the model. The analyses were performed on a computer equipped with a 2.1 GHz Intel Xeon Gold 6130 processor (Intel, Santa Clara, CA) and dual NVIDIA Quadro RTX 4000 graphics processing units (NVIDIA Corp, Santa Clara, CA).
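The exact procedure used to divide the image pool is not detailed in the text; as an illustration, a minimal sketch of a stratified per-class 80/20 split (an assumption of this sketch, with hypothetical file names) reproduces the validation counts reported in the Results section:

```python
import random

# Hypothetical frame identifiers standing in for the extracted images;
# the per-class counts (2,535 lesion / 5,390 normal) are taken from the text.
lesion_frames = [f"lesion_{i}.png" for i in range(2535)]
normal_frames = [f"normal_{i}.png" for i in range(5390)]

def stratified_split(frames, train_fraction=0.8, seed=42):
    """Shuffle one class of frames and cut it into train/validation lists."""
    rng = random.Random(seed)
    shuffled = frames[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

lesion_train, lesion_val = stratified_split(lesion_frames)
normal_train, normal_val = stratified_split(normal_frames)

# Splitting each class separately yields validation sets of 507 lesion
# frames and 1,078 normal frames, matching the figures in the Results.
print(len(lesion_val), len(normal_val))  # 507 1078
```

Splitting each class separately keeps the lesion/normal ratio identical in both data sets, which matters when one class is roughly twice the size of the other, as here.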
For each image, the CNN calculated the probability for each category. A higher probability indicated greater confidence in the CNN prediction; the category with the highest probability was output as the CNN's classification. The primary outcome measures included sensitivity, specificity, positive and negative predictive values, accuracy, and area under the receiver operating characteristic curve. Statistical analysis was performed using scikit-learn v0.22.2 (7).
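The classification rule described above is a simple argmax over the per-category probabilities; a minimal sketch (with illustrative label names not taken from the original code):

```python
def classify(probabilities):
    """Return the category with the highest predicted probability.

    `probabilities` maps each category label to the network's output;
    the label names here are illustrative placeholders.
    """
    return max(probabilities, key=probabilities.get)

# A frame the network is confident contains a protuberant lesion:
print(classify({"normal": 0.08, "protuberant": 0.92}))  # protuberant
```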
RESULTS
Construction of the CNN
A total of 7,925 frames were extracted. The validation data set (20%) included 507 images containing protuberant lesions and 1,078 depicting normal mucosa. The accuracy of the CNN showed an increasing trend in both the training and validation stages. The output provided by the CNN is shown in Figure 1.
Figure 1. Output obtained from the application of the convolutional neural network. The bars represent the probability estimated by the network, and the blue bars represent a correct prediction. N, normal mucosa/other findings; PR, protuberant lesions.
Performance of the CNN
The distribution of results is summarized in Table 1. The sensitivity and specificity of the model were 97.0% and 97.4%, respectively. The positive and negative predictive values were 94.6% and 98.6%, respectively. The overall accuracy of the CNN was 97.3%. The area under the receiver operating characteristic curve for the detection of protuberant lesions was 1.00 (Figure 2).
Table 1. Distribution of results

                                Final diagnosis
CNN prediction          Protuberant lesions   Normal mucosa
Protuberant lesions             492                 28
Normal mucosa                    15              1,050

CNN, convolutional neural network.
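The performance figures reported above follow directly from the counts in Table 1; a short check of that arithmetic:

```python
# Confusion-matrix counts from Table 1
# (rows: CNN prediction; columns: final diagnosis).
tp = 492   # CNN: lesion,  diagnosis: lesion
fp = 28    # CNN: lesion,  diagnosis: normal
fn = 15    # CNN: normal,  diagnosis: lesion
tn = 1050  # CNN: normal,  diagnosis: normal

sensitivity = tp / (tp + fn)            # 492/507
specificity = tn / (tn + fp)            # 1050/1078
ppv = tp / (tp + fp)                    # 492/520
npv = tn / (tn + fn)                    # 1050/1065
accuracy = (tp + tn) / (tp + fp + fn + tn)

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv), ("accuracy", accuracy)]:
    print(f"{name}: {value:.1%}")
# sensitivity: 97.0%, specificity: 97.4%, PPV: 94.6%, NPV: 98.6%, accuracy: 97.3%
```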
Figure 2. Receiver operating characteristic analysis of the network's performance in the detection of protuberant lesions vs normal mucosa/other findings. PR, protuberant lesions.
Computational performance of the CNN
The CNN completed the reading of the validation data set in 10 seconds, corresponding to an approximate reading rate of 157 frames per second.
DISCUSSION
The examination of the SB is challenging. DAE has revolutionized and expanded the diagnostic and therapeutic possibilities within this segment. The application of AI to DAE remains an unexplored field (11). DAE plays a central role in the investigation of patients with SB protruding lesions, whether sporadic or arising in the setting of hereditary polyposis syndromes. The implementation of AI models may allow the detection and characterization of these lesions. Our proof-of-concept algorithm demonstrated high accuracy in the detection of these SB lesions.
DAE is a rapidly evolving area of gastrointestinal endoscopy, and the application of AI in this field is currently in its embryonic stages. The results of this study are in agreement with previous studies on the potential of deep learning algorithms for the detection of these enteric lesions in CE (12).
This study has several limitations. First, it is a retrospective, single-center study. Second, the number of extracted frames was relatively small, limiting the generalizability of our results. Third, we could not obtain histopathology data; thus, we could neither predict the final histologic diagnosis nor measure the performance of the network for distinct histologic types. Fourth, images were split on a per-frame rather than a per-patient basis. Therefore, our results must be interpreted as preliminary and not conclusive regarding the performance of the network in a real-life setting. Finally, the model was built using still frames. Further studies with a larger number of patients using real-time videos are required to accurately assess the clinical value of these tools.
In conclusion, the potential for the expansion of AI to other areas of endoscopy, such as DAE, is vast. Our proof-of-concept model lays the foundations for the development of AI algorithms for the automatic detection of protruding lesions in DAE examinations. Given that these lesions are often subtle, this technology may be of great value in improving detection rates and, ultimately, patient care.
CONFLICTS OF INTEREST
Guarantor of the article: Miguel Mascarenhas Saraiva, MD, MSc.
Specific author contributions: PC and MMS: equal contribution in study design, revision of DAE videos, image extraction, construction, and development of the CNN, drafting of the manuscript; and critical revision of the manuscript. JA: construction and development of the CNN, bibliographic review, and critical revision of the manuscript. TR: bibliographic review, drafting of the manuscript, and critical revision of the manuscript. JF: construction and development of the CNN, statistical analysis, and critical revision of the manuscript. PA, HC, and GM: study design and critical revision of the manuscript. All authors approved the final version of the manuscript.
Financial support: The authors acknowledge Fundação para a Ciência e Tecnologia (FCT) for supporting the computational costs related to this study through CPCA/A0/7363/2020 grant. This entity had no role in the study design, data collection, data analysis, preparation of the manuscript, and publishing decision.
Potential competing interests: None to report.
REFERENCES
1. Rondonotti E, Koulaouzidis A, Yung DE, et al. Neoplastic diseases of the small bowel. Gastrointest Endosc Clin N Am 2017;27:93–112.
2. Pennazio M, Spada C, Eliakim R, et al. Small-bowel capsule endoscopy and device-assisted enteroscopy for diagnosis and treatment of small-bowel disorders: European Society of Gastrointestinal Endoscopy (ESGE) clinical guideline. Endoscopy 2015;47:352–76.
3. Schwartz GD, Barkin JS. Small-bowel tumors detected by wireless capsule endoscopy. Dig Dis Sci 2007;52:1026–30.
4. Rondonotti E, Spada C, Adler S, et al. Small-bowel capsule endoscopy and device-assisted enteroscopy for diagnosis and treatment of small-bowel disorders: European Society of Gastrointestinal Endoscopy (ESGE) technical review. Endoscopy 2018;50:423–46.
5. Bilimoria KY, Bentrem DJ, Wayne JD, et al. Small bowel cancer in the United States: Changes in epidemiology, treatment, and survival over the last 20 years. Ann Surg 2009;249:63–71.
6. Cardoso H, Rodrigues JT, Marques M, et al. Malignant small bowel tumors: Diagnosis, management and prognosis. Acta Med Port 2015;28:9.
7. Repici A, Badalamenti M, Maselli R, et al. Efficacy of real-time computer-aided detection of colorectal neoplasia in a randomized trial. Gastroenterology 2020;159:512–20.e7.
8. Le Berre C, Sandborn WJ, Aridhi S, et al. Application of artificial intelligence to gastroenterology and hepatology. Gastroenterology 2020;158:76–94.e2.
9. Wu L, Xu M, Jiang X, et al. Real-time artificial intelligence for detecting focal lesions and diagnosing neoplasms of the stomach by white-light endoscopy (with videos). Gastrointest Endosc 2022;95:269–80.e6.
10. Aoki T, Yamada A, Kato Y, et al. Automatic detection of various abnormalities in capsule endoscopy videos by a deep learning-based system: A multicenter study. Gastrointest Endosc 2021;93:165–73.e1.
11. Meher D, Gogoi M, Bharali P, et al. Artificial intelligence in small bowel endoscopy: Current perspectives and future directions. J Dig Endosc 2020;11:245–52.
12. Saito H, Aoki T, Aoyama K, et al. Automatic detection and classification of protruding lesions in wireless capsule endoscopy images based on a deep convolutional neural network. Gastrointest Endosc 2020;92:144–51.e1.