The Role of Artificial Intelligence in Colonoscopy

Recent advancements in the field of artificial intelligence (AI) have transformed the landscape of modern medicine 1, 2. While the concept of AI may initially seem complex, AI can be understood as computer technologies designed to replicate human cognitive functions, including reasoning and adaptation, thereby mimicking human intelligence. Machine learning, a subset of AI, allows computer systems to learn from data without explicit human guidance or supervision, and many modern approaches accomplish this with neural networks (Figure 1) 3, 4, 5. Similar to the complex neuronal synapses in the human brain, networks of artificial neurons in computer systems analyze complex datasets to discern and predict meaningful patterns and regularities within the data. For instance, convolutional neural networks (CNNs), a form of deep learning, use stacked layers of artificial neurons to analyze and extract important features from image- or video-based data 3. Given the incredible data processing speed of modern computers, AI systems can be a valuable tool for clinicians in processing and interpreting the large and complex data inherent in modern medicine.
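As a concrete illustration of how a CNN processes an endoscopic image, the minimal sketch below (using the open-source PyTorch library) stacks convolutional layers that extract image features and a final classification layer that outputs two class scores. It is a toy model under assumed settings (224×224 RGB input, two output classes), not the architecture of any system discussed in this review.

```python
# A minimal sketch of a convolutional neural network (CNN) for classifying a
# single endoscopic image; illustrative only, not a published CADe/CADx model.
import torch
import torch.nn as nn

class TinyPolypCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutional layers learn local image features (edges, textures, pit patterns).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A fully connected layer maps the pooled feature maps to two class scores
        # (e.g., neoplastic vs. non-neoplastic).
        self.classifier = nn.Linear(32 * 56 * 56, 2)

    def forward(self, x):
        x = self.features(x)       # extract spatial feature maps
        x = torch.flatten(x, 1)    # flatten maps into a feature vector
        return self.classifier(x)  # unnormalized class scores

# One 224x224 RGB frame as a tensor of shape (batch, channels, height, width).
frame = torch.randn(1, 3, 224, 224)
probabilities = torch.softmax(TinyPolypCNN()(frame), dim=1)
```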

While the application of AI in gastroenterology is broad, one of the most mature applications is AI assistance in colonoscopy. AI-assisted computer-aided detection (CADe) and diagnosis (CADx) systems, especially those using deep-learning techniques, are promising options to improve lesion detection and characterization and to decrease human variation, because they can process high-dimensional endoscopic data and learn parameters not appreciable by humans. AI-assisted colonoscopy has been shown to increase the adenoma detection rate (ADR), ensure adequate withdrawal time, and improve endoscopists’ optical biopsy, while reducing the time to diagnosis. Other emerging applications include prediction of tumor depth of invasion, quality assurance, and several applications in inflammatory bowel disease (IBD). In this review, we summarize the recent literature on the application of AI in colonoscopy and review the clinical implementation, current limitations of existing AI technologies, and future directions for this field.

Colorectal cancer (CRC) is the third most common cancer and the second most common cause of cancer death worldwide 6. Colonoscopy reduces the risk of CRC through detection and resection of pre-cancerous lesions such as adenomas 7. The ability to detect adenomas during colonoscopy is highly operator dependent, with studies reporting a wide ADR range of 7% to 53% among different endoscopists 8. Failure to detect and remove neoplastic lesions is associated with the development of interval CRC, which accounts for nearly 10% of all diagnosed CRC 9. As such, ADR has become a critical quality benchmark for effective screening colonoscopy. Yet, despite advancements in high-definition endoscopes, there remain large variations in individual endoscopists' ADR, which may partially explain the high incidence of interval CRC. Hence, AI integration into colonoscopy has been proposed as a solution to improve ADR and reduce the incidence of interval CRC.

Computer-aided detection (CADe) is an AI technology, built on machine learning and CNNs, that enables computer-assisted detection of polyps during endoscopy. CADe serves as a “second pair of eyes” and alerts the endoscopist to potential polyps, typically by highlighting the lesion with a bounding box (Figure 2). By doing so, CADe can help improve adenoma or polyp detection rates and reduce the risk of missed lesions. Since the first demonstration of CADe in the early 2000s, several randomized controlled trials (RCTs) have validated the benefit of AI integration with improved ADR (Table 1) 10, 11, 12, 13, 14, 15, 16, 17, 18, 19. In a prospective RCT evaluating the effect of CADe on ADR, the CADe-DB trial, Wang et al. demonstrated a significantly higher ADR in the CADe group compared to the sham group (29.1% vs 20.3%, p < 0.001) 11. The benefit of CADe in improving ADR is replicated even in centers with expert endoscopists and a high baseline ADR. In a multi-center RCT conducted in Italy, Repici et al. reported the efficacy of real-time CADe in detecting adenomas in expert centers with a high baseline ADR. Despite the high baseline ADR of 40.4% in the control group, ADR was still significantly higher with CADe than with standard white light colonoscopy (54.8% vs 40.4%, p < 0.001) 13. The improvement in ADR has been validated in meta-analyses as well 20. In a meta-analysis of 5 RCTs by Hassan et al., CADe showed a significantly higher ADR compared to standard white light colonoscopy (relative risk 1.44, 95% CI 1.27-1.62, p < 0.01) 20.
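To illustrate the “second pair of eyes” behavior described above, the sketch below shows how per-frame detections from a hypothetical detector might be overlaid on live video as bounding boxes once they exceed a confidence threshold, using OpenCV for drawing. The detection output format and threshold are assumptions made for illustration; commercial CADe systems implement this differently.

```python
# Illustrative overlay of CADe alerts on a video frame; the detection list format
# (box coordinates plus confidence) is a hypothetical example, not a vendor API.
import cv2  # OpenCV, used here only for drawing


def overlay_cade_alerts(frame, detections, threshold=0.5):
    """Draw a bounding box for each candidate polyp whose confidence exceeds the threshold."""
    for (x, y, w, h), confidence in detections:
        if confidence >= threshold:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, f"polyp {confidence:.2f}", (x, max(y - 5, 10)),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    return frame
```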

Aside from AI integration, a number of other innovations (e.g., magnified endoscopy, dye-spray endoscopy, virtual chromoendoscopy, and mucosal exposure devices) have been developed with the aim of improving ADR. While there are no head-to-head trials comparing the effectiveness of CADe against these tools, an indirect comparison has been performed by Spadaccini et al. using a network meta-analysis. In their analysis, CADe had superior performance in ADR compared to chromoendoscopy (OR 1.45, 95% CI 1.14-1.85) and mucosal exposure systems (OR 1.54, 95% CI 1.22-1.94) 21. More importantly, CADe could be further refined to work synergistically with other modalities for even better performance 22. A prospective RCT assessed the effect of combining CADe and Endocuff (a mucosal exposure device) on the polyp detection rate in screening colonoscopy. The combination of CADe and Endocuff significantly improved ADR and the advanced adenoma detection rate (AADR) over standard colonoscopy. While use of CADe or Endocuff alone increased the adenoma detection rate, neither led to increased detection of advanced adenomas unless used in combination 22. These findings suggest that CADe can work in conjunction with mucosal exposure devices, with the potential to improve the effectiveness of screening colonoscopy.

CADx refers to computer-assisted optical diagnosis of colorectal polyps as either neoplastic or non-neoplastic lesions using advanced image analysis algorithms (Figure 3). Although CRC develops from neoplastic adenomatous polyps, many diminutive (≤5 mm) colorectal polyps are not neoplastic and do not need to be removed. However, reliably distinguishing non-neoplastic from neoplastic polyps by endoscopic evaluation without histological examination - ‘optical biopsy’ - is difficult for many endoscopists, and hence the current practice is to remove all detected polyps for histological analysis.

The ability to accurately distinguish neoplastic from non-neoplastic lesions without histological analysis is important, as it could allow institutions to adopt cost-effective strategies such as ‘resect-and-discard’ and ‘diagnose-and-leave’ during colonoscopy 23, 24, 25, 26, 27. These approaches require a system able to optically classify diminutive polyps as either neoplastic or non-neoplastic with high confidence. The “Preservation and Incorporation of Valuable endoscopic Innovations” (PIVI) guidelines published by the American Society for Gastrointestinal Endoscopy (ASGE) recommend a negative predictive value (NPV) for adenomatous histology of ≥90% to ‘diagnose-and-leave’ suspected rectosigmoid hyperplastic polyps, and ≥90% agreement with histopathology-based surveillance intervals to ‘resect-and-discard’ diminutive colonic adenomas 28. A number of optical biopsy tools and systems (e.g., blue laser imaging, the Kudo pit pattern classification) have been developed to satisfy the PIVI criteria; however, their clinical application has been limited by (1) the need for additional tools and equipment and (2) the additional operator training and experience required. Hence, CADx has become a promising potential solution, as it can provide optical diagnosis with performance that does not depend on endoscopist experience. Table 2 summarizes the studies on CADx for characterization of colorectal polyps.

CADx using narrow band imaging (NBI), a type of virtual chromoendoscopy, is the most extensively studied AI polyp characterization approach to date. Initially, CADx systems were designed for magnified NBI endoscopy. One of the earlier pilot studies in CADx, by Tischendorf et al., demonstrated that CADx could discriminate neoplastic from non-neoplastic colorectal lesions on magnified NBI images with a sensitivity of 90% and a specificity of 70.2% 29. This performance was replicated in other studies; however, CADx in these early studies remained inferior to expert endoscopists and relied on magnified NBI images, which limited widespread clinical adoption 29, 30, 31, 32, 33. Since then, considerable progress has been made in CADx, with higher diagnostic accuracy and faster processing times. In a retrospective study using unaltered videos of non-magnified NBI colonoscopy, Byrne et al. demonstrated that a deep-learning based CADx could accurately differentiate diminutive adenomas from hyperplastic polyps in real time. The accuracy was 94% (95% CI 86%-97%), sensitivity 98% (95% CI 92%-100%), and specificity 83% (95% CI 67%-93%) for identifying adenomas, with a processing time of only 50 ms 34. Importantly, their findings demonstrated that a CADx system could reach the ASGE PIVI optical biopsy thresholds in real time with standard, non-magnifying colonoscopes. Kominami et al. evaluated a CADx system in real time using magnified NBI endoscopy in a study of 41 patients and 118 colorectal lesions. The CADx system achieved an accuracy of 94.9% and an NPV of 93.3% in differentiating adenomas from non-adenomas in real time, and recommendations for follow-up colonoscopy based on pathology and on the CADx system were identical in 92.7% of patients 32. In a recent retrospective study, Zachariah et al. designed a CADx system incorporating both white-light imaging (WLI) and NBI with diagnostic capability surpassing the ASGE PIVI thresholds, with NPV and accuracy rates of 92.6% and 93.6%, respectively 35. Their findings demonstrated that CADx could reliably classify diminutive polyps irrespective of endoscopist experience and NBI usage, potentially benefiting community endoscopists.

Jin et al. designed a CNN-based CADx using NBI images of diminutive polyps and investigated the effect of AI assistance on 22 endoscopists of varying expertise in predicting polyp histology. The CADx system demonstrated an accuracy of 86.7%, sensitivity of 83.3%, and specificity of 91.7% - significantly higher than novice endoscopists. Importantly, AI assistance increased the endoscopists' overall accuracy from 82.5% to 88.5% 36. Similar findings were noted in a study by Song et al., in which CADx used as a support tool significantly improved the diagnostic performance of endoscopy trainees 37.

CADx has been validated using different digital image enhancement modalities. Pu et al. evaluated their CADx system's ability to characterize colorectal lesions as hyperplastic polyp, low-grade adenoma, sessile serrated adenoma, high-risk adenoma, or invasive cancer using NBI and blue-light imaging (BLI) endoscopic images. External validation on two polyp image datasets yielded a mean area under the curve (AUC) of 84.5% for the NBI set and 90.3% for the BLI set 38.

Endocytoscopy (EC) is a novel diagnostic addition to endoscopy, providing up to 1400-fold magnification and allowing real-time visualization of microscopic cellular structures. The ability to provide in-vivo histological detail of colonic lesions makes EC an ideal imaging modality on which to build a CADx system. In 2015, Mori et al. published the first results of a CADx system developed for EC. On a test set of 176 small colorectal polyps, their EC-CAD system had an accuracy, sensitivity, and specificity of 89.2%, 92.0%, and 79.5%, respectively. The diagnostic performance of the CADx system was significantly higher than that achieved by trainee endoscopists and comparable to that of expert endoscopists 39. In another retrospective study in Japan, Misawa et al. developed a CADx system using NBI-EC. On a test set of 100 NBI-EC images, the overall accuracy for adenomatous lesions was 90.0%, with a sensitivity of 84.5% and a specificity of 97.6%. When the analysis was limited to “high-confidence” diagnoses, defined as a CADx output probability >90%, accuracy, sensitivity, and specificity improved to 96.9%, 97.6%, and 95.8%, respectively 40. In a prospective study, Mori et al. evaluated the diagnostic accuracy of real-time CADx with EC in 466 diminutive polyps. The CADx system showed an NPV of 93.7% in the assessment of diminutive rectosigmoid polyps, meeting the ASGE PIVI performance threshold for implementing a 'diagnose-and-leave’ strategy.

CADx for standard WLI endoscopy has been explored as a tool to assist endoscopists. The diagnostic performance of WLI-based CADx in earlier studies was lackluster. In an early study by Komeda et al., the diagnostic accuracy of CADx was 75.1% 41, and the CADx diagnosis of a polyp as either adenoma or non-adenoma was correct in only 7 of 10 cases. Since then, a number of advances in the diagnostic performance of WLI-based CADx have been made. In a retrospective study by Sanchez-Montes et al., a high-definition WLI-based CADx had an accuracy of 92.3% in differentiating neoplastic from non-neoplastic polyps, with diagnostic performance comparable to an expert endoscopist 42. In a recent prospective study, Hassan et al. evaluated a commercially available CADx system using unmagnified WLI on 544 polyps from 162 patients. The NPV for adenomatous histology in ≤5-mm rectosigmoid lesions was 97.6%, which was comparable to optical evaluation by expert endoscopists using virtual chromoendoscopy (NPV 97.6%). With respect to ‘resect-and-discard’ performance, the surveillance interval recommended by CADx was concordant with histology-based recommendations in 95.9% of patients 43. The study confirmed accurate performance in predicting the characteristics of diminutive polyps, with an NPV exceeding the PIVI threshold.

The findings from CADx studies are clinically relevant because they make several cost-saving strategies feasible. The optical biopsy performance of CADx is sufficiently high to meet the PIVI criteria required to implement ‘resect-and-discard’ and ‘diagnose-and-leave’ approaches. Based on the findings published by Hassan et al., a CADx system used to support a ‘resect-and-discard’ strategy could reduce the proportion of resected polyps requiring pathology assessment, and the associated costs, by more than 80% 43.

While the two concepts are often discussed and studied independently, the ideal CAD system would be a full-workflow system combining CADe and CADx. A CAD system that can detect polyps and simultaneously characterize them would be a robust and practical tool for endoscopists. Mori et al. first demonstrated a novel AI system that enabled both polyp detection and characterization in real time using an endoscope with EC capability 44. Guizard et al. developed a full-workflow CAD system able to detect and “track” polyps throughout the procedure; although detection performance was not reported, the system could also perform optical biopsy using NBI 45. Ozawa et al. provided the first quantitative analysis of a combined CAD workflow. They developed a CNN-based CAD for WLI and NBI that detects and characterizes a polyp simultaneously from a colonoscopy image, with a processing time of 20 ms per frame. The AI system detected colorectal polyps in an image with a sensitivity of 90% for WLI and 97% for NBI. Among correctly detected polyps, the AI accurately classified 83% of polyps overall, and 97% of adenomas were precisely identified under WLI 46.
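A combined CADe/CADx workflow of the kind described above can be summarized, at a very high level, as a per-frame loop that first detects candidate polyps and then characterizes each detected region. In the sketch below, detect_polyps and classify_polyp are hypothetical placeholder models, not the systems cited in this section.

```python
# High-level sketch of a full-workflow CAD loop: detection (CADe) followed by
# characterization (CADx) of each detected region; both models are placeholders.
def process_frame(frame, detect_polyps, classify_polyp):
    results = []
    for (x, y, w, h), detection_confidence in detect_polyps(frame):  # CADe step
        crop = frame[y:y + h, x:x + w]                                # region of interest
        histology, dx_confidence = classify_polyp(crop)               # CADx step
        results.append({
            "box": (x, y, w, h),
            "detection_confidence": detection_confidence,
            "predicted_histology": histology,                         # e.g., adenoma vs. hyperplastic
            "diagnosis_confidence": dx_confidence,
        })
    return results
```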

A more novel application of AI in colonoscopy is the prediction of the depth of invasion of colorectal lesions. The prognosis and treatment of CRC vary based on the depth of tumor invasion. Endoscopic resection (ER) is a widely accepted treatment modality for early CRC limited to the mucosa (Tis) or with superficial submucosal invasion (T1a, <1000 µm). Lesions invading deeper into the submucosa (T1b, ≥1000 µm) or into the muscularis propria (T2) should undergo surgical resection and lymph node dissection because of the significant risk of lymph node metastasis 47, 48. Therefore, reliable prediction of the depth of invasion can be an invaluable tool for endoscopists and surgeons in developing an optimal treatment strategy.

A number of systems have been developed to endoscopically predict the depth of lesion invasion. Classification systems using NBI, such as the Japan NBI Expert Team (JNET) classification and the NBI International Colorectal Endoscopic (NICE) classification, use a lesion's color, vascular pattern, and surface pattern to predict the depth of invasion. However, a drawback of NBI-based assessment is that its performance is heavily operator dependent 41. Other methods of endoscopic evaluation, including magnifying chromoendoscopy (MCE) and endoscopic ultrasound (EUS), have shown accurate performance but suffer from similar drawbacks 49, 50, 51. Using AI to predict the depth of submucosal invasion from an endoscopic image is therefore an attractive option, as it can be performed by non-expert endoscopists during the index endoscopy.

Initial studies evaluating AI-assisted prediction of invasion depth used non-magnified white light endoscopic images. In 2020, Nakajima et al. developed an AI model using non-magnified white light endoscopic images of early CRC lesions (Tis to T1b). On external validation using 44 CRC lesions (23 Tis, 21 T1b), their AI system had a sensitivity, specificity, NPV, positive predictive value (PPV), and accuracy of 81%, 87%, 83%, 85%, and 84%, respectively, in identifying T1b lesions 52. Although promising, this performance was still modest, and the findings were limited by small training and validation samples. Subsequent studies used larger training and validation datasets. Luo et al. developed an AI model using 7734 white light colonoscopy images of 657 CRC lesions ranging from Tis to T2/3 and compared the AI's performance to that of experienced endoscopists. The AI model predicted endoscopically curable and incurable lesions with an accuracy of 91.1% (95% CI, 89.6%-92.4%), sensitivity of 91.2% (95% CI, 88.8%-93.3%), specificity of 91.0% (95% CI, 89.0%-92.7%), and AUC of 0.970 (95% CI, 0.962-0.978). There was no performance difference between the AI and experienced endoscopists using white light imaging or magnifying chromoendoscopy, and the same AI outperformed expert EUS staging in accuracy, sensitivity, and specificity 53. Several studies from Korea, Japan, and China demonstrated similarly accurate AI performance 54, 55, 56.

A recently published meta-analysis summarizes the current state of CAD in predicting the invasion depth of early CRC. Ten studies from Korea, Japan, and China were reviewed and analyzed. CAD algorithms in the Korea/Japan-based studies had a pooled AUC, sensitivity, and specificity of 0.89 (95% CI 0.86-0.91), 62% (95% CI 50-72%), and 96% (95% CI 93-98%), respectively, while CAD algorithms in the China-based studies had a pooled AUC, sensitivity, and specificity of 0.94 (95% CI 0.92-0.96), 88% (95% CI 78-94%), and 88% (95% CI 80-93%) 57. A positive CAD result would increase the post-test probability of a lesion being endoscopically unresectable to 83% and 68%, respectively, based on the Korea/Japan and China study data.
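The post-test probabilities quoted above follow from the likelihood-ratio form of Bayes' theorem. The sketch below reproduces the arithmetic, assuming an illustrative pre-test probability of 24% that a lesion is endoscopically unresectable; the meta-analysis's own pre-test assumption may differ slightly, which is why the second figure works out to roughly 70% rather than the quoted 68%.

```python
# Likelihood-ratio form of Bayes' theorem applied to the pooled estimates above.
# The pre-test probability of 0.24 is an assumed, illustrative value.
def post_test_probability(pre_test_prob, sensitivity, specificity):
    lr_positive = sensitivity / (1 - specificity)        # positive likelihood ratio
    pre_test_odds = pre_test_prob / (1 - pre_test_prob)  # convert probability to odds
    post_test_odds = pre_test_odds * lr_positive         # Bayes update on the odds scale
    return post_test_odds / (1 + post_test_odds)         # convert back to probability

# Korea/Japan pooled estimates: sensitivity 62%, specificity 96% (LR+ = 15.5)
print(f"{post_test_probability(0.24, 0.62, 0.96):.2f}")  # 0.83
# China pooled estimates: sensitivity 88%, specificity 88% (LR+ ~ 7.3)
print(f"{post_test_probability(0.24, 0.88, 0.88):.2f}")  # 0.70
```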

Since its first description a decade ago, third space endoscopy has revolutionized the field of therapeutic endoscopy. Third space endoscopy, also known as submucosal endoscopy, involves manipulation of the submucosal space to provide minimally invasive interventions, and with recent advancements in devices and techniques its role continues to expand. Endoscopic submucosal dissection (ESD) for early invasive gastrointestinal neoplasia and peroral endoscopic myotomy (POEM) for achalasia are well recognized examples of third space endoscopy 16. While ESD and POEM can be effective minimally invasive treatments, these procedures are complex and carry a risk of procedural bleeding and perforation. Endoscopic resection within the submucosal space carries a risk of inadvertently transecting submucosal vessels or cutting into the muscularis propria, as identifying the correct anatomical structures can be difficult, especially for less experienced operators 58. Computer-assisted identification and delineation of vessels and tissue planes could therefore mitigate intraprocedural risk during third space endoscopy. In a study by Ebigbo et al., an AI clinical decision support system was trained on endoscopic still images from ESD and POEM procedures to delineate important anatomical structures such as vessels, the muscularis, and the submucosal layer. The mean cross-validated Intersection over Union and Dice score were 63% and 76%, respectively, and in video clips of third space endoscopic procedures the AI system showed a mean vessel detection rate of 85% 59. These results suggest a potential role for AI in optimizing complex endoscopic interventions, not just in diagnosis.
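For readers unfamiliar with the two segmentation metrics reported by Ebigbo et al., the sketch below gives their standard definitions for binary masks (predicted versus annotated pixels) using NumPy; this is a generic formulation, not code from that study.

```python
# Standard Intersection over Union (IoU) and Dice score for binary segmentation masks.
import numpy as np

def iou_and_dice(predicted_mask, ground_truth_mask):
    predicted = predicted_mask.astype(bool)
    truth = ground_truth_mask.astype(bool)
    intersection = np.logical_and(predicted, truth).sum()
    union = np.logical_or(predicted, truth).sum()
    if union == 0:  # both masks empty: treat as perfect agreement
        return 1.0, 1.0
    iou = intersection / union                                  # overlap / combined area
    dice = 2 * intersection / (predicted.sum() + truth.sum())   # 2 * overlap / total area
    return float(iou), float(dice)
```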

While much of the focus on AI in endoscopy has been on the endoscopic evaluation of colorectal polyps and cancers, the application of AI is also being actively explored in the IBD workflow. IBD is a complex disease in which multiple factors shape the disease phenotype of individual patients. This complexity makes IBD an ideal target for AI and machine learning, which can analyze complex patterns in large datasets 60.

A recent advancement in the use of AI in IBD is computer-aided assessment of endoscopic disease activity (Table 3). Mucosal healing is a widely accepted treatment target in IBD, and endoscopic remission predicts favorable long-term outcomes and is a better predictor of disease control than symptom-based assessment 61. There are a number of validated endoscopic scoring systems, including the Mayo Endoscopic Subscore (MES), the Ulcerative Colitis Endoscopic Index of Severity (UCEIS), and the Paddington International virtual ChromoendoScopy ScOre (PICaSSO). However, these endoscopic scoring tools have major limitations, including subjectivity and interobserver variability 62, 63, 64. Hence, automatic and more objective evaluation performed by AI during endoscopy is an attractive tool for IBD monitoring.

In a prospective pilot study evaluating the use of a computer system to predict endoscopic and histologic remission in ulcerative colitis (UC), an AI system identified patients with endoscopic remission with 90.1% accuracy (95% CI 89.2%-90.9%) and histologic remission with 92.9% accuracy (95% CI 92.1%-93.7%) 65. Similarly, Yao et al. showed that a CAD system could distinguish between endoscopic remission (MES 0-1) and active disease (MES 2-3) in 83.7% of endoscopic videos of UC patients 66. Another AI algorithm, based on the red pixel density of the mucosal image, provides an objective computer-based score that accurately assesses disease activity in UC, with good correlation between the red density score and endoscopic and histological scores 67. Byrne et al. presented the first fully automated AI model for UC disease activity scoring for both MES and UCEIS using full endoscopic videos. The model evaluated disease activity at the section and video level with performance comparable to expert IBD endoscopists, and it streamlined video review and scoring via a user-friendly interface that highlights AI-selected key video segments and the predicted activity score (Figure 4) 68. Recently, Iacucci et al. published the first study on the use of AI for UC activity evaluation using virtual chromoendoscopy (VCE). Their AI system detected endoscopic remission (PICaSSO ≤3) in VCE videos with 79% sensitivity, 95% specificity, and an AUC of 0.94. The study was novel not only because the AI system accurately predicted endoscopic activity from VCE-based videos, but also because it assessed complete endoscopy footage rather than selected frames 69.
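As a simplified illustration of a redness-based activity score like the red density approach mentioned above, the sketch below computes the fraction of pixels in an RGB mucosal image whose red channel clearly dominates the green and blue channels. The dominance margin is an arbitrary illustrative parameter, and the published algorithm is considerably more sophisticated.

```python
# Toy redness score: proportion of pixels whose red channel dominates green and blue.
# Illustrative only; not the published red density algorithm.
import numpy as np

def red_pixel_density(rgb_image, margin=30):
    """rgb_image: HxWx3 uint8 array; returns a score between 0 and 1."""
    r = rgb_image[..., 0].astype(int)
    g = rgb_image[..., 1].astype(int)
    b = rgb_image[..., 2].astype(int)
    red_dominant = (r > g + margin) & (r > b + margin)
    return float(red_dominant.mean())  # fraction of red-dominant pixels
```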

Another potential application of AI in the management of IBD patients is dysplasia detection. CRC surveillance is an integral part of IBD management, as chronic inflammation in IBD can lead to the development of dysplasia and CRC. Current guidelines recommend that surveillance colonoscopies be performed using virtual or dye-based chromoendoscopy or with non-targeted biopsies every 10 cm 70, 71, 72. Despite the development of high-definition endoscopes and chromoendoscopy techniques, dysplasia and early CRC in IBD remain difficult to detect, as IBD-associated neoplasms are frequently flat with indistinct borders. With the emergence of a number of CAD systems capable of real-time neoplasia detection, AI may provide a solution for detecting IBD-associated dysplasia 73. There is a report of successful low-grade dysplasia detection using an AI system in a patient with UC 74; this proof-of-concept case suggests the feasibility and potential clinical utility of AI for IBD dysplasia detection. More recently, Coelho-Prabhu and colleagues developed an AI tool that detected colorectal lesions in patients with IBD with an accuracy of nearly 97% 75.

An underappreciated value of AI incorporation in endoscopy lies in computer-aided quality assurance (CAQ) and its impact on quality improvement in colonoscopy. A number of quality metrics exist to ensure optimal procedural performance. Aside from ADR, other important indices include adequacy of bowel preparation, withdrawal time, and mucosal exposure. Yet adherence to, and reporting of, quality indices in endoscopy is lacking 76, 77, largely because many of these metrics are difficult to measure and monitor objectively during a busy endoscopy schedule. AI functioning as an automatic quality control system that assesses and monitors these metrics could be a practical solution to ensure optimal endoscopic evaluation 78.

ENDOANGEL is a deep-learning-based computer system capable of performing real-time endoscopic quality assessment 19. The system was designed to accurately measure withdrawal time and withdrawal speed and to detect endoscope slipping in order to warn of potential blind spots. In 2020, Gong et al. published an RCT evaluating the efficacy of ENDOANGEL-assisted colonoscopy compared to unassisted colonoscopy in 704 patients. Colonoscopies assisted by the ENDOANGEL system had a significantly higher ADR and longer mean withdrawal time than unassisted colonoscopies (withdrawal time 6.55 vs 4.74 minutes, P < 0.0001; ADR 16% vs 8%, P < 0.001) 19. In another RCT, Su et al. developed an automatic quality control system (AQCS) using machine learning that supervises withdrawal speed and stability. A total of 659 patients undergoing routine screening colonoscopy were randomized to AQCS-assisted or unassisted colonoscopy. As in the ENDOANGEL study, both withdrawal time and ADR were significantly higher in the AQCS group (withdrawal time 7.03 vs 5.68 minutes, P < 0.001; ADR 28.9% vs 16.5%, P < 0.001) 78. These studies corroborate our current understanding of the importance of adequate withdrawal time for optimizing ADR 19, 78, 79, 80. In addition to increasing ADR and withdrawal time, having the computer system automatically record and report withdrawal time can lessen the cognitive burden on the endoscopist and streamline procedural documentation.

Assessment of the adequacy of bowel preparation is an important quality metric in colonoscopy, as it affects the completeness of mucosal evaluation and, consequently, ADR and surveillance intervals. Accurate bowel preparation reporting is also critical for individualizing future bowel preparation regimens. A number of bowel preparation scoring systems have been developed, such as the Boston Bowel Preparation Scale (BBPS) 81. However, the real-world application of bowel preparation assessment is limited by subjectivity, poor adherence, and reliance on human memory. An AI system can lessen this burden on the endoscopist by providing objective and reproducible assessment automatically, and a number of studies have already demonstrated that AI systems can assess bowel preparation quality using validated scores such as the BBPS with a high degree of accuracy 82, 83, 84, 85.
