Open Access
ARTICLE
Optimized Identification with Severity Factors of Gastric Cancer for Internet of Medical Things
1 Faculty of Computing, Universiti Teknologi Malaysia, Johor Bahru, 81310, Malaysia
2 Faculty of Science and Technology, Shaheed Benazir Bhutto University Sanghar Campus, Pakistan
3 Faculty of Computing and Information Technology, Sule Lamido University, Kafin Hausa, P.M.B.048, Nigeria
* Corresponding Author: Fatima Tul Zuhra. Email:
Computers, Materials & Continua 2023, 75(1), 785-798. https://doi.org/10.32604/cmc.2023.034540
Received 20 July 2022; Accepted 17 November 2022; Issue published 06 February 2023
Abstract
The Internet of Medical Things (IoMT) extends the vision of the Wireless Body Sensor Network (WBSN) to improve health monitoring systems and has an enormous impact on healthcare by recognizing levels of risk/severity (premature diagnosis, treatment, and supervision of chronic diseases such as cancer) via wearable electronic health sensors such as the wireless endoscopic capsule. In particular, AI-assisted endoscopy plays a significant role in the detection of gastric cancer. Convolutional Neural Networks (CNNs) have been widely used to diagnose gastric cancer with various feature-extraction models, which limits identification and categorization performance in terms of the cancerous stages and grades associated with each type of gastric cancer. This paper proposes an optimized AI-based approach to diagnose gastric cancer and assess its risk factor by type, stage, and grade in endoscopic images for smart healthcare applications. The proposed method comprises five phases: image pre-processing, Four-Dimensional (4D) image conversion, image segmentation, K-Nearest Neighbour (K-NN) classification, and multi-grading and staging of image intensities. The performance of the proposed method was evaluated on two datasets consisting of color and black-and-white endoscopic images. The simulation results verified that the proposed approach detects gastric cancer with 88.09% sensitivity, 95.77% specificity, and 96.55% overall accuracy.
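The evaluation metrics reported in the abstract follow from a binary confusion matrix. The sketch below is purely illustrative, not the authors' implementation; the count values passed in are hypothetical and do not reproduce the paper's figures.

```python
def binary_metrics(tp: int, fp: int, tn: int, fn: int):
    """Derive sensitivity, specificity, and accuracy from confusion-matrix counts.

    tp/fp/tn/fn = true/false positives and negatives for the 'cancerous' class.
    """
    sensitivity = tp / (tp + fn)                 # true positive rate (recall)
    specificity = tn / (tn + fp)                 # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall correct fraction
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration only.
sens, spec, acc = binary_metrics(tp=50, fp=4, tn=90, fn=6)
print(f"sensitivity={sens:.2%} specificity={spec:.2%} accuracy={acc:.2%}")
```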
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.