Open Access Article
Diagnosis of Various Skin Cancer Lesions Based on Fine-Tuned ResNet50 Deep Network
1 Department of Information Systems, College of Computers and Information Sciences, Jouf University, Sakaka, Saudi Arabia
2 Department of Information Systems, Faculty of Computers and Information, Mansoura University, Egypt
3 Department of Information Technology, Faculty of Computers and Information, Kafrelsheikh University, Egypt
4 Department of Computer Engineering and Networks, College of Computers and Information Sciences, Jouf University, Saudi Arabia
5 Department of Information Technology, Faculty of Computers and Information, Mansoura University, Mansoura, 35516, Egypt
* Corresponding Author: Mohammed Elmogy. Email:
Computers, Materials & Continua 2021, 68(1), 117-135. https://doi.org/10.32604/cmc.2021.016102
Received 23 December 2020; Accepted 24 January 2021; Issue published 22 March 2021
Abstract
With the massive success of deep networks, significant effort has gone into analyzing cancer diseases, especially skin cancer. To this end, this work investigates the capability of deep networks to diagnose a variety of dermoscopic lesion images. The paper aims to develop and fine-tune a deep learning architecture to diagnose different skin cancer grades from dermatoscopic images. Fine-tuning is a powerful method for obtaining enhanced classification results from a customized pre-trained network. Regularization, batch normalization, and hyperparameter optimization are performed to fine-tune the proposed deep network. The proposed fine-tuned ResNet50 model successfully classified seven classes of dermoscopic lesions from the publicly available HAM10000 dataset. The developed model was compared against two powerful models, i.e., InceptionV3 and VGG16, using the Dice similarity coefficient (DSC) and the area under the curve (AUC). The evaluation results show that the proposed model achieved higher results than several recent, robust models.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.