Open Access
ARTICLE
MPFracNet: A Deep Learning Algorithm for Metacarpophalangeal Fracture Detection with Varied Difficulties
1 College of Quality and Technical Supervision, Hebei University, Baoding, 071002, China
2 Hebei Technology Innovation Center for Lightweight of New Energy Vehicle Power System, Baoding, 071002, China
3 National & Local Joint Engineering Research Center of Metrology Instrument and System, Hebei University, Baoding, 071002, China
* Corresponding Author: Linyan Xue. Email:
Computers, Materials & Continua 2023, 75(1), 999-1015. https://doi.org/10.32604/cmc.2023.035777
Received 03 September 2022; Accepted 26 October 2022; Issue published 06 February 2023
Abstract
Because metacarpophalangeal fractures are small and often occult, their detection and localization in X-ray images suffer from low accuracy. To efficiently detect metacarpophalangeal fractures in X-ray images as a second opinion for radiologists, we propose a novel one-stage neural network named MPFracNet, built on RetinaNet. In MPFracNet, a deformable bottleneck block (DBB) is integrated into the bottleneck to better adapt to the geometric variation of fractures. Furthermore, an integrated feature fusion module (IFFM) is employed to combine deep semantic features with shallow detail features. In addition, Focal Loss and Balanced L1 Loss are introduced to mitigate the imbalance between positive and negative samples and the imbalance between the detection and localization tasks, respectively. We evaluated the proposed model on the test set and achieved an average precision (AP) of 80.4% for metacarpophalangeal fracture detection. To estimate detection performance on fractures of varying difficulty, the model was also tested on the metacarpal, phalangeal, and tiny-fracture subsets of the test set, achieving APs of 82.7%, 78.5%, and 74.9%, respectively. The proposed framework achieves state-of-the-art performance in detecting metacarpophalangeal fractures and shows strong potential for application in practical clinical environments.
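The abstract names Focal Loss as the classification loss used to attenuate the positive/negative class imbalance. The paper's own implementation is not reproduced here; the snippet below is only a minimal PyTorch sketch of the standard Focal Loss formulation FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t) (Lin et al.), with the function name and the default values alpha=0.25, gamma=2.0 given purely for illustration.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Binary focal loss: down-weights easy examples so training
    focuses on hard positives/negatives.

    logits:  raw classification scores, same shape as targets
    targets: binary ground-truth labels (0. or 1.)
    """
    # Per-element cross entropy, kept unreduced so it can be reweighted.
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    # p_t is the model's probability for the true class of each element.
    p_t = p * targets + (1 - p) * (1 - targets)
    # alpha_t balances positive vs. negative examples.
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    # (1 - p_t)^gamma is the modulating factor that suppresses easy examples.
    return (alpha_t * (1 - p_t) ** gamma * ce).sum()
```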
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.