Computer Modeling in Engineering & Sciences

DOI: 10.32604/cmes.2022.016065

ARTICLE

Thermogram Adaptive Efficient Model for Breast Cancer Detection Using Fractional Derivative Mask and Hybrid Feature Set in the IoT Environment

Ritam Sharma1, Janki Ballabh Sharma1, Ranjan Maheshwari1 and Praveen Agarwal2,3,4,5,*

1Rajasthan Technical University, Kota, Rajasthan, 324010, India
2Anand International College of Engineering, Jaipur, 303012, India
3Nonlinear Dynamics Research Center (NDRC), Ajman University, Ajman, 20550, United Arab Emirates
4International Center for Basic and Applied Sciences, Jaipur, 302029, India
5Institute of Mathematical Modeling, Almaty, 050000, Kazakhstan
*Corresponding Author: Praveen Agarwal. Email: goyal.praveen2011@gmail.com
Received: 03 February 2021; Accepted: 23 August 2021

Abstract: In this paper, a novel hybrid texture feature set and fractional derivative filter-based breast cancer detection model is introduced. This paper also introduces the application of histogram of linear bipolar pattern (HLBP) features for breast thermogram classification. Initially, breast tissues are separated by a masking operation and filtered by a Grünwald–Letnikov fractional derivative-based Sobel mask to enhance the texture and rectify the noise. A novel hybrid feature set using HLBP and other statistical feature sets is derived and reduced by principal component analysis. A radial basis function kernel-based support vector machine is employed for detecting abnormality in the thermogram. The performance parameters are calculated using a five-fold cross-validation scheme in MATLAB 2015a simulation software. The proposed model achieves a classification accuracy, sensitivity, specificity, and area under the curve of 94.44%, 95.55%, 92.22%, and 96.11%, respectively. A comparative investigation of different texture features with respect to fractional order α for classifying breast malignancy is also presented. The proposed model is also compared with a few existing state-of-the-art schemes, which verifies the efficacy of the model. The fractional order α offers extra adaptability in overcoming the limitations of thermal imaging techniques and assists radiologists in early breast cancer detection. The proposed model is generalized and can be used with different thermal image acquisition protocols and IoT-based applications.

Keywords: Thermal image; breast cancer; fractional derivative mask; image texture analysis; feature extraction; radial basis function; machine learning

1  Introduction

Breast cancer has become a widely occurring disease among women and a major cause of the rapidly increasing death rate owing to late diagnosis [1]. Breast cancer is caused by a genetic mutation of deoxyribonucleic acid (DNA) in the cells of breast tissues, and these cells keep reproducing the same mutated cells. These abnormal cells cluster together to form a tumor, which becomes cancerous when the abnormal cells metastasize to the rest of the body through the bloodstream or lymphatic system [2]. The most significant risk factors for developing breast cancer are advancing age and heredity [3]. Therefore, early and accurate screening of breast cancer plays a major role in treating breast malignancy and reducing the mortality rate.

There are various screening techniques that aim at early detection of breast disease. These techniques are based on light, sound, heat, X-rays, nuclear imaging, magnetism, microwaves, and fusions of different methods. Among these techniques, digital mammography is considered the gold standard and the most widely used technique for tumor detection and classification [4]. However, mammography shows low sensitivity (true positive rate) with high specificity (true negative rate), whereas magnetic resonance imaging (MRI) shows high sensitivity with reduced specificity for premature detection of breast cancers [5]. Patients must also bear considerable pain during the mammography procedure. Thus, the limitations of present screening and diagnostic modalities necessitate the development of an advanced and more effective technique with higher sensitivity and specificity for early-stage breast cancer detection [6].

Thermography has immense potential for screening breast diseases, as it has already been reported that breast disease can be detected a decade earlier than with conventional techniques such as mammography [2]. Thermography is an unobtrusive, contactless, painless, radiation-free, temperature-screening imaging technique. It is now regarded as a reliable adjunct tool with high sensitivity and specificity [5]. Most breast cancer screening techniques focus on finding the tumor or cancerous regions by detecting physical changes in cell structures, whereas thermal imaging has the potential to find the thermal disruption caused by functional changes in the cells, which helps in investigating the presence of a pre-stage of early cancer [6,7]. It has already been reported that clinically healthy breast tissues have predictable and regular heat patterns on the skin surface, while unhealthy breast tissues have irregular heat patterns due to physiological processes such as vascular disturbances and inflammation [13].

Thermal patterns emitted by human skin are recorded by a thermal camera, and a heat signature called the thermogram is generated [5]. However, thermograms alone are not adequate for clinical experts to make an exact diagnosis, so analytical tools, for example, biostatistical strategies, automation of the different steps of the procedure, artificial intelligence, or computer vision techniques, are required to assist in analyzing the thermograms objectively. In this regard, several computer-aided diagnosis schemes have been developed to detect the disease accurately [6].

Most of the computer-aided schemes reported in the literature perform bilateral asymmetry analysis, which limits performance in cases where malignancies appear similarly in both breasts [7]. The detection accuracy of such schemes relies on the difference between features of the left and right breast tissues. Since thermograms are low-intensity images with a small signal-to-noise ratio, the detection accuracy may be limited and the false-negative rate may be higher [8,9]. However, many schemes reported in the literature have also analyzed each breast separately to overcome the above limitation [10,11]. Such schemes may suffer from false-positive errors if the feature selection and feature quality are not adequate. Therefore, feature selection and feature quality play a vital role, as a feature suited to one imaging modality may not suit another.

In the thermogram-based breast cancer detection schemes reported in the literature, statistical, Gabor, HOG, and other texture features have been exploited to improve the detection accuracy [9–15]. However, a comparative performance evaluation of popular texture feature sets is missing in the literature. Secondly, bias correction and image registration are required for thermograms due to misalignment and inconsistency in the acquisition process, and any inaccuracy in these operations directly affects the performance of cancer detection. Recently, a number of fractional-order mathematical models have been developed for analyzing and treating various diseases [16–18]. Fractional derivative-based filtering has shown its suitability in overcoming the above problems owing to its tuning parameter (the fractional order) and its ability to enhance low-intensity texture [19]. Moreover, the effect of fractional derivative filters on computer-aided breast cancer detection schemes using thermograms has not yet been reported.

Therefore, in this paper, a computer-aided breast cancer (malignancy) detection model using thermograms is presented, which processes breast tissues (non-asymmetry based) using a fractional derivative-based Sobel filter. A comparative performance evaluation of different popular texture feature sets with respect to the fractional derivative order parameter alpha (α) is also carried out. This parameter provides an additional degree of flexibility in compensating for errors. Moreover, a hybrid feature set is derived and compared; the comparative analysis shows its superiority over the other feature sets. The major contributions of this paper are:

•   A new hybrid feature set is derived by combining different feature sets and analyzed for breast cancer detection.

•   This paper introduces a histogram of linear bipolar pattern features (HLBP) for breast thermogram classification.

•   Comparative analysis of thermogram texture features used for breast cancer classification is also presented, which aids the literature.

•   A fractional derivative-based Sobel filter is applied for texture enhancement, noise reduction, and robustness against variations and degradations in thermograms. It also offers the flexibility to optimize the classification results.

•   The proposed model is generalized and can hence be applied to analyze thermal images acquired by different protocols/cameras in other applications as well, such as skin cancer detection, peripheral vascular disease identification, night vision, surveillance, and disease and pathogen detection in plants.

The rest of the article is organized as follows: Section 2 describes the background theory of the materials and methods. The proposed methodology, including the dataset and data pre-processing, is provided in Section 3. Results and discussion are presented in Section 4, and Section 5 concludes the findings.

1.1 Related Work

Owing to the limitations of currently used imaging modalities, thermal imaging is continuously being evaluated for breast cancer screening and detection. A brief literature review based on a wide range of research publications related to breast cancer detection and classification is presented in this section.

Medical thermogram analysis depends directly on the quality of the thermogram, which in turn depends mainly on the acquisition protocol, the thermal camera used, and the signal-to-noise ratio of the thermogram [1]. The current status of infrared thermal imaging techniques in breast cancer detection and classification, along with a few protocols to acquire thermograms, has been studied in [3,6]. In general, all computer-aided automated and semi-automated thermogram-based cancer detection systems involve three basic steps: (i) pre-processing and segmentation of the region of interest (ROI), which normally includes background removal and ROI separation for further processing; (ii) texture enhancement and noise reduction in the thermogram; and (iii) appropriate feature extraction and classification [9–13,20–26].

Thermal images have a low-intensity gradient, an absence of clear edges, and a high noise-to-signal ratio [27]. Therefore, precise segmentation of the ROI and analysis of breast cancer become inaccurate and difficult. Thus, many researchers have also reported manual segmentation of the ROI and of the left/right regions for symmetry analysis [10–15,20,21]. In the case of breast cancer detection, segmentation of the ROI indicates separation of the breast tissues from the rest of the body and the background. Various semi-automatic and fully automatic ROI segmentation methods based on image processing techniques such as edge detection [15,20], region growing [21], thresholding [25,26], and morphological approaches have been delineated in the literature [14,26]. Since the proposed work focuses only on breast cancer detection and classification, the ground truth masks of the respective ROIs of the breast thermal images, which are available in the user database, have been utilized to achieve the maximum analysis accuracy [28].

In order to improve detection accuracy, researchers have applied several image enhancement and de-noising techniques in the spatial and transform domains. Spatial filters such as the Gaussian, Wiener, and median filters blur the edges, while transform-domain techniques such as the contourlet, wavelet, and curvelet transforms, combined with diffusion and adaptive anisotropic diffusion filtering, have been widely employed to enhance and de-noise thermal images. As thermal images have smooth transitions in intensity values, wavelet-based de-noising also does not serve thermal images well [27]. Some other techniques, such as the BM3D technique based on enhanced sparse representation, have been reported, which are capable of sharpening and de-noising low-contrast thermograms [7]. Recently, fractional derivative-based techniques have been applied to enhance the texture of various images, as they preserve weak textures while suppressing noise [29]. This approach has also been explored to enhance and segment medical images [19]. Malignant tissues or tumors have abrupt textures in comparison to normal tissues due to the process of angiogenesis. Therefore, features having texture discrimination properties have been employed on thermal images for the segmentation of suspected regions, detection, and classification in many medical applications [30].

Subsequently, to automate the process of abnormality detection and classification in breast thermograms, different asymmetry-based analyses using machine learning techniques have been applied. A brief summary of the state-of-the-art schemes reported in the literature, with the user database, types of features, classifier, and the reported performance parameter values, is given in Table 1.


2  Background

In this section, the background theory of the materials and methods required for implementation of the proposed model is presented.

2.1 Fractional Differential Filter

The Grünwald–Letnikov definition of the fractional derivative is a natural extension of the integer-order derivative to fractional order and is widely used in image processing applications [29]. For a function f(x), x ∈ [a, b], it is defined by Eq. (1):

$D_{GL}^{\alpha}f(x)=\lim_{h\to 0}\frac{1}{h^{\alpha}}\sum_{m=0}^{\left[\frac{x-a}{h}\right]}(-1)^{m}\binom{\alpha}{m}f(x-mh)$ (1)

where x ≤ b, h = (x − a)/n, n ∈ ℕ, and α is the order, which is a real number and may be fractional. The binomial coefficient is calculated using Eq. (2):

$\binom{\alpha}{m}=\frac{\alpha(\alpha-1)(\alpha-2)\cdots(\alpha-m+1)}{m!}$ (2)

If I(x, y) is an image of size M × N, the fractionalized image $\Delta^{\alpha}I(x,y)$ can be represented using the fractionalization algorithm described in Eqs. (3)–(6):

$\lim_{\alpha\to n}\Delta^{\alpha}f(t)=\Delta^{n}f(t),\quad n=0,1,2,\ldots$ (3)

$\left(\Delta^{\alpha}I(x,y)\right)_{i,j}=\left(\left(\Delta_{x}^{\alpha}I(x,y)\right)_{i,j},\ \left(\Delta_{y}^{\alpha}I(x,y)\right)_{i,j}\right)$ (4)

where $\Delta^{\alpha}$ represents the fractional differential operator

$\left(\Delta_{x}^{\alpha}I(x,y)\right)_{i,j}=\sum_{k=0}^{W}(-1)^{k}C_{k}^{\alpha}\,I(x,y)_{i-k,\,j}$ (5)

$\left(\Delta_{y}^{\alpha}I(x,y)\right)_{i,j}=\sum_{k=0}^{W}(-1)^{k}C_{k}^{\alpha}\,I(x,y)_{i,\,j-k}$ (6)

where W ≥ 3 is an integer constant, $C_{k}^{\alpha}=\frac{\Gamma(\alpha+1)}{\Gamma(k+1)\,\Gamma(\alpha-k+1)}$, and Γ is the gamma function.
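As an illustration (not part of the original implementation, which was carried out in MATLAB), the following Python/NumPy sketch computes the signed Grünwald–Letnikov coefficients of Eqs. (5)–(6) and applies the fractional difference along one image axis; the window length W, the order α, and the circular boundary handling via np.roll are illustrative assumptions.

```python
import numpy as np
from scipy.special import gamma

def gl_coefficients(alpha, W):
    """Signed Gruenwald-Letnikov coefficients (-1)^k * C_k^alpha, k = 0..W (Eqs. (5)-(6))."""
    k = np.arange(W + 1)
    c = gamma(alpha + 1) / (gamma(k + 1) * gamma(alpha - k + 1))
    return (-1) ** k * c

def fractional_difference(image, alpha=0.2, W=3, axis=0):
    """Fractional difference of Eqs. (5)-(6) along one axis; np.roll gives I[i-k] (circular borders)."""
    w = gl_coefficients(alpha, W)
    img = image.astype(float)
    out = np.zeros_like(img)
    for k, wk in enumerate(w):
        out += wk * np.roll(img, shift=k, axis=axis)
    return out
```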

2.2 Texture Based Features

In this section, various texture-based features are discussed briefly.

2.2.1 First Order Statistical Features (FOS)

First-order statistical features describe the gray-level intensity distribution in an image. The commonly used features are mean, variance, kurtosis, skewness, energy, and entropy [8]. The details of the FOS features are given in Appendix A.

2.2.2 Second Order Statistical Features (SOS)

Features calculated from second-order statistics provide information about the relative positions of different gray levels within the image. SOS features measure the regularity, coarseness, and smoothness of the image pixels. The widely used methods for texture discrimination are mentioned below:

(a)Gray level co-occurrence matrix features (GLCM)

GLCM describes the textural details of an image and is useful for image classification. These features are found using a co-occurrence matrix, where pixels are considered in pairs and the gray-level co-occurrence matrix reflects the relationship among all pixels or groups of pixels [35]. The GLCM represents a two-dimensional histogram that is a function of two parameters: the relative distance between a pair of pixels measured in pixels (d = 1, 2, 3) and their relative direction θ (i.e., θ = arctan(Δy/Δx)). The direction θ is quantized into four orientations (0°, 45°, 90°, and 135°), i.e., horizontal, diagonal, vertical, and anti-diagonal, respectively. The normalized co-occurrence matrix $C_{\theta,d}$ is given by Eq. (7) as:

$C_{\theta,d}(k,l)=T_{n}\left\{\left((x_{1},y_{1}),(x_{2},y_{2})\right)\in(M\times N)\times(M\times N)\,\middle|\,P\right\}\big/K$ (7)

where P is the condition {Δx = d sin θ, Δy = d cos θ, I(x₁, y₁) = k, I(x₂, y₂) = l}, and T_n and K are the number of elements in the set and the total number of pixel pairs, respectively [27]. Detailed formulae for the GLCM features are given in Appendix B [35].
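For illustration, a minimal sketch of GLCM feature extraction is given below, assuming the scikit-image library (graycomatrix/graycoprops); it is not the authors' implementation, and features such as entropy, which graycoprops does not provide, are computed directly from the normalized matrix.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_u8, distances=(1, 2), angles=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Normalized GLCMs for several distances/orientations plus a few Haralick-style properties."""
    glcm = graycomatrix(gray_u8, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    feats = {}
    for prop in ("contrast", "correlation", "energy", "homogeneity"):
        feats[prop] = graycoprops(glcm, prop).ravel()   # one value per (d, theta) pair
    # Entropy is not provided by graycoprops; compute it from the normalized matrix (cf. Eq. (B.4)).
    p = glcm + 1e-12
    feats["entropy"] = -(p * np.log2(p)).sum(axis=(0, 1)).ravel()
    return feats
```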

(b) Grey level run length matrix features (GLRLM)

GLRLM is a method for extracting second-order statistical features. Studies show that GLRLM can discriminate textures that cannot be discriminated by GLCM-based feature extraction. This method counts the gray-level runs of different lengths, where a gray-level run is a set of linearly adjacent pixels with the same gray-level value, and the number of pixels within the run is the gray-level run length [36]. The GLRL matrix is represented by R(θ) = [r(k, l | θ)], where each element r(k, l | θ) specifies an approximation of the number of runs of length l with intensity k in the direction of angle θ. Four GLRL matrices can be calculated, for 0°, 45°, 90°, and 135° [36]. The GLRLM features are defined mathematically in Appendix C [36].

(c) Linear binary pattern features (LBP)

Owing to its discriminative power, computational simplicity, and rotation invariance, the linear binary pattern operator is a very popular approach in various classification applications. This texture operator labels each image pixel by thresholding its neighborhood and assigns binary numbers to the neighbors accordingly. It generates a 2^P-bin histogram, which is used as a texture descriptor [37]. The LBP_{P,R} number that characterizes the image texture around the center pixel (x_c, y_c) with gray-level value v_c is given by Eq. (8):

$LBP_{P,R}=\sum_{p=0}^{P-1}S\left(v_{p}-v_{c}\right)2^{p},\qquad S(v)=\begin{cases}1, & v\ge 0\\ 0, & \text{otherwise}\end{cases}$ (8)

where P denotes the number of equally spaced pixels (with values v_p) on a circle of radius R (R > 0) centered on the center pixel.
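A minimal sketch of the HLBP descriptor used later in Section 3.3 is shown below, assuming scikit-image's local_binary_pattern realizes the operator of Eq. (8); with P = 8 the histogram has 2^8 = 256 bins, matching the HLBP feature count reported in Section 3.3.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def hlbp_features(gray, P=8, R=1):
    """Histogram of LBP codes (Eq. (8)); the 'default' method with P=8 yields 2**8 = 256 bins."""
    codes = local_binary_pattern(gray, P, R, method="default")
    hist, _ = np.histogram(codes, bins=np.arange(2**P + 1), density=True)
    return hist  # 256-dimensional descriptor
```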

(d) Histogram of oriented gradient features (HOG)

The HOG feature descriptor significantly outperforms other feature sets, including wavelets, for some applications. HOG is computed on a dense grid of uniformly spaced cells and applies overlapping local contrast normalization for better performance. This is achieved by acquiring the local histogram over larger spatial regions, labeled blocks, and using the outcome to normalize all of the cells in the block. The length L_HOG of the HOG feature depends on the image size and on some function parameter values, as in Eq. (9) [38]:

$L_{HOG}=(\text{Blocks per image})\times(\text{Block size})\times(\text{Number of bins})$,
$\text{Blocks per image}=\left\lfloor\frac{(\text{Image size}/\text{Cell size})-\text{Block size}}{\text{Block size}-\text{Block overlap}}+1\right\rfloor$ (9)
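The following sketch illustrates Eq. (9) with scikit-image's hog function, using the cell, block, and bin settings described in Section 3.3.1; the 64 × 64 resize is an assumption made here only to show how a 1764-dimensional descriptor length can arise.

```python
from skimage.feature import hog
from skimage.transform import resize

def hog_features(gray, out_size=(64, 64)):
    """HOG descriptor with 9 unsigned-orientation bins, 8x8-pixel cells, and 2x2-cell blocks."""
    img = resize(gray, out_size, anti_aliasing=True)   # fixed size so every ROI yields the same length
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys", feature_vector=True)

# For a 64 x 64 input this gives 7 x 7 blocks x 36 values per block = 1764 features (cf. Section 3.3.1).
```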

2.2.3 Gabor Wavelet Features

Gabor features are particularly suitable for texture representation and discrimination. These features essentially examine whether there is specific frequency content at particular orientations in a local region around a point. A 2D Gabor function is obtained by modulating a 2D Gaussian kernel with a complex sinusoidal plane wave of angular frequency ω, as expressed in Eq. (10), where σ_x and σ_y are the spatial spreads and θ represents the direction.

$g(x,y,\omega,\theta,\sigma_{x},\sigma_{y})=\frac{1}{2\pi\sigma_{x}\sigma_{y}}\exp\left[-\frac{1}{2}\left\{\left(\frac{x}{\sigma_{x}}\right)^{2}+\left(\frac{y}{\sigma_{y}}\right)^{2}\right\}+j\omega\left(x\cos\theta+y\sin\theta\right)\right]$ (10)

Thus, Gabor features are constructed by applying multiple filters at several frequencies (scales) and orientations θ [12].
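For illustration, a direct NumPy implementation of the Gabor kernel of Eq. (10) is sketched below; the 39 × 39 window matches Section 3.3.1, while the particular ω, θ, and σ values passed to it are left to the caller and are not specified by the paper.

```python
import numpy as np

def gabor_kernel(omega, theta, sigma_x, sigma_y, size=39):
    """Complex 2D Gabor kernel following Eq. (10), sampled on a size x size grid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    envelope = np.exp(-0.5 * ((x / sigma_x) ** 2 + (y / sigma_y) ** 2)) / (2 * np.pi * sigma_x * sigma_y)
    carrier = np.exp(1j * omega * (x * np.cos(theta) + y * np.sin(theta)))
    return envelope * carrier

# A 5-scale x 8-orientation filter bank, as used in Section 3.3.1, could be built by varying omega and theta.
```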

3  The Proposed Model

The proposed automated breast cancer detection and classification model using a fractional Sobel filter and a support vector machine (SVM) with distinct texture features is described in this section. Many schemes for breast cancer detection using thermograms with SVM reported in the literature have used either integer-order filters or power-law transformation to obtain a better signal-to-noise ratio and textural quality of thermal images [25]. Moreover, a comparative analysis of the different thermogram texture features commonly used with SVM or any other classifier is missing in the literature. A new thermogram-based model for breast cancer detection using a fractional derivative-based Sobel filter and SVM is therefore presented in this manuscript, along with a comparative analysis of different texture features with the fractional derivative filter [39–43].

RGB color-mapped thermograms obtained from the camera are first converted to gray images. This conversion is essential because radiologists favor grayscale images, as they bear a greater resemblance to mammographic images [1]. These grayscale thermograms are further processed for: (1) segmentation of breast tissues from the background (pre-processing and ROI segmentation); (2) fractional Sobel filtering; (3) extraction of different features and feature reduction employing principal component analysis (PCA); (4) training the RBF-kernel-based SVM classifier using the reduced set of features; and (5) classification of breast tissues as normal or abnormal. The steps involved in the proposed model are also depicted in Fig. 1. The efficacy of the various feature sets with the RBF-SVM classifier is evaluated by calculating performance parameters. The details of the different steps of the proposed model are described below.

3.1 Pre-Processing and Region of Interest (ROI) Segmentation

The performance of the algorithm can be greatly enhanced by accurate segmentation of the ROI. In this step, the breast tissues are separated from the background region. The following steps are used (a minimal code sketch follows the list):

(1)  All the input thermograms I(i, j) are converted to grayscale images.

(2)  Background subtraction is done by masking the thermal images with the respective ground truth images [28]. The regions other than breast tissue, such as the shoulders, neck, armpits, and the region below the inframammary fold, are cropped out manually.

(3)  Uniform image size is maintained while obtaining the ROIs, i.e., I_ROI(i, j). Two groups of ROIs, normal and abnormal, are prepared.
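A minimal sketch of these three pre-processing steps is given below, assuming OpenCV and the ground-truth masks of [28]; the file paths and the output ROI size are placeholders, not values prescribed by the paper.

```python
import cv2
import numpy as np

def extract_roi(thermogram_path, mask_path, out_size=(240, 320)):
    """Grayscale conversion, ground-truth masking, and resizing to a uniform ROI size."""
    bgr = cv2.imread(thermogram_path)                        # colour-mapped thermogram (read as BGR)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)             # step (1): grayscale conversion
    mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)       # ground-truth breast mask [28]
    roi = cv2.bitwise_and(gray, gray, mask=(mask > 0).astype(np.uint8))  # step (2): background removal
    return cv2.resize(roi, out_size[::-1])                   # step (3): uniform ROI size (width, height)
```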


Figure 1: Schematic diagram of the proposed thermogram-adaptive breast cancer detection

3.2 Fractional Derivative Based Sobel Filtering

The fractional derivative-based Sobel filter, termed the fractional Sobel filter in this paper, is employed to enhance the ROI-segmented thermograms I_ROI(i, j). The fractional derivative-based filter improves the thermal image texture quality and intensity gradient while restraining noise enhancement [26]. The gradient components of the Sobel operator can be expressed in fractional-order differential form, as shown in Eqs. (11) and (12), using Eqs. (5) and (6):

$G_{x}^{\alpha}=\frac{1}{2}\left(\frac{\partial^{\alpha}I(x+1,y-1)}{\partial x^{\alpha}}+2\,\frac{\partial^{\alpha}I(x+1,y)}{\partial x^{\alpha}}+\frac{\partial^{\alpha}I(x+1,y+1)}{\partial x^{\alpha}}\right)$ (11)

$G_{y}^{\alpha}=\frac{1}{2}\left(\frac{\partial^{\alpha}I(x-1,y+1)}{\partial y^{\alpha}}+2\,\frac{\partial^{\alpha}I(x,y+1)}{\partial y^{\alpha}}+\frac{\partial^{\alpha}I(x+1,y+1)}{\partial y^{\alpha}}\right)$ (12)

Thus, the fractional-order Sobel convolution operators for the x and y directions are found by approximating Eqs. (11) and (12) and are shown in Fig. 2, where $C_{k}^{\alpha}=\frac{\Gamma(\alpha+1)}{\Gamma(k+1)\,\Gamma(\alpha-k+1)}$, Γ is the gamma function, and k = 0, 1, 2, 3, …

Further, this fractional Sobel filter is applied to all thermograms. Eq. (13) depicts the masking operation on the image I_ROI with the fractional mask w^α(p, q), where a and b take the values k/2 and 1, respectively.

$I'_{ROI}(i,j)=\sum_{p=-a}^{a}\sum_{q=-b}^{b}w^{\alpha}(p,q)\,I_{ROI}(i+p,\,j+q)$ (13)


Figure 2: 3 × 3 Fractional-order Sobel convolution operators (a) For x and (b) For y directions

Also, Fig. 3 shows the fractional derivative-based Sobel masks (W_y, W_x) along with the filter coefficient values in terms of the fraction α; W_x is the transpose of W_y [19].


Figure 3: Fractional derivative-based Sobel mask

Fractional Sobel derivative filter masks of size 5 × 3 are applied to reduce the computational complexity of the filtering step and to enhance the discriminative power of the texture features.
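The following sketch illustrates how a 5 × 3 fractional Sobel mask of this kind could be assembled from the GL coefficients of Eq. (5) and applied via Eq. (13); the exact coefficient layout used by the authors is the one shown in Figs. 2–3, so the construction below (GL taps along one axis, [1 2 1]/2 smoothing along the other) is an assumption.

```python
import numpy as np
from scipy.signal import convolve2d
from scipy.special import gamma

def fractional_sobel_masks(alpha=0.2, k_max=4):
    """Plausible 5x3 / 3x5 fractional Sobel masks built from the GL coefficients of Eq. (5).
    The layout is an assumption; the authors' exact masks are given in Figs. 2-3."""
    k = np.arange(k_max + 1)
    ck = (-1) ** k * gamma(alpha + 1) / (gamma(k + 1) * gamma(alpha - k + 1))
    smooth = np.array([1.0, 2.0, 1.0]) / 2.0   # [1 2 1]/2 smoothing, as in Eqs. (11)-(12)
    wx = np.outer(ck, smooth)                  # 5 x 3: fractional difference along x, smoothing along y
    return wx, wx.T                            # W_y taken as the transpose of W_x (cf. Fig. 3)

def fractional_sobel_filter(roi, alpha=0.2):
    """Apply the masking operation of Eq. (13) in both directions and combine the gradient magnitudes."""
    wx, wy = fractional_sobel_masks(alpha)
    gx = convolve2d(roi.astype(float), wx, mode="same", boundary="symm")
    gy = convolve2d(roi.astype(float), wy, mode="same", boundary="symm")
    return np.hypot(gx, gy)
```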

3.3 Feature Extraction and Dimensionality Reduction

In the proposed model, two well-founded and proficient texture feature sets, namely first-order statistical (FOS) features and higher-order statistical (HOS) features, are extracted from the enhanced ROIs. Higher-order statistical features include the gray-level co-occurrence matrix (GLCM), gray-level run length matrix (GLRLM), histogram of oriented gradients (HOG), and histogram of linear binary pattern (HLBP) features. These features reflect the association among the intensities of two image pixels or pixel sets and determine image properties related to FOS and HOS. Gabor wavelet features, which capture locality, frequency, and orientation and generate multi-resolution texture information in both the spatial and frequency domains, are also calculated. A hybrid set of statistical features is also formed by combining the first- and second-order statistical features, HOG, and HLBP features. Principal component analysis is applied to reduce the dimensionality of the feature sets.

3.3.1 Feature Extraction

To characterize the breast thermograms and generate a dataset for classification, a total of six texture-based feature extraction methods are employed. First-order statistical features, second-order statistical features (GLCM, GLRLM), HOG, HLBP, Gabor wavelet features, and a hybrid set of statistical features, as described in Section 2, are extracted and quantified as explained below:

(1)  First-order statistical features: mean, standard deviation, variance, kurtosis, skewness, entropy, and energy are extracted.

(2)  Twenty-one GLCM features are extracted at distances d = 1 and 2 in four orientations θ = 0°, 45°, 90°, and 135°, respectively.

(3)  Seven GLRLM features, namely SRE, LRE, GLN, RLN, RP, LGLRE, and HGLRE, are also extracted in four orientations θ = 0°, 45°, 90°, and 135°, respectively.

(4)  HOG features are based on horizontal and vertical gradients. The image is divided into cells having several evenly spaced orientation bins. An unsigned gradient range of 0° to 180° divided into bins is used here. A nine-bin histogram corresponding to the orientation of each pixel is generated using linear-gradient voting. Contrast normalization of local responses is also performed on overlapping blocks for every cell. Each block consists of 4 non-overlapping cells of size [8 × 8] with 9 histogram bins. Therefore, a total of 1,764 features (36 features per block) are obtained. The performance of the descriptor depends on the number of histogram bins, which characterize the texture of the tissue regions.

(5)  HLBP features are extracted by generating a 2^P-bin histogram of the LBP codes of the image. The values P = 8 and R = 1 are used in this study.

(6)  Gabor wavelet features are computed by convolving Gabor wavelet filters with the image. The Gabor wavelet filters are generated for five distinct scales and eight orientations with a window size of 39 × 39. Down-sampling is also applied to reduce the number of Gabor features.

(7)  The hybrid feature set is formed by combining the first- and second-order statistical features, HOG features, and HLBP features. Experiments are performed with different combinations of feature sets. Although the Gabor features capture local texture in the frequency domain, their detection accuracy is low and their dimensionality is very high; hence they are not included in the hybrid feature set.

Fig. 4 depicts the total sets of features extracted from the thermograms. In total, 7 FOS features, 168 GLCM features, 28 GLRLM features, 1764 HOG features, and 256 HLBP features make up a 1967-dimensional statistical feature vector, and Gabor feature vectors with a dimension of 40,960 are extracted.
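For clarity, the hybrid set of item (7) above amounts to a simple concatenation of the per-image feature vectors, as sketched below (function and argument names are illustrative, not from the paper):

```python
import numpy as np

def hybrid_feature_vector(fos, glcm, glrlm, hog_vec, hlbp):
    """Hybrid set: concatenation of first/second-order statistical, HOG, and HLBP features."""
    return np.concatenate([np.ravel(fos), np.ravel(glcm), np.ravel(glrlm),
                           np.ravel(hog_vec), np.ravel(hlbp)])
```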


Figure 4: Extracted and reduced feature sets

3.3.2 Dimensionality Reduction of Features

The feature vectors obtained from the previous step are of very high dimension, and it is computationally intensive to process such large data. Therefore, a linear dimensionality reduction technique, principal component analysis, is employed to cut down the dimensions of the feature vector sets. Dimensionality reduction also makes the machine learning algorithms more efficient and helps them generate more accurate predictions. As described in [8], "PCA orthogonally transforms a set of (possibly) correlated variables into a smaller set of uncorrelated variables called principal components, and the number of principal components is the same as or less than the number of original variables in the dataset". The first principal component captures the maximum variability (eigenvalue) in the data, and each succeeding component captures variability in decreasing order. If PCA yields V_n non-zero eigenvectors, the optimal number of eigenvectors V_p must be chosen according to Eq. (14) to keep the average projection error below 0.01.

$\frac{\sum_{i=1}^{V_{p}}S_{i}}{\sum_{i=1}^{V_{n}}S_{i}}\ge 0.99$ (14)

where S_i represents the i-th eigenvalue. The dataset must be normalized to zero mean and unit variance before applying PCA, and 99% of the variance is retained by the feature vectors used in the next step of training and testing the classifier. The reduced sets of feature vectors for all types of texture features are also depicted in Fig. 4.
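A minimal sketch of this reduction step using scikit-learn is shown below; PCA with a fractional n_components retains the requested share of variance, which corresponds to the 99% criterion of Eq. (14), and the StandardScaler reflects the zero-mean, unit-variance normalization mentioned above.

```python
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Standardize to zero mean / unit variance, then keep enough principal components
# to retain 99% of the variance, as required by Eq. (14).
reduce_99 = make_pipeline(StandardScaler(), PCA(n_components=0.99, svd_solver="full"))
# X is an (n_samples, n_features) matrix for one feature set, e.g. the hybrid vectors:
# X_reduced = reduce_99.fit_transform(X)
```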

3.4 Classification and Performance Evaluation

The reduced sets of feature vectors extracted from the thermal images are cast as a binary classification problem, and the dataset consists of feature vectors of the normal and malignant classes. These vectors are then employed to train a supervised learning technique, a support vector machine (SVM) with an RBF kernel, for classification. An SVM constructs a hyperplane or a set of hyperplanes in a high- or infinite-dimensional space. These hyperplanes are used to distinguish the two classes, as the transformed dataset becomes more separable than the original input dataset [10]. Hyperplanes are decision boundaries that facilitate the classification of the data points, and the dimension of these hyperplanes is determined by the number of features. Support vectors are the data points closest to the hyperplane, and they affect the position and orientation of the hyperplane. The data points residing on either side of the hyperplane can be assigned to different classes.

In the present work, the SVM-RBF is trained with the training set of feature vectors, and predictions are made for the unseen testing set. A five-fold cross-validation technique is employed to validate the model. The performance of the trained classifier in identifying breast malignancy is evaluated in terms of the following parameters: specificity, sensitivity, accuracy, and area under the curve [15–17]:

3.4.1 Sensitivity

It is the percentage of actual positives rightly identified as positives by the classifier and is computed as:

$\text{Sensitivity}=\frac{TP}{TP+FN}\times 100$ (15)

3.4.2 Specificity

It is also known as true negative rate and is the capacity to spot the negative samples. It is computed as:

$\text{Specificity}=\frac{TN}{TN+FP}\times 100$ (16)

3.4.3 Accuracy

Accuracy defines the measure of the correctness of the classifier. It can be calculated as:

$\text{Accuracy}=\frac{TP+TN}{\text{Total Data}}\times 100$ (17)

3.4.4 Area under the Curve (AUC)

AUC measures the quality of the classifier. AUC is the area under the receiver operating characteristic (ROC) curve, which is obtained by plotting sensitivity vs. (1 − specificity). Its value lies between 0 and 1; the closer the AUC is to 1, the better the diagnostic test. Here, TN: true negative, TP: true positive, FP: false positive, and FN: false negative.
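A sketch of the five-fold cross-validated RBF-SVM evaluation described in Sections 3.4–3.4.4 is given below, assuming scikit-learn; the SVM hyperparameters (C, γ) are left at their defaults because the paper does not specify them.

```python
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import confusion_matrix, roc_auc_score

def evaluate_rbf_svm(X, y, seed=0):
    """Five-fold cross-validated RBF-SVM with the metrics of Eqs. (15)-(17) and the AUC."""
    clf = SVC(kernel="rbf", probability=True, random_state=seed)
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    y_pred = cross_val_predict(clf, X, y, cv=cv)
    y_score = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]
    tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
    return {"sensitivity": 100 * tp / (tp + fn),       # Eq. (15)
            "specificity": 100 * tn / (tn + fp),       # Eq. (16)
            "accuracy":    100 * (tp + tn) / len(y),   # Eq. (17)
            "auc": roc_auc_score(y, y_score)}
```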

4  Results and Discussion

4.1 Experimental Set-Up and Dataset

Computer simulation results of the proposed model using MATLAB are presented in this section. The breast thermograms for this research work are taken from the database available under the project PROENG, captured by a FLIR Thermal Cam S45. The acquisition method, protocol, and other details of the thermograms are given in [28] for further study. Sample breast thermograms taken from the selected database are depicted in Fig. 5 [28], which shows normal and abnormal thermal patterns indicating the presence or absence of suspicious regions in the breast tissues. A total of 130 randomly selected IR images (83 normal and 47 abnormal) of size 320 × 240 are used for implementing the method. However, to avoid the overfitting problem, a few images have been augmented and a database of 180 (90 normal and 90 abnormal) IR images has been prepared. The number of thermal images used is comparable with that of state-of-the-art schemes.


Figure 5: Sample breast thermogram images (a) Both normal (b)–(d) [28]

4.2 Results

The results of pre-processing, ROI segmentation, and fractional Sobel filtering with order α = 0.2 for four sample test thermograms used to verify the proposed model are shown in Fig. 6. To separate the background and segment the breast tissues, the required ground truth masks available in the database are used [28]. Fig. 6a shows the grayscale normal (IR_3830, IR_0737) and abnormal (IR_4149, IR_8285) breast thermograms; their respective ground truth regions of interest (ROIs) and masks are shown in Figs. 6b and 6c, respectively. Fig. 6d shows the background-subtracted thermograms, whereas Fig. 6e depicts the background-subtracted thermograms with breast tissues only. The segmented thermograms (ROIs) are then processed through the fractional derivative-based Sobel filter, as explained in Section 3.2, to enhance the images. It is noted here that the fractional-order derivative filter (FODF) considers more information from neighboring pixels, extracting more image details. Thus, it enhances the edges and preserves the weak and medium texture details simultaneously while removing noise [27].


Figure 6: Pre-processing, ROI segmentation and fractional derivative filtering using fractional Sobel filter of order α= 0.2 steps for four sample test thermograms (1. IR_3830, 2. IR_0737, 3. IR_4149 and 4. IR_8285 [28]) (a) Original thermograms (b) Ground truth segmentation boundaries [28] (c) Respective ground truth masks [28] (d) Background-subtracted thermograms (e) Background subtracted thermograms with breast tissues only (ROIs) (f) Fractional Sobel filtered ROI thermograms

A fractional-order Sobel mask with k = 4 is used, and α is varied from 0 to 1 with an interval of 0.1. Fig. 6f shows the fractional-order derivative filtered (FODF) thermal images for the fraction value α = 0.2 in the fractional Sobel filter (5 × 3 W_x and 3 × 5 W_y) for k = 4. Experiments are performed for different values of k, but the results are better for the masks with k = 4, i.e., 5 × 3 W_x and 3 × 5 W_y; hence these values are selected in the proposed model.

To study the effect of the fractional derivative Sobel filter on thermal images, a quantitative analysis of the gray-level co-occurrence matrix (derived from the database of normal and abnormal images), which describes comprehensive texture information, is performed. A set of GLCM features (described in Section 2) is extracted in four directions (0°, 45°, 90°, and 135°). It is observed that the magnitudes of the features extracted in different directions vary in a similar manner irrespective of the direction of extraction. Thus, Tables 2 and 3 present a few selected features (energy, contrast, entropy, correlation, sum average, sum entropy, information measure of correlation 1, information measure of correlation 2) extracted in the 0° direction. It can clearly be observed that the discrimination between normal and abnormal thermograms with respect to the fraction α arises prominently between α = 0.2 and α = 0.4.


Further, the five sets of features (as mentioned in Section 3), including the first- and second-order statistical, HOG, HLBP, Gabor, and hybrid statistical feature sets, are extracted from every segmented thermal image. As the dimensions of the feature sets are very high, PCA is applied for dimensionality reduction. These feature sets are then fed to the SVM-RBF for the classification of breast thermograms. It is also mentioned here that experiments were performed to investigate the performance of the SVM with different kernel functions, such as linear and RBF, but the RBF kernel gave better results, so it is used in the proposed model.


To evaluate the performance of the distinct feature sets, significant classification parameters such as accuracy, specificity, and sensitivity of the trained classifier are calculated. The variation of these performance parameters for each feature set with the fractional derivative parameter alpha (α) is shown in Figs. 7–9.

It can clearly be observed from Figs. 7–9 that the performance parameters have the most suitable values at the fractional order α = 0.2 for all the feature sets, i.e., Gabor features, HOG features, HLBP features, statistical features, and hybrid statistical features, represented as F1, F2, F3, F4, and F5, respectively. The hybrid feature set (F5) has superior performance values for the fractions α = 0.2, 0.3, and 0.4, while the optimum values of the fractional order α for the statistical, HOG, HLBP, and Gabor features are α = 0.3, 0.2, 0.2, and 0.3, respectively.

It is also observed from Table 4 that the results of the proposed model with hybrid features at fraction α = 0.2 outperform the other feature sets, with accuracy, sensitivity, specificity, and area under the curve of 94.44%, 95.55%, 92.22%, and 96.11%, respectively. Fig. 10 also confirms that the HLBP features perform comparatively better than the other feature sets in all aspects of performance except the hybrid feature set. Further, the performance of the proposed model exceeds that of recent state-of-the-art techniques for breast cancer detection and classification, as depicted in Table 5.


Figure 7: Variation of performance parameter (Accuracy) with alpha (α)


Figure 8: Variation of performance parameter (Sensitivity) with alpha (α)


Figure 9: Variation of performance parameter (Specificity) with alpha (α)



Figure 10: Performance parameters of the proposed hybrid feature set along with the other feature sets


4.3 Discussion

Asymmetry analysis-based schemes are limited when both breasts have similar abnormalities and tissue regions, because the features measuring the asymmetry then indicate the absence of abnormality. The proposed model characterizes the thermal patterns of individual breast tissues and discovers the abnormalities. The evaluation results show that the proposed model, with fractional-order filtering, the chosen feature selection technique, classifier, explicit parameters, and five-fold cross-validation, achieves the highest performance with hybrid texture features at fraction order α = 0.2.

It is evident from the comparative analysis of the features for multiple values of alpha (Figs. 7–9) that the performance of each feature set is sensitive to the value of the fraction α; this additional degree of flexibility provides robustness against noise and errors.

The comparative analysis of different features (Gabor, HOG, statistical, and HLBP) presented in this paper supplements the literature. HLBP features are evaluated for the first time in this paper, giving a classification accuracy of 90.55%, with the other performance parameters also comparable to those of state-of-the-art schemes.

It is worth mentioning here that the small-lesion and early-detection problems of medical imaging modalities such as mammography and MRI could be overcome to some extent by the proposed model. Moreover, the use of a fractional-order filter makes the model more generalized, with an iterative selection of the fraction alpha to obtain the required performance under diverse thermogram acquisition protocols and in respective applications such as skin cancer, thyroid disorders, diabetic foot, peripheral vascular disease, pathogen detection in plants, night vision, and surveillance.

5  Conclusion

A new fractional-order derivative and hybrid feature set based, thermogram-adaptive, computer-aided breast cancer detection model has been implemented. The performance of two new thermogram feature sets, HLBP and the hybrid feature set, is also analyzed. The hybrid texture feature set is derived by combining different texture features for improved classification accuracy. A comparative study of the hybrid feature set with other popular statistical and texture features for different values of the fractional order α is also performed. For fraction α = 0.2, the hybrid feature set outperforms the other feature sets as well as the existing techniques. Similarly, the HLBP texture feature set also outperforms the other feature sets except the hybrid feature set. The comparison results verify the efficacy of the proposed model, which effectively distinguishes normal and abnormal cases. The proposed model provides the flexibility to adapt the fractional order for optimizing the classification performance against errors and degradations in the thermogram. Therefore, it is more generalized and can be used to analyze thermal infrared images acquired by different protocols/cameras for applications other than breast cancer, such as skin cancer detection, peripheral vascular disease identification, night vision, surveillance, and disease and pathogen detection in plants, in an IoT environment.

Funding Statement: We would like to thank all the faculty members and technicians who provided their scientific guidance and assistance in completing this study. Praveen Agarwal thanks the SERB (Project TAR/2018/000001), DST (Projects DST/INT/DAAD/P-21/2019 and INT/RUS/RFBR/308), and NBHM (DAE) (Project 02011/12/2020 NBHM (R.P)/RD II/7867).

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.

References

  1. Silva, L. F., Santos, A. A., Bravo, R. S., Silva, A. C., & Muchaluat-Saade, D. C. (2016). Hybrid analysis for indicating patients with breast cancer using temperature time series. Computer Methods and Programs in Biomedicine, 130, 142-153. [Google Scholar] [CrossRef]
  2. Etehadtavakol, M., Ng, E. Y. K. (2017). An overview of medical infrared imaging in breast abnormalities detection. In: Ng, E., Etehadtavakol, M. (eds.), Application of infrared to biomedical sciences, series in bio engineering, pp. 45–47. Springer, Singapore. DOI 10.1007/978-981-10-3147-2_4. [CrossRef]
  3. Raghavendra, U., Gudigar, A., Rao, T. N., Ciaccio, E. J., & Ng, E. Y. K. (2019). Computer-aided diagnosis for the identification of breast cancer using thermogram images: A comprehensive review. Infrared Physics and Technology, 102(1), 103041. [Google Scholar] [CrossRef]
  4. Ng, E. Y. K. (2009). A review of thermography as promising non-invasive detection modality for breast tumor. International Journal of Thermal Sciences, 48(5), 849-859. [Google Scholar] [CrossRef]
  5. Ng, E. Y. K., & Sudharsan, N. M. (2001). Numerical computation as a tool to aid thermographic interpretation. International Journal of Medicine Engineering Technology, 25(2), 53-60. [Google Scholar] [CrossRef]
  6. Kandlikar, S. G., Raya, I. P., Raghupati, P. A., Gonzalez-Hernandez, J., & Dabydeen, D. (2017). Infrared imaging technology for breast cancer detection-current status, protocol and new directions. International Journal of Heat and Mass Transfer, 108(6), 2303-2320. [Google Scholar] [CrossRef]
  7. Prabha, S., Sujatha, C. M., Ramakrishnan, S. (2014). Asymmetry analysis of breast thermograms using bm3d technique and statistical texture features. International Conference on Informatics Electronics Vision, pp. 1–4. DOI 10.1109/ICIEV.2014.6850730. [CrossRef]
  8. Gonzalez, R. C., Woods, R. E. (2001). Digital image processing. 2nd ed. USA: Prentice Hall.
  9. Schaefer, G., Zavisek, M., & Nakashima, N. (2012). Thermography based breast cancer analysis using statistical features and fuzzy classification. Pattern Recognition, 42(6), 1133-1137. [Google Scholar] [CrossRef]
  10. Acharya, U. R., Ng, E. Y. K., Tan, J. H., & Sree, S. V. (2012). Thermography based breast cancer detection using texture features and support vector machine. Journal of Medical Systems, 36(3), 1503-1510. [Google Scholar] [CrossRef]
  11. Mookiah, M. R. K., Acharya, U. R., & Ng, E. Y. K. (2012). Data mining technique in breast cancer detection in thermograms using hybrid feature extraction strategy. Quantitative Infrared Thermography Journal, 9(2), 151-165. [Google Scholar] [CrossRef]
  12. Suganthi, S. S., & Ramakrishnan, S. (2014). Analysis of breast thermograms using gabor wavelet anisotropy index. Journal of Medical Systems, 38, 101. [Google Scholar] [CrossRef]
  13. Acharya, U. R., Ng, E. Y. K., Sree, S. V., Chua, C. K., Chattopadhyay, S. (2014). Higher order spectra analysis of breast thermograms for the automated identification of breast cancer. Expert Systems, (1), 37–47. DOI 10.1111/j.1468-0394.2012.00654.x. [CrossRef]
  14. Araujo, M. C., Lima, R. C. F., & DeSouza, R. M. C. R. (2014). Interval symbolic feature extraction for thermography breast cancer detection. Expert Systems with Applications, 41(15), 6728-6737. [Google Scholar] [CrossRef]
  15. Etehadtavakol, M., Chandran, V., Ng, E. Y. K., & Kafieh, R. (2013). Breast cancer detection from thermal images using bispectral invariant features. International Journal of Thermal Sciences, 69, 21-36. [Google Scholar] [CrossRef]
  16. Naik, P. A., Zu, J., & Owolabi, K. M. (2020). Global dynamics of a fractional order model for the transmission of HIV epidemic with optimal control. Chaos Solitons Fractals, 138(2), 109826. [Google Scholar] [CrossRef]
  17. Naik, P. A., Yavuz, M., Qureshi, S., Zu, J., & Townley, S. (2020). Modeling and analysis of COVID-19 epidemics with treatment in fractional derivatives using real data from Pakistan. European Physical Journal Plus, 135(10), 795. [Google Scholar] [CrossRef]
  18. Owolabi, K. M. (2016). Numerical solution of diffusive HBV model in a fractional medium. Springer Plus, 5(1), 1643. [Google Scholar] [CrossRef]
  19. Tian, D., Xue, D., Chen, D., Sun, S. (2013). A fractional-order regulatory CV model for brain MR image segmentation. Control and Decision Conference, pp. 37–40. DOI 10.1109/CCDC.2013.6560890. [CrossRef]
  20. Etehadtavakol, M., Chandran, V., Ng, E. Y. K., & Rabbani, H. (2013). Separable and non-separable discrete wavelet transform based texture features and image classification of breast thermograms. Infrared Physics & Technology, 61(5), 274-286. [Google Scholar] [CrossRef]
  21. Francis, S. V., & Sasikala, M. (2013). Automatic detection of abnormal breast thermograms using asymmetry analysis of texture features. Journal of Medical Engineering and Technology, 37(1), 17-21. [Google Scholar] [CrossRef]
  22. Suganthi, S., & Ramakrishnan, S. (2014). Anisotropic diffusion filter-based edge enhancement for segmentation of breast thermogram using level sets. Biomedical Signal Processing and Control, 10(5), 128-136. [Google Scholar] [CrossRef]
  23. Francis, S. V., Sasikala, M., & Saranya, S. (2014). Detection of breast abnormality from thermograms using curvelet transform based feature extraction. Journal of Medical Systems, 38(4), 23. [Google Scholar] [CrossRef]
  24. Raghvendra, U., Acharya, U. R., Ng, E. Y. K., Tan, J. H., & Gudigar, A. (2016). An integrated index for breast cancer identification using histogram of oriented gradient and kernel locality preserving projection. Quantitative Infrared Thermography Journal, 13(2), 195-209. [Google Scholar] [CrossRef]
  25. Garduno-Ramon, M. A., Vega-Mancilla, S. G., Morales-Henandez, L. A., & Osomio-Rios, R. A. (2017). Supportive noninvasive tool for the diagnosis of breast cancer using a thermographic camera as sensor. Sensors (Switzerland), 17(3), 497. [Google Scholar] [CrossRef]
  26. Gogoi, U. R., Bhowmik, M. K., Bhattacharjee, D., & Ghosh, A. K. (2018). Singular value based characterization and analysis of thermal patches for early breast abnormality detection. Australian Physical & Engineering Sciences in Medicine, 41(4), 861-879. [Google Scholar] [CrossRef]
  27. Lin, C. L., Chang, Y., Kuo, C., Huang, H., Jian, E. (2010). A fast-denoising approach to corrupted infrared images. International Conference on System Science and Engineering, pp. 207–211. DOI 10.1109/ICSSE.2010.5551743. [CrossRef]
  28. Image processing and image analyses applied to mastology. 2020. https://visual.ic.uff.br/en/proeng/.
  29. Pu, Y. F., Zhou, J. L., & Yuan, X. (2010). Fractional differential mask: A fractional differential-based approach for multiscale texture enhancement. IEEE Transactions on Image Processing, 19(2), 491-511. [Google Scholar] [CrossRef]
  30. Rong, S., Zhou, H., Zhao, D., Cheng, K., & Qian, K. (2018). Infrared fix pattern noise reduction method based on shearlet transform. Infrared Physics Technology, 91(5), 243-249. [Google Scholar] [CrossRef]
  31. Chebbah, N. K., Ouslim, M., & Temmar, R. (2018). A new approach for breast abnormality detection based on thermography. Medical Technologies Journal, 2(3), 257-265. [Google Scholar] [CrossRef]
  32. Singh, D., & Singh, A. K. (2020). Role of image thermography in early breast cancer detection-past, present and future. Computer Methods and Programms Biology, 183, 105074. [Google Scholar] [CrossRef]
  33. Zuluaga-Gomez, J., Masry, Z. A., Benaggoune, K., Meraghni, S., & Zerhouni, N. (2020). A CNN-based methodology for breast cancer diagnosis using thermal images. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 9(2), 131-145. [Google Scholar] [CrossRef]
  34. Sanchez-Ruiz, D., Olmos-Pineda, I., & Olvera-Lopez, J. A. (2020). Automatic region of interest segmentation for breast thermogram image classification. Pattern Recognition Letters, 135(10), 72-81. [Google Scholar] [CrossRef]
  35. Clausi, D. A. (2002). An analysis of co-occurrence texture statistics as a function of grey level quantization. Canadian Journal of Remote Sensing, 28(1), 45-62. [Google Scholar] [CrossRef]
  36. Dasarathy, B. V., & Holder, E. B. (1991). Image characterizations based on joint gray-level run-length distributions. Pattern Recognition Letters, 12(8), 497-502. [Google Scholar] [CrossRef]
  37. Ojala, T., Pietikainen, M., & Maenpaa, T. (2002). Multiresolution gray scale and rotation invariant texture classification with local binary patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(7), 971-987. [Google Scholar] [CrossRef]
  38. Dalal, N., Triggs, B. (2005). Histograms of oriented gradients for human detection. Conference on Computer Vision and Pattern Recognition, pp. 886–893. DOI 10.1109/CVPR.2005.177. [CrossRef]
  39. Agarwal, P., Deniïz, S., Jain, S., Alderremy, A. A., & Aly, S. (2020). A new analysis of a partial differential equation arising in biology and population genetics via semi analytical techniques. Physica A: Statistical Mechanics and its Applications, 542(1–2), 122769. [Google Scholar] [CrossRef]
  40. Rehman, A., Singh, R., & Agarwal, P. (2021). Modeling, analysis and prediction of new variants of covid-19 and dengue co-infection on complex network. Chaos Solitons & Fractals, 150(1), 111008. [Google Scholar] [CrossRef]
  41. Günay, B., Agarwal, P., Guirao, J. L. G., & Momani, S. (2021). A Fractional approach to a computational eco-epidemiological model with holling type-ii functional response. Symmetry, 13(7), 1159. [Google Scholar] [CrossRef]
  42. Agarwal, P., Singh, R., & Rehman, A. (2021). Numerical solution of hybrid mathematical model of dengue transmission with relapse and memory via Adam-Bashforth–Moulton predictor-corrector scheme. Chaos, Solitons & Fractals, 143(1), 110564. [Google Scholar] [CrossRef]
  43. Agarwal, P., & Singh, R. (2020). Modelling of transmission dynamics of Nipah virus (Niv): A fractional order approach. Physica A: Statistical Mechanics and its Applications, 547(1), 124243. [Google Scholar] [CrossRef]

Appendix A

$\text{Mean}\ (\mu)=\sum_{k=0}^{M-1}k\,P(k)$ (A.1)

$\text{Standard Deviation}\ (\sigma)=\left[\sum_{k=0}^{M-1}(k-\mu)^{2}P(k)\right]^{1/2}$ (A.2)

$\text{Variance}\ (\sigma^{2})=\sum_{k=0}^{M-1}(k-\mu)^{2}P(k)$ (A.3)

$\text{Skewness}\ (\mu_{3})=\sigma^{-3}\sum_{k=0}^{M-1}(k-\mu)^{3}P(k)$ (A.4)

$\text{Kurtosis}\ (\mu_{4})=\sigma^{-4}\sum_{k=0}^{M-1}(k-\mu)^{4}P(k)-3$ (A.5)

$\text{Energy}\ (E)=\sum_{k=0}^{M-1}\left[P(k)\right]^{2}$ (A.6)

$\text{Entropy}\ (H)=-\sum_{k=0}^{M-1}P(k)\log_{2}\left[P(k)\right]$ (A.7)

where M is the maximum gray-level value in the image and P(k) is the probability of gray level k, given by P(k) = n(k)/N, where n(k) is the number of pixels with gray level k and N is the total number of pixels in the image.

Appendix B

$\text{Angular second moment (Energy)}=\sum_{i=1}^{M}\sum_{j=1}^{M}(C_{ij})^{2}$ (B.1)

$\text{Contrast}=\sum_{n=0}^{M-1}n^{2}\left\{\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\right\}_{|i-j|=n}$ (B.2)

$\text{Correlation}=\frac{\sum_{i=1}^{M}\sum_{j=1}^{M}(i\,j)\,C_{ij}-\mu_{x}\mu_{y}}{\sigma_{x}\sigma_{y}}$ (B.3)

$\text{Entropy}=-\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\log_{2}C_{ij}$ (B.4)

$\text{Inverse difference}=\sum_{i=1}^{M}\sum_{j=1}^{M}\frac{C_{ij}}{1+|i-j|}$ (B.5)

$\text{Inverse difference moment (homogeneity)}=\sum_{i=1}^{M}\sum_{j=1}^{M}\frac{C_{ij}}{1+(i-j)^{2}}$ (B.6)

$\text{Inverse difference moment normalized}=\sum_{i=1}^{M}\sum_{j=1}^{M}\frac{C_{ij}}{1+|i-j|/M}$ (B.7)

$\text{Inverse difference normalized}=\sum_{i=1}^{M}\sum_{j=1}^{M}\frac{C_{ij}}{1+\left((i-j)/M\right)^{2}}$ (B.8)

$\text{Autocorrelation}=\sum_{i=1}^{M}\sum_{j=1}^{M}(i\,j)\,C_{ij}$ (B.9)

$\text{Dissimilarity}=\sum_{i=1}^{M}\sum_{j=1}^{M}|i-j|\,C_{ij}$ (B.10)

$\text{Cluster prominence}=\sum_{i=1}^{M}\sum_{j=1}^{M}(i+j-\mu_{x}-\mu_{y})^{4}\,C_{ij}$ (B.11)

$\text{Cluster shade}=\sum_{i=1}^{M}\sum_{j=1}^{M}(i+j-\mu_{x}-\mu_{y})^{3}\,C_{ij}$ (B.12)

$\text{Maximum probability}=\max_{i,j}C_{ij}$ (B.13)

$\text{Sum average}=\sum_{i=2}^{2M}i\,C_{x+y}(i)$ (B.14)

$\text{Sum entropy}=-\sum_{i=2}^{2M}C_{x+y}(i)\log_{2}C_{x+y}(i)$ (B.15)

$\text{Sum of squares}=\sum_{i=1}^{M}\sum_{j=1}^{M}(i-\mu)^{2}\,C_{ij}$ (B.16)

$\text{Sum variance}=\sum_{i=2}^{2M}(i-\text{Sum entropy})^{2}\,C_{x+y}(i)$ (B.17)

$\text{Difference entropy}=-\sum_{i=0}^{M-1}C_{x-y}(i)\log_{2}C_{x-y}(i)$ (B.18)

$\text{Difference variance}=\text{variance of }C_{x-y}$ (B.19)

$\text{Information measure of correlation 1}=\frac{H_{xy}-H_{xy1}}{\max(H_{x},H_{y})}$ (B.20)

$\text{Information measure of correlation 2}=\left[1-\exp\left(-2.0\,(H_{xy2}-H_{xy})\right)\right]^{1/2}$ (B.21)

where $C_{ij}$ is the (i, j)-th entry of the normalized GLCM, and the marginal distributions, means, and standard deviations of rows and columns are given by:

$C_{x}(i)=\sum_{j=1}^{M}C(i,j),\qquad C_{y}(j)=\sum_{i=1}^{M}C(i,j),$

$C_{x+y}(k)=\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\ (i+j=k;\ k=2,3,\ldots,2M),\qquad C_{x-y}(k)=\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\ (|i-j|=k;\ k=0,1,\ldots,M-1),$

$\mu_{x}=\sum_{i=1}^{M}\sum_{j=1}^{M}i\,C_{ij},\qquad \mu_{y}=\sum_{i=1}^{M}\sum_{j=1}^{M}j\,C_{ij},\qquad \sigma_{x}^{2}=\sum_{i=1}^{M}\sum_{j=1}^{M}(i-\mu_{x})^{2}\,C_{ij},$

$\sigma_{y}^{2}=\sum_{i=1}^{M}\sum_{j=1}^{M}(j-\mu_{y})^{2}\,C_{ij},$

$H_{x}$ and $H_{y}$ are the entropies of $C_{x}$ and $C_{y}$,

$H_{xy}=-\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\log_{2}C_{ij},\qquad H_{xy1}=-\sum_{i=1}^{M}\sum_{j=1}^{M}C_{ij}\log_{2}\{C_{x}(i)C_{y}(j)\},$

$H_{xy2}=-\sum_{i=1}^{M}\sum_{j=1}^{M}C_{x}(i)C_{y}(j)\log_{2}\{C_{x}(i)C_{y}(j)\}$

Appendix C

$\text{Short run emphasis (SRE)}=\sum_{i=1}^{M}\sum_{l=1}^{L}\frac{r(i,l\,|\,\theta)}{l^{2}}\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.1)

$\text{Long run emphasis (LRE)}=\sum_{i=1}^{M}\sum_{l=1}^{L}l^{2}\,r(i,l\,|\,\theta)\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.2)

$\text{Gray level non-uniformity (GLN)}=\sum_{i=1}^{M}\left(\sum_{l=1}^{L}r(i,l\,|\,\theta)\right)^{2}\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.3)

$\text{Run length non-uniformity (RLN)}=\sum_{l=1}^{L}\left(\sum_{i=1}^{M}r(i,l\,|\,\theta)\right)^{2}\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.4)

$\text{Run percentage (RP)}=\frac{1}{N}\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.5)

$\text{Low gray level run emphasis (LGLRE)}=\sum_{i=1}^{M}\sum_{l=1}^{L}\frac{r(i,l\,|\,\theta)}{i^{2}}\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.6)

$\text{High gray level run emphasis (HGLRE)}=\sum_{i=1}^{M}\sum_{l=1}^{L}i^{2}\,r(i,l\,|\,\theta)\bigg/\sum_{i=1}^{M}\sum_{l=1}^{L}r(i,l\,|\,\theta)$ (C.7)

where M and N are the total numbers of gray levels and pixels in the image, respectively, and L is the longest run length.

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.