Search Results (4)
  • Open Access

    ARTICLE

    Curve Classification Based on Mean-Variance Feature Weighting and Its Application

    Zewen Zhang1, Sheng Zhou1, Chunzheng Cao1,2,*

    CMC-Computers, Materials & Continua, Vol.79, No.2, pp. 2465-2480, 2024, DOI:10.32604/cmc.2024.049605 - 15 May 2024

    Abstract The classification of functional data has drawn much attention in recent years. The main challenge is representing infinite-dimensional functional data by finite-dimensional features while utilizing those features to achieve better classification accuracy. In this paper, we propose a mean-variance-based (MV) feature weighting method for classifying functional data or functional curves. In the feature extraction stage, each sample curve is approximated by B-splines to transfer features to the coefficients of the spline basis. After that, a feature weighting approach based on statistical principles is introduced by comprehensively considering the between-class differences and within-class variations of the…
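The weighting idea described in this abstract can be sketched in a few lines. This is a minimal, hypothetical simplification, not the paper's exact estimator (`mv_feature_weights` is an illustrative name): after each curve is reduced to basis coefficients, every coefficient is scored by its between-class variance relative to its within-class variation, and the scores are normalised into feature weights.

```python
import numpy as np

def mv_feature_weights(X, y):
    """Score each feature (e.g. a B-spline coefficient) by between-class
    separation over within-class spread, then normalise the scores.
    X: (n_samples, n_features) coefficient matrix; y: integer class labels.
    Illustrative simplification of a mean-variance weighting scheme."""
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        class_mean = Xc.mean(axis=0)
        between += len(Xc) * (class_mean - overall_mean) ** 2
        within += ((Xc - class_mean) ** 2).sum(axis=0)
    w = between / np.maximum(within, 1e-12)  # discriminative features score high
    return w / w.sum()                       # weights sum to 1
```

Features whose class means differ sharply while staying tight within each class receive large weights; a weighted distance built from `w` can then drive a standard classifier.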

  • Open Access

    ARTICLE

    Industrial Food Quality Analysis Using New k-Nearest-Neighbour Methods

    Omar Fetitah1, Ibrahim M. Almanjahie2,3, Mohammed Kadi Attouch1,*, Salah Khardani4

    CMC-Computers, Materials & Continua, Vol.67, No.2, pp. 2681-2694, 2021, DOI:10.32604/cmc.2021.015469 - 05 February 2021

    Abstract The problem of predicting continuous scalar outcomes from functional predictors has received high levels of interest in recent years in many fields, especially in the food industry. The k-nearest neighbor (k-NN) method of Near-Infrared Reflectance (NIR) analysis is practical, relatively easy to implement, and becoming one of the most popular methods for conducting food quality analysis based on NIR data. The k-NN is often named the k-nearest neighbor classifier when it is used for classifying categorical variables, and k-nearest neighbor regression when it is applied to predicting noncategorical variables. The objective of this paper is to…
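For discretised spectra sampled on a common grid, the k-NN regression described above reduces to averaging the responses of the k training curves nearest in (approximate) L2 distance. A minimal sketch under that assumption (`knn_regress` is an illustrative name, not from the paper):

```python
import numpy as np

def knn_regress(X_train, y_train, x_new, k=3):
    """Predict a scalar response (e.g. a food quality measure) for a new
    curve by averaging the responses of its k nearest training curves.
    Rows of X_train are curves sampled on a common grid, so the Euclidean
    norm approximates the L2 distance between curves."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()
```

Replacing the mean with a majority vote over the neighbours' labels turns the same routine into the k-NN classifier the abstract contrasts it with.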

  • Open Access

    ARTICLE

    Application of FCM Algorithm Combined with Artificial Neural Network in TBM Operation Data

    Jingyi Fang1, Xueguan Song2, Nianmin Yao3, Maolin Shi2,*

    CMES-Computer Modeling in Engineering & Sciences, Vol.126, No.1, pp. 397-417, 2021, DOI:10.32604/cmes.2021.012895 - 22 December 2020

    Abstract Fuzzy clustering theory is widely used in data mining for full-face tunnel boring machines. However, the traditional fuzzy clustering algorithm based on an objective function struggles to cluster functional data effectively. We propose a new fuzzy clustering algorithm, the FCM–ANN algorithm, which replaces the clustering prototype of the FCM algorithm with the predicted value of an artificial neural network. The algorithm therefore not only satisfies clustering under the traditional similarity criterion but can also effectively cluster functional data. In this paper, we first use the t-test as an evaluation index…
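For context, the baseline the FCM–ANN variant modifies is plain fuzzy c-means, which alternates a prototype update with a membership update; per the abstract, the variant swaps the prototype step for an ANN prediction. A minimal sketch of the baseline (illustrative, not the paper's code):

```python
import numpy as np

def fcm(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means. U[i, j] is the fuzzy membership of sample i in
    cluster j; m > 1 controls fuzziness. An FCM-ANN-style variant would
    replace the `centers` update below with a neural network's prediction."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)                    # rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # prototype update
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)         # membership update
    return centers, U
```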

  • Open Access

    ARTICLE

    The k Nearest Neighbors Estimator of the M-Regression in Functional Statistics

    Ahmed Bachir1,*, Ibrahim Mufrah Almanjahie1,2, Mohammed Kadi Attouch3

    CMC-Computers, Materials & Continua, Vol.65, No.3, pp. 2049-2064, 2020, DOI:10.32604/cmc.2020.011491 - 16 September 2020

    Abstract It is well known that the nonparametric estimation of the regression function is highly sensitive to the presence of even a small proportion of outliers in the data. To solve the problem of atypical observations when the covariates of the nonparametric component are functional, robust estimates for the regression parameter and regression operator are introduced. The main purpose of the paper is to consider data-driven methods of selecting the number of neighbors in order to make the proposed processes fully automatic. We use the…
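The robust k-NN idea in this abstract amounts to replacing the plain average of the k nearest responses with an M-estimate of location. A minimal sketch using Huber weights (a hypothetical simplification; the paper treats functional covariates and a data-driven choice of k):

```python
import numpy as np

def knn_m_regress(X_train, y_train, x_new, k=5, c=1.345, n_iter=50):
    """k-NN regression made robust: the k nearest responses are combined
    via a Huber M-estimate of location instead of a plain mean, so
    outlying responses are down-weighted rather than averaged in."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    yk = y_train[np.argsort(dists)[:k]]
    theta = np.median(yk)                                # robust start
    for _ in range(n_iter):                              # IRLS iterations
        r = np.abs(yk - theta)
        w = np.minimum(1.0, c / np.maximum(r, 1e-12))    # Huber weights
        theta = (w * yk).sum() / w.sum()
    return theta
```

With neighbour responses [1, 1, 1, 1, 100] the plain mean is 20.8, while the Huber estimate stays near 1, which is the sensitivity-to-outliers point the abstract opens with.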
