In machine learning and data mining, feature selection (FS) is a classical and challenging optimization problem. Because its search space grows exponentially with the number of features, FS is treated as an NP-hard problem. The ongoing need for an efficient FS framework, together with the success of swarm intelligence in diverse optimization scenarios, motivated the development of a new FS solution. This paper presents two binary variants of the Hunger Games Search Optimization (HGSO) algorithm, based on V- and S-shaped transfer functions, within a wrapper FS model for choosing the best features from large datasets. The proposed technique transforms the continuous HGSO into binary variants using V- and S-shaped transfer functions (BHGSO-V and BHGSO-S). To validate performance, 16 well-known UCI datasets are considered, and the results are compared with several state-of-the-art metaheuristic binary algorithms. The findings demonstrate that BHGSO-V outperforms the other algorithms in terms of the number of selected features, classification accuracy, run time, and fitness values. The results demonstrate that BHGSO-V can reduce dimensionality and choose the most informative features for classification problems. BHGSO-V achieves about 95% average classification accuracy on most of the datasets, with run times below 5 s for low- and medium-dimensional datasets and below 10 s for high-dimensional datasets.

Due to significant advances in technology, including the spread of the Internet across many areas, numerous databases have been developed recently, and their complexity and diversity have also grown. Nevertheless, high-dimensional databases have drawbacks, including lengthy model-building times, irrelevant or incomplete features, and deteriorated efficiency, all of which make data analysis challenging [

Conventional optimization approaches are incapable of solving complex optimization tasks, and obtaining acceptable results with them is challenging. As a result, a more efficient class of approaches, known as metaheuristic algorithms, has been proposed and adopted by many researchers. Metaheuristic algorithms have several benefits, including ease of use, problem independence, versatility, and gradient-free design [

The authors of [

Therefore, in this paper, a binary variant of a recently proposed Hunger Games Search Optimization (HGSO) algorithm [

A new binary variant of the HGSO algorithm is formulated using different transfer functions.

BHGSO-V and BHGSO-S algorithms are applied to low, medium, and high dimensional FS problems.

The performance of the BHGSO algorithm is compared with other state-of-the-art algorithms.

Statistical tests, such as Friedman’s test and Wilcoxon Signed Rank test, have been conducted.

The structure of the paper is organized as follows. Section 2 explains the basic concepts of the HGSO algorithm. Section 3 explains how the continuous HGSO algorithm is converted to a binary version using V- and S-shaped transfer functions. Section 4 presents the results and further discussion, validating the performance of the proposed algorithm on 16 UCI datasets. Section 5 concludes the paper.

The Hunger Games Search Optimization (HGSO) algorithm was introduced by Yang et al. [

The approach behavior of hungry individuals is mathematically modeled in this subsection. The game instructions are presented in

The hunger behavior of all individuals during the search is mathematically modeled in this subsection. The expression for

Step 1: Initialize the algorithm parameters, such as the population size, the maximum number of iterations, and the control parameters.

Step 2: Initialize the individuals' positions X_i.

Step 3: Find the cost function value of all individuals in the population.

Step 4: Update the best position X_b and its fitness value.

Step 5: Find the weights W_1 and W_2.

Step 6: Find the hunger values of the individuals.

Step 7: Update the positions of all individuals.

Step 8: Return the best position X_b.
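The steps above can be sketched as a generic population loop. This is a simplified illustration only: the placeholder move below stands in for the actual hunger-weighted HGS update equations, and the parameter values are arbitrary.

```python
import numpy as np

def hgs_skeleton(cost, dim, n_pop=10, n_iter=100, seed=0):
    # Simplified population loop mirroring Steps 1-8 above; the real HGS
    # replaces the placeholder move with hunger-weighted W1/W2 updates.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, (n_pop, dim))       # Step 2: positions X_i
    fit = np.apply_along_axis(cost, 1, X)      # Step 3: evaluate population
    best = X[fit.argmin()].copy()              # Step 4: best position X_b
    best_fit = fit.min()
    for _ in range(n_iter):
        # placeholder move toward the current best (stand-in for Steps 5-7)
        step = rng.normal(0, 0.1, X.shape)
        X = X + 0.5 * (best - X) + step
        fit = np.apply_along_axis(cost, 1, X)
        if fit.min() < best_fit:               # keep the best-so-far
            best_fit = fit.min()
            best = X[fit.argmin()].copy()
    return best, best_fit                      # Step 8: return X_b

# usage: minimize a 5-dimensional sphere function
best, best_fit = hgs_skeleton(lambda x: float(np.sum(x**2)), dim=5)
```

The skeleton converges toward the origin on the sphere function, showing only the control flow that the binary variants inherit.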

The HGSO algorithm is a recently developed population-based algorithm that imitates hunger-driven hunting behavior for food. In terms of local-optima avoidance, exploitation, exploration, and convergence, the HGSO algorithm outperforms other population-based algorithms, and its inventors showed that it performs well on benchmark functions. Owing to a better balance between its exploration and exploitation phases, HGSO offers better convergence and solution diversity. These appealing properties motivated researchers to use the HGSO algorithm in real-world applications, including wrapper-based FS problems.

In the wrapper-based FS process, the classification model is used for training and validation at each phase, and a sophisticated optimization technique is then used to search the space of feature subsets efficiently. Besides, the search space is likely to be highly nonlinear, with numerous local optima. Continuous optimization techniques search for feature combinations that optimize classification efficiency, with populations placed in a d-dimensional search space at positions in [0, 1]. Binary versions are expected to perform well when used similarly, since the search space is restricted to two values per dimension (0 or 1). Furthermore, binary operators are easier to understand than continuous operators [

As discussed, the new positions found through local or global search are continuous, but these positions must be converted into binary form. This transformation is accomplished by passing the continuous position in each dimension through a Sigmoidal (S-shaped) TF, which directs the individual to move in binary space [. The S-shaped TF and the corresponding binary update rule take the standard form

S(x_id^t) = 1 / (1 + exp(-x_id^t)),

x_id^(t+1) = 1 if rand < S(x_id^t), and 0 otherwise,

where x_id^t is the position of the i-th individual in the d-th dimension at iteration t, and rand is a uniform random number in [0, 1].

Rather than an S-shaped TF, a V-shaped TF approach is also defined in this paper. A commonly used V-shaped TF is V(x_id^t) = |tanh(x_id^t)|, with the update rule x_id^(t+1) = 1 - x_id^t if rand < V(x_id^t), and x_id^(t+1) = x_id^t otherwise; that is, the V-shaped rule flips the current bit with a probability that grows with the magnitude of the continuous step.
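Both transfer-function families can be sketched in NumPy as follows. This is an illustrative sketch: the V-shaped form |tanh(x)| is one common choice from the literature, and the update rules follow the standard S- and V-shaped conventions rather than necessarily the paper's exact formulas.

```python
import numpy as np

def s_shaped(x):
    # Sigmoid TF: maps a continuous position to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    # A common V-shaped TF: |tanh(x)| (assumed form, 0 at x = 0)
    return np.abs(np.tanh(x))

def binarize_s(x, rng):
    # S-shaped rule: set the bit when a uniform draw falls below S(x)
    return (rng.random(x.shape) < s_shaped(x)).astype(int)

def binarize_v(x, prev_bits, rng):
    # V-shaped rule: flip the previous bit when the draw falls below V(x)
    flip = rng.random(x.shape) < v_shaped(x)
    return np.where(flip, 1 - prev_bits, prev_bits)

rng = np.random.default_rng(0)
x = np.array([-2.0, 0.0, 2.0])          # continuous positions in 3 dimensions
bits_s = binarize_s(x, rng)             # S-shaped binarization
bits_v = binarize_v(x, np.array([1, 0, 1]), rng)  # V-shaped binarization
```

Note the design difference: the S-shaped rule sets bits from scratch, while the V-shaped rule only decides whether to flip the current bit, which tends to preserve good partial solutions.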

FS is a binary optimization process in which the populations can only choose between 1 and 0. Any solution is represented by a one-dimensional binary vector whose length equals the number of features (N_f); a value of 1 indicates that the corresponding feature is selected, and 0 that it is discarded.

In general, the FS problem is a multi-objective optimization problem in which two objectives conflict: choosing the smallest number of features S_f and achieving the highest classification accuracy. The two objectives are combined into a single fitness function,

Fitness = α · ε + (1 − α) · (S_f / N_f),

where ε is the classification error rate, S_f is the number of selected features, N_f is the total number of features, and α ∈ [0, 1] weights classification quality against the size of the selected subset.
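Under these definitions, the fitness of a candidate subset can be computed as follows. This is a minimal sketch; the weight α = 0.99 is a common choice in the wrapper-FS literature, not necessarily the value used in this paper.

```python
def fs_fitness(error_rate, n_selected, n_total, alpha=0.99):
    # Weighted sum of classification error and feature-subset ratio.
    # alpha close to 1 prioritizes accuracy over dimensionality reduction.
    return alpha * error_rate + (1.0 - alpha) * (n_selected / n_total)

# A subset with 5% error using 10 of 100 features:
f1 = fs_fitness(0.05, 10, 100)
# The same error with all 100 features scores worse (higher fitness):
f2 = fs_fitness(0.05, 100, 100)
```

Because both terms are to be minimized, a lower fitness value is better; the second subset is penalized purely for its size.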

Sixteen benchmark datasets from the UCI data source were selected for experimentation to verify the output of the proposed binary approaches.

Data set | Set No. | Number of features | Number of instances | Dimension | Source
---|---|---|---|---|---
Arrhythmia | DS1 | 279 | 452 | Medium | UCI
Coimbra | DS2 | 10 | 216 | Low | UCI
Breast cancer | DS3 | 10 | 699 | Low | UCI
COIL20 | DS4 | 1024 | 1440 | High | UCI
Colon | DS5 | 2000 | 62 | High | UCI
Glass | DS6 | 10 | 214 | Low | UCI
Heart failure | DS7 | 13 | 299 | Low | UCI
Horse | DS8 | 28 | 368 | Low | UCI
Ionosphere | DS9 | 34 | 351 | Medium | UCI
Leukemia | DS10 | 7070 | 72 | High | UCI
Lung | DS11 | 3312 | 203 | High | UCI
Lymphography | DS12 | 18 | 48 | Low | UCI
ORL | DS13 | 1024 | 400 | High | UCI
TOX-171 | DS14 | 5748 | 171 | High | UCI
Yale | DS15 | 1024 | 165 | High | UCI
Zoo | DS16 | 16 | 101 | Low | UCI

Based on the KNN classifier, a wrapper-based method for FS has been used in this paper, with the best option (
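The wrapper evaluation can be illustrated with a tiny NumPy-only 1-NN classifier on synthetic data. This is a sketch, not the paper's exact setup (which uses KNN with 5-fold cross-validation); it uses a single holdout split and a hypothetical dataset in which only the first feature carries the class signal.

```python
import numpy as np

def knn_error(X_train, y_train, X_test, y_test, mask):
    # Evaluate a binary feature mask: classify with 1-NN on selected features.
    sel = mask.astype(bool)
    if not sel.any():
        return 1.0  # empty subsets are penalized with maximal error
    A, B = X_train[:, sel], X_test[:, sel]
    # pairwise squared distances; nearest training neighbor per test sample
    d = ((B[:, None, :] - A[None, :, :]) ** 2).sum(-1)
    pred = y_train[d.argmin(1)]
    return float((pred != y_test).mean())

rng = np.random.default_rng(1)
# synthetic data: only feature 0 is informative for the class label
y = rng.integers(0, 2, 200)
X = rng.normal(0, 1, (200, 5))
X[:, 0] += 3 * y
Xtr, Xte, ytr, yte = X[:150], X[150:], y[:150], y[150:]
err_good = knn_error(Xtr, ytr, Xte, yte, np.array([1, 0, 0, 0, 0]))
err_all = knn_error(Xtr, ytr, Xte, yte, np.ones(5))
```

Selecting only the informative feature yields a low error, which is exactly the signal the wrapper fitness function feeds back to the optimizer.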

Algorithms | Parameters | Values
---|---|---
Common to all algorithms | K for cross-validation | 5
 | Number of iterations | 100
 | Population size | 10
 | Number of runs | 30
 | Dimensions | Number of features
 | Domain | {0, 1}-Binary
BHGSO | Control parameters | 0.8, 0.2 and 0.8
BEO [ | a1, a2, and GP | 2, 1, and 0.5, respectively
BMPA [ | FADs, P | 0.5, 0.5
BSCA [ | a | 2
BGWO [ | a | [0, 2]
HLBDA [ | Learning rates | 0.4, 0.7
BASO [ | α, β | 50 and 0.2

In each run of all algorithms, the following procedures are applied to the datasets.

Each individual X_i is initialized and its fitness is evaluated on the dataset.

The number of selected features S_f and the classification accuracy of the best solution are computed.

The best fitness, accuracy, and feature count are recorded for each run.

The convergence curves of the proposed BHGSO-V and BHGSO-S algorithms and the other selected algorithms on the sixteen datasets are shown in

The best fitness, mean fitness, worst fitness, and standard deviation (STD) of fitness obtained by all algorithms are presented in

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 1.554E-01 | 1.665E-01 | 1.557E-01 | 1.556E-01 | 1.920E-01 | 2.023E-01 | 1.909E-01 |
DS2 | ||||||||
DS3 | 3.861E-02 |||||||
DS4 | 4.883E-04 | 8.008E-04 | 4.854E-03 | 4.092E-03 | 1.479E-02 | 1.511E-02 | 1.145E-02 |
DS5 | 8.000E-05 | 6.500E-05 | 3.000E-05 | 1.695E-01 | 1.693E-01 | 8.709E-02 ||
DS6 | ||||||||
DS7 | 2.548E-01 |||||||
DS8 | 7.143E-04 | 7.143E-04 | 4.247E-02 | 1.786E-03 |||||
DS9 | 2.976E-02 | 2.946E-02 | 2.917E-02 | 3.034E-02 | 4.478E-02 | 7.307E-02 | 5.863E-02 |
DS10 | 1.683E-04 | 3.678E-05 | 2.546E-05 | 5.516E-05 | 1.461E-01 | 1.460E-01 | 1.460E-01 |
DS11 | 2.492E-02 | 2.489E-02 | 2.484E-02 | 2.490E-02 | 2.946E-02 | 2.942E-02 | 2.949E-02 |
DS12 | 1.046E-01 | 7.216E-02 | 1.046E-01 | 1.057E-01 |||||
DS13 | 6.338E-02 | 3.881E-02 | 5.076E-02 | 5.205E-02 | 7.885E-02 | 7.874E-02 | 6.646E-02 |
DS14 | 3.077E-02 | 6.005E-02 | 3.113E-02 | 9.027E-02 | 9.226E-02 | 1.211E-01 | 6.302E-02 |
DS15 | 2.105E-01 | 2.412E-01 | 1.810E-01 | 2.405E-01 | 2.746E-01 | 2.746E-01 | 2.744E-01 |
DS16 | 1.250E-03

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 1.859E-01 | 1.739E-01 | 1.834E-01 | 1.986E-01 | 2.224E-01 | 2.242E-01 | 2.005E-01 |
DS2 | 6.905E-01 | 7.518E-01 | 6.469E-01 |||||
DS3 | 2.049E-02 | 3.249E-02 | 4.144E-02 | 2.374E-02 |||||
DS4 | 8.506E-03 | 4.398E-03 | 4.984E-03 | 1.026E-02 | 1.571E-02 | 2.103E-02 | 1.496E-02 |
DS5 | 6.000E-05 | 1.030E-04 | 1.140E-04 | 9.800E-05 | 1.696E-01 | 1.860E-01 | 1.531E-01 |
DS6 | 1.200E-03 | 2.600E-03 ||||||
DS7 | 2.564E-01 | 4.833E-01 | 2.329E-01 |||||
DS8 | 1.263E-01 | 1.456E-02 | 1.571E-03 | 1.930E-01 | 4.026E-02 ||||
DS9 | 4.649E-02 | 4.466E-02 | 3.494E-02 | 3.541E-02 | 5.992E-02 | 9.127E-02 | 5.963E-02 |
DS10 | 4.260E-02 | 4.498E-05 | 4.328E-05 | 2.836E-02 | 1.462E-01 | 1.461E-01 | 1.460E-01 |
DS11 | 3.488E-02 | 2.534E-02 | 2.498E-02 | 2.541E-02 | 3.943E-02 | 5.411E-02 | 4.427E-02 |
DS12 | 9.209E-02 | 1.192E-01 | 9.187E-02 | 1.059E-01 | 1.057E-01 | 1.067E-01 | 1.064E-01 |
DS13 | 4.787E-02 | 6.483E-02 | 5.614E-02 | 5.721E-02 | 8.640E-02 | 9.117E-02 | 7.652E-02 |
DS14 | 8.429E-02 | 6.505E-02 | 6.044E-02 | 1.021E-01 | 1.156E-01 | 1.446E-01 | 1.271E-01 |
DS15 | 2.346E-01 | 2.601E-01 | 2.109E-01 | 2.597E-01 | 2.747E-01 | 3.224E-01 | 2.746E-01 |
DS16 | 7.500E-04 | 2.250E-03 | 7.500E-04

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 2.107E-02 | 3.408E-02 | 1.455E-02 | 3.436E-02 | 2.098E-02 | 1.334E-02 | 9.430E-03 |
DS2 | 1.190E-01 | 1.393E-01 | 9.670E-02 |||||
DS3 | 7.265E-03 | 8.981E-03 | 3.753E-03 | 8.898E-03 |||||
DS4 | 2.812E-03 | 2.567E-03 | 2.937E-03 | 5.839E-03 | 2.719E-03 | 4.351E-03 | 2.436E-03 |
DS5 | 2.329E-05 | 4.962E-05 | 3.818E-05 | 7.538E-05 | 1.153E-04 | 3.686E-02 | 3.688E-02 |
DS6 | 4.472E-04 | 1.517E-03 ||||||
DS7 | 8.392E-02 | 2.924E-01 | 1.446E-02 |||||
DS8 | 1.148E-01 | 3.176E-02 | 8.222E-04 | 1.153E-01 | 4.168E-02 ||||
DS9 | 1.166E-02 | 1.075E-02 | 6.001E-03 | 7.643E-03 | 1.290E-02 | 1.045E-02 | 1.883E-02 |
DS10 | 6.323E-02 | 2.801E-05 | 3.696E-05 | 3.870E-02 | 2.936E-05 | 3.334E-05 | 2.716E-05 |
DS11 | 4.689E-04 | 1.357E-02 | 1.115E-02 | 5.736E-04 | 1.346E-02 | 1.746E-02 | 1.346E-02 |
DS12 | 1.821E-02 | 1.891E-02 | 1.830E-02 | 1.799E-02 | 2.435E-02 | 8.784E-04 | 2.457E-02 |
DS13 | 5.532E-03 | 8.610E-03 | 6.835E-03 | 6.365E-03 | 6.811E-03 | 1.240E-02 | 1.052E-02 |
DS14 | 2.398E-02 | 2.472E-02 | 2.039E-02 | 1.570E-02 | 2.436E-02 | 2.437E-02 | 3.800E-02 |
DS15 | 1.667E-02 | 2.512E-02 | 4.037E-02 | 2.994E-02 | 1.748E-02 | 2.673E-02 | 1.559E-04 |
DS16 | 2.795E-04 | 9.479E-04 | 2.795E-04

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 0.813 | 0.827 | 0.818 | 0.802 | 0.780 | 0.778 | 0.802 |
DS2 | 0.357 | 0.304 | 0.243 | 0.348 |||||
DS3 | 0.983 | 0.977 | 0.971 | 0.963 | 0.980 ||||
DS4 | 0.993 | 0.996 | 0.992 | 0.989 | 0.983 | 0.990 |||
DS5 | 0.833 | 0.817 | 0.850 ||||||
DS6 | ||||||||
DS7 | 0.780 | 0.776 | 0.780 | 0.742 | 0.780 | 0.515 | 0.766 |
DS8 | 0.874 | 0.986 | 0.808 | 0.962 |||||
DS9 | 0.957 | 0.960 | 0.954 | 0.943 | 0.911 | 0.943 |||
DS10 | 0.957 | 0.971 | 0.857 | 0.857 | 0.857 ||||
DS11 | 0.965 | 0.975 | 0.975 | 0.975 | 0.965 | 0.950 | 0.960 |
DS12 | 0.883 | 0.897 | 0.897 | 0.897 | 0.897 | 0.897 |||
DS13 | 0.938 | 0.948 | 0.945 | 0.945 | 0.918 | 0.913 | 0.928 |
DS14 | 0.935 | 0.918 | 0.941 | 0.900 | 0.888 | 0.859 | 0.876 |
DS15 | 0.764 | 0.739 | 0.788 | 0.739 | 0.727 | 0.679 | 0.727 |
DS16 |

The average accuracy and STD of the classification accuracy values obtained by all algorithms are listed in

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 2.137E-02 | 3.388E-02 | 1.491E-02 | 3.461E-02 | 2.137E-02 | 1.361E-02 | 9.296E-03 |
DS2 | 7.778E-02 | 1.191E-01 | 1.395E-01 | 9.722E-02 |||||
DS3 | 6.435E-03 | 7.881E-03 | 8.811E-03 | 3.217E-03 | 7.881E-03 ||||
DS4 | 2.455E-03 | 2.905E-03 | 2.455E-03 | 2.905E-03 | 4.658E-03 | 4.527E-03 | 2.455E-03 |
DS5 | 3.727E-02 | 3.727E-02 |||||||
DS6 | ||||||||
DS7 | 7.580E-03 | 8.338E-02 | 2.942E-01 | 1.418E-02 |||||
DS8 | 1.152E-01 | 3.063E-02 | 1.158E-01 | 4.155E-02 |||||
DS9 | 1.010E-02 | 6.389E-03 | 7.825E-03 | 1.278E-02 | 1.010E-02 | 1.863E-02 | 1.195E-02 |
DS10 | 6.389E-02 | 3.912E-02 |||||||
DS11 | 1.369E-02 | 1.118E-02 | 1.369E-02 | 1.768E-02 | 1.369E-02 ||||
DS12 | 1.889E-02 | 1.889E-02 | 2.438E-02 | 1.889E-02 | 2.438E-02 | 2.438E-02 |||
DS13 | 5.590E-03 | 1.046E-02 | 6.847E-03 | 6.847E-03 | 6.847E-03 | 1.250E-02 | 1.046E-02 |
DS14 | 2.461E-02 | 2.461E-02 | 1.315E-02 | 2.080E-02 | 2.461E-02 | 2.461E-02 | 3.835E-02 |
DS15 | 1.660E-02 | 2.535E-02 | 4.066E-02 | 3.030E-02 | 1.660E-02 | 2.710E-02 ||
DS16 |

The mean and STD of the number of features selected from the datasets are shown in

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 85.00 | 65.60 | 56.80 | 79.40 | 128.20 | 118.20 | 130.00 |
DS2 | 1.80 | 1.80 | 2.80 | 1.20 |||||
DS3 | 3.40 | 4.40 | 4.00 | 4.40 | 3.80 ||||
DS4 | 167.00 | 100.60 | 88.00 | 206.20 | 482.20 | 463.60 | 476.40 |
DS5 | 22.80 | 12.00 | 22.00 | 19.60 | 922.60 | 897.60 | 912.00 |
DS6 | 2.60 | 1.20 | 2.60 ||||||
DS7 | 1.20 | 1.80 | 4.40 | 1.80 |||||
DS8 | 4.40 | 2.00 | 2.80 | 4.40 | 8.80 | 6.40 |||
DS9 | 7.60 | 4.20 | 5.00 | 11.40 | 12.20 | 10.40 |||
DS10 | 121.60 | 34.00 | 30.60 | 52.40 | 3344.40 | 3268.00 | 3234.40 |
DS11 | 195.40 | 75.60 | 116.80 | 219.20 | 1584.60 | 1525.40 | 1547.40 |
DS12 | 6.00 | 5.60 | 5.60 | 6.20 | 6.00 | 7.80 | 7.20 |
DS13 | 303.00 | 151.00 | 172.80 | 282.80 | 484.20 | 465.00 | 485.80 |
DS14 | 1588.00 | 833.60 | 1266.40 | 1778.60 | 2826.60 | 2758.20 | 2781.80 |
DS15 | 211.60 | 134.60 | 87.80 | 176.40 | 484.60 | 454.20 | 471.40 |
DS16 | 1.20 | 2.60 | 3.60 | 1.20

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 22.58 | 14.54 | 14.04 | 35.27 | 8.50 | 8.50 | 14.58 |
DS2 | 0.84 | 1.10 | 1.30 | 0.45 |||||
DS3 | 0.89 | 1.34 | 1.00 | 1.34 | 1.10 ||||
DS4 | 48.73 | 11.67 | 17.13 | 55.04 | 131.79 | 14.82 | 15.65 |
DS5 | 4.66 | 9.92 | 8.28 | 15.08 | 23.06 | 26.54 | 41.65 |
DS6 | 1.14 | 0.45 | 1.52 ||||||
DS7 | 0.45 | 1.79 | 1.67 | 0.84 |||||
DS8 | 3.29 | 1.00 | 4.02 | 2.30 | 2.49 | 1.67 |||
DS9 | 3.21 | 1.14 | 2.17 | 1.41 | 3.29 | 2.59 | 2.51 |
DS10 | 19.81 | 16.96 | 26.13 | 35.80 | 20.76 | 23.57 | 19.20 |
DS11 | 155.31 | 83.44 | 35.77 | 189.96 | 56.74 | 29.67 | 40.43 |
DS12 | 1.58 | 1.30 | 1.34 | 1.48 | 1.58 | 2.49 | 0.84 |
DS13 | 135.91 | 19.38 | 32.25 | 29.41 | 74.62 | 13.69 | 16.72 |
DS14 | 314.82 | 274.09 | 171.23 | 510.41 | 286.04 | 54.99 | 50.80 |
DS15 | 62.76 | 32.45 | 23.53 | 110.02 | 15.88 | 10.76 | 15.96 |
DS16 | 0.45 | 1.34 | 1.52 | 0.45

A few statistical techniques are used to assess the proposed BHGSO algorithm's efficacy. Several non-parametric statistical tests are discussed in the literature. The statistical analysis in this study is divided into two parts. First, the Friedman rank test (FRT) is used to assess the accuracy of all algorithms; it reveals a substantial difference among them. The FRT of all algorithms is shown in
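The two tests can be reproduced with SciPy on an accuracy matrix of datasets by algorithms. The data below are synthetic and purely illustrative, not the paper's results; only the test calls themselves reflect the procedure described here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# hypothetical accuracy matrix: 16 datasets (rows) x 3 algorithms (columns),
# where the first algorithm is constructed to outperform the others
base = rng.uniform(0.7, 0.9, (16, 1))
acc = np.hstack([base + 0.05, base, base - 0.05]) + rng.normal(0, 0.01, (16, 3))

# Friedman rank test: do the algorithms differ across the datasets?
f_stat, f_p = stats.friedmanchisquare(acc[:, 0], acc[:, 1], acc[:, 2])

# Pairwise Wilcoxon signed-rank test between the first two algorithms
w_stat, w_p = stats.wilcoxon(acc[:, 0], acc[:, 1])
```

A small Friedman p-value justifies proceeding to pairwise Wilcoxon comparisons, mirroring the two-stage analysis used in this study.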

Algorithm | BHGSO-V | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---|---
DS1 | 3.8 | 2.8 | 3.8 | 5 | 7 | 6.6 | 5.2 |
DS2 | 3.8 | 3.8 | 3.8 | 5.4 | 3.8 | 7.2 | 4.4 |
DS3 | 3.3 | 3.3 | 4 | 6.4 | 3.3 | 7.8 | 4.6 |
DS4 | 2.2 | 2.6 | 4.2 | 4.4 | 6.4 | 7.4 | 6.8 |
DS5 | 2.6 | 3.8 | 4.2 | 3.2 | 7.4 | 7 | 6.6 |
DS6 | 4.1 | 4.1 | 4.1 | 4.9 | 4.1 | 6.5 | 4.1 |
DS7 | 3.6 | 3.6 | 3.6 | 4.5 | 3.6 | 7.8 | 5.7 |
DS8 | 6.5 | 2.4 | 2.4 | 3.4 | 5.1 | 7.6 | 6.2 |
DS9 | 2.5 | 4.7 | 3.8 | 2.9 | 6.4 | 8 | 6 |
DS10 | 4.8 | 2.6 | 2 | 4.2 | 8 | 7 | 6 |
DS11 | 4 | 2.6 | 3.6 | 3.6 | 6.8 | 7.2 | 6.4 |
DS12 | 2.6 | 6 | 3.6 | 4.5 | 5.5 | 5.4 | 6 |
DS13 | 3.4 | 2.2 | 4.4 | 3.8 | 7.4 | 7 | 6.6 |
DS14 | 2.2 | 3.2 | 3.8 | 5 | 6.8 | 7.4 | 6.4 |
DS15 | 2.8 | 2.2 | 4.2 | 4 | 6.7 | 8 | 6.3 |
DS16 | 3.8 | 3.8 | 4.5 | 3.8 | 3.8 | 8 | 4.5 |
Average | 2.42 | 3.84 | 2.94 | 3.75 | 4.31 | 5.76 | 7.24 | 5.74
Overall Rank | 1 | 4 | 2 | 3 | 5 | 7 | 8 | 6

Algorithm | BHGSO-S | BEO | BMPA | HLBDA | BGWO | BSCA | BASO
---|---|---|---|---|---|---|---
DS1 | 1 | 0.625 | 0.3125 | 0.4375 | 0.125 | 0.1875 | 0.4375
DS2 | 1 | 1 | 1 | 0.5 | 0.5 | 0.125 | 1
DS3 | 1 | 1 | 1 | 0.25 | 1 | 0.0625 | 1
DS4 | 0.0625 | 0.125 | 0.0625 | 0.625 | 0.0625 | 0.0625 | 0.0625
DS5 | 0.125 | 0.6875 | 0.0625 | 0.625 | 0.0625 | 0.0625 | 0.0625
DS6 | 1 | 1 | 1 | 1 | 1 | 0.25 | 1
DS7 | 1 | 1 | 1 | 1 | 1 | 0.0625 | 0.25
DS8 | 0.0625 | 1 | 1 | 1 | 0.0625 | 0.0625 | 0.0625
DS9 | 0.875 | 0.0625 | 0.3125 | 0.3125 | 0.125 | 0.0625 | 0.0625
DS10 | 0.0625 | 0.625 | 0.625 | 0.0625 | 0.0625 | 0.0625 | 0.0625
DS11 | 0.8125 | 0.125 | 0.125 | 1 | 0.0625 | 0.0625 | 0.0625
DS12 | 0.125 | 0.625 | 0.875 | 0.75 | 0.25 | 0.25 | 0.125
DS13 | 0.0625 | 0.0625 | 0.125 | 0.3125 | 0.0625 | 0.0625 | 0.0625
DS14 | 0.3125 | 0.0625 | 0.3125 | 0.3125 | 0.0625 | 0.0625 | 0.125
DS15 | 0.1875 | 0.125 | 0.0625 | 0.8125 | 0.0625 | 0.0625 | 0.0625
DS16 | 1 | 1 | 1 | 1 | 1 | 0.0625 | 1

The proposed BHGSO algorithm was determined to be the best FS method based on the findings. The BHGSO-V algorithm enhanced global-optimum detection on the complex databases in terms of convergence and minimal feature selection. The proposed BHGSO algorithms are able to sustain a decent acceleration over the iterations. The results in

Binary versions of the HGSO algorithm are introduced in this study and used to address FS problems in wrapper form. Using either S-shaped or V-shaped TFs, the continuous variant of HGSO is converted into a binary variant. The proposed techniques can be used for FS in machine learning to evaluate various algorithms' search abilities. The FS problem is expressed as a multi-objective problem whose objective function reflects both dimensionality reduction and classification accuracy. To evaluate the output, 16 datasets from the UCI repository were selected. The suggested BHGSO variants are applied to the FS problems, and the experimental outcomes are compared to advanced FS methods such as BEO, BMPA, HLBDA, BGWO, BSCA, and BASO using a collection of evaluation criteria. On most datasets, the experimental findings show that the proposed BHGSO leads to better outcomes than the other strategies. Furthermore, the results, such as classification accuracy (>95%) and run time (<5 s for low- and medium-dimensional problems and <10 s for high-dimensional problems), demonstrate that a BHGSO with a V-shaped TF can markedly boost the performance of HGSO in terms of the number of selected features and classification accuracy. The experimental results reveal that BHGSO-V searches the feature space more efficiently and converges to the best solution faster than the other optimizers. The continuous HGSO was thus successfully transformed into binary variants that can also address other discrete problems, including task scheduling, the traveling salesman problem, and the knapsack problem.