
Journal of Bionic Engineering ›› 2024, Vol. 21 ›› Issue (6): 3123-3150. doi: 10.1007/s42235-024-00580-w


Feature Selection Based on Improved White Shark Optimizer

Qianqian Cui1 · Shijie Zhao1,2,3 · Miao Chen1 · Qiuli Zhao1

  1. Institute of Intelligence Science and Optimization, Liaoning Technical University, Fuxin 123000, China; 2. School of Geomatics, Liaoning Technical University, Fuxin 123000, China; 3. Institute for Optimization and Decision Analytics, Liaoning Technical University, Fuxin 123000, China
  • Online: 2024-12-20 Published: 2024-12-17
  • Contact: Shijie Zhao, E-mail: zhaoshijie@lntu.edu.cn

Abstract: Feature Selection (FS) is an optimization problem that aims to reduce the dimensionality and improve the quality of a dataset by retaining relevant features while excluding redundant ones. It enhances the classification accuracy achievable on a dataset and holds a crucial position in the field of data mining. Metaheuristic algorithms are well suited to searching for feature subsets and thereby optimizing the FS problem. The White Shark Optimizer (WSO), a metaheuristic algorithm, primarily simulates the hearing- and smell-guided behavior of great white sharks during swimming and hunting, but it neglects other randomly occurring behaviors, such as Tail Slapping and Clustered Together behaviors. Tail Slapping increases population diversity and improves the global search performance of the algorithm, while Clustered Together behavior, which covers gathering for food and mating, changes the direction of local search and enhances local exploitation. This paper incorporates both behaviors into the original algorithm to propose an Improved White Shark Optimizer (IWSO). The two behaviors and the resulting IWSO are evaluated separately on the CEC2017 benchmark functions, and comparisons with other metaheuristic algorithms show that IWSO, which combines the two behaviors, has stronger search capability. Feature selection is then modeled as the minimization of a weighted combination of feature-subset size and classification error rate; this model is iteratively optimized by a discretized IWSO coupled with the K-Nearest Neighbor (KNN) classifier on 16 benchmark datasets, and the results are compared with 7 metaheuristics. Experimental results show that IWSO is more effective at selecting feature subsets and improving classification accuracy.
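As a concrete illustration of the wrapper-style objective described in the abstract, the sketch below scores a binary feature mask as a weighted combination of KNN classification error and relative subset size. It is a minimal sketch, not the authors' implementation: the weight alpha, the 5-fold cross-validation, and k = 5 neighbors are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fs_fitness(mask, X, y, alpha=0.99, k=5):
    """Evaluate a binary feature mask (lower is better).

    fitness = alpha * classification_error
              + (1 - alpha) * (selected features / total features)
    """
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():                       # an empty subset is maximally penalized
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=k)
    acc = cross_val_score(knn, X[:, mask], y, cv=5).mean()
    error = 1.0 - acc                        # classification error rate
    ratio = mask.sum() / mask.size           # relative subset size
    return alpha * error + (1.0 - alpha) * ratio
```

In a complete wrapper, the continuous shark positions produced by IWSO would be mapped to such binary masks, commonly via a transfer function (e.g., a sigmoid followed by thresholding); the exact discretization scheme used in the paper is not reproduced here.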

Key words: Metaheuristic algorithm · Feature Selection · White Shark Optimizer · K-Nearest Neighbor