Modified Harris Hawks optimizer for feature selection and support vector machine kernels

Hadeel Tariq Ibrahim, Wamidh Jalil Mazher, Enas Mahmood Jassim


The support vector machine (SVM), one of the most effective learning algorithms, has many real-world applications. The kernel type and its parameters have a significant impact on the SVM algorithm's effectiveness and performance. Choosing the feature subset is likewise a crucial step in machine learning, especially when working with high-dimensional data sets. Most earlier studies treated these two crucial criteria independently. In this research, we propose a hybrid strategy based on the Harris Hawks optimization (HHO) algorithm, a recently proposed metaheuristic that has proven effective on a range of optimization problems. The proposed method optimizes the SVM model parameters while simultaneously selecting the optimal feature subset. We ran the proposed HHO-SVM approach on real biomedical datasets covering 17 cancer types among Iraqi patients from 2010-2012. The experimental results demonstrate the superiority of the proposed HHO-SVM in terms of three performance metrics: feature selection accuracy, runtime, and number of selected features. For verification, the proposed method is compared against four well-known algorithms: the firefly (FF) algorithm, genetic algorithm (GA), grasshopper optimization algorithm (GOA), and particle swarm optimization (PSO). The proposed HHO-SVM approach achieves an average accuracy of 99.967%.
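The hybrid scheme described above encodes each candidate solution as the SVM kernel parameters together with a binary feature mask, and scores it with a single fitness value. The sketch below illustrates that encoding under stated assumptions: a simple random search stands in for the HHO update rules, and a synthetic objective (with assumed "informative" features and parameter optima) stands in for the cross-validated SVM accuracy the paper would actually evaluate; the names `fitness`, `random_solution`, and `optimize` are hypothetical.

```python
import random

# Hypothetical stand-in objective: in the paper, this would be the
# cross-validated accuracy of an SVM trained with kernel parameters
# (C, gamma) on the features selected by `mask`.
GOOD_FEATURES = {0, 2, 5}  # assumed "informative" features (illustrative only)

def fitness(c_exp, g_exp, mask):
    """Score a candidate: reward informative features, penalize extras,
    and prefer parameters near assumed optima log2(C)=3, log2(gamma)=-2."""
    selected = {i for i, bit in enumerate(mask) if bit}
    overlap = len(selected & GOOD_FEATURES)
    penalty = 0.01 * len(selected - GOOD_FEATURES)  # fewer features is better
    param_term = -0.05 * abs(c_exp - 3) - 0.05 * abs(g_exp + 2)
    return overlap - penalty + param_term

def random_solution(n_features, rng):
    """One candidate: (log2 C, log2 gamma, binary feature mask)."""
    return (rng.uniform(-5, 10),
            rng.uniform(-10, 3),
            [rng.random() < 0.5 for _ in range(n_features)])

def optimize(n_features=8, iters=500, seed=1):
    """Random-search stand-in for the HHO loop: keep the best candidate."""
    rng = random.Random(seed)
    best = random_solution(n_features, rng)
    best_fit = fitness(*best)
    for _ in range(iters):
        cand = random_solution(n_features, rng)
        cand_fit = fitness(*cand)
        if cand_fit > best_fit:
            best, best_fit = cand, cand_fit
    return best, best_fit

if __name__ == "__main__":
    (c_exp, g_exp, mask), fit = optimize()
    print("selected features:", [i for i, bit in enumerate(mask) if bit])
    print("log2(C)=%.2f  log2(gamma)=%.2f  fitness=%.3f" % (c_exp, g_exp, fit))
```

In the full method, HHO's exploration and exploitation phases would replace the random sampling, moving the hawk population toward the best-scoring combination of parameters and feature subset.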


Biomedical datasets; Feature selection; Harris Hawk optimizer; Heuristic optimization; Support vector machine


This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
