Categorical encoder-based performance comparison in pre-processing imbalanced multiclass classification
Abstract
The contribution of this study is to offer guidance on encoding techniques for categorical predictor variables, together with comprehensive test scenarios for obtaining significant performance results on imbalanced multiclass classification problems. We modify the scenarios of the data mining process in the sample, explore, modify, model, and assess (SEMMA) framework by coupling it with statistical hypothesis testing, so that conclusions about model performance can be generalized; we refer to this as enhanced-SEMMA. We selected four open-source data sets with unequal class distributions and categorical predictors. Ordinal, nominal, Dirichlet, frequency, target, leave-one-out, one-hot, dummy, binary, and hashing encoder methods are used. We use the grid-search technique to find the best hyperparameters, and the F1-score and area under the curve (AUC) are evaluated to select the optimal model. Across all datasets, with 10-fold stratified cross-validation and 95% to 99% accuracy for each dataset, the results show that the support vector machine (SVM) outperforms the decision tree (DT), k-nearest neighbor (KNN), naïve Bayes (NB), logistic regression (LR), and random forest (RF) algorithms. Probability-based or binary encodings, such as target, Dirichlet, dummy, one-hot, or binary, are best when the minor class proportion is below 3%; nominal or ordinal encoders are preferred when the minor class proportion exceeds 3%.
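As a minimal sketch of the evaluation protocol described above, the following Python example places one categorical encoder and an SVM in a pipeline, tunes the hyperparameters by grid search, and scores macro F1 and one-vs-rest AUC under stratified 10-fold cross-validation. The library choices (scikit-learn, category_encoders), the toy data, and the parameter grid are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch: encoder + SVM pipeline, grid search, stratified 10-fold CV,
# scored with macro F1 and one-vs-rest AUC. Data and parameter grid are placeholders.
import numpy as np
import pandas as pd
import category_encoders as ce
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Toy imbalanced multiclass data with categorical predictors (~80/15/5 % class split).
rng = np.random.default_rng(0)
n = 300
X = pd.DataFrame({
    "colour": rng.choice(["red", "blue", "green"], size=n),
    "size": rng.choice(["S", "M", "L"], size=n),
})
y = np.array(["a"] * 240 + ["b"] * 45 + ["c"] * 15)

pipe = Pipeline([
    ("encode", ce.OneHotEncoder()),   # swap in ce.OrdinalEncoder(), ce.BinaryEncoder(), ...
    ("scale", StandardScaler()),
    ("clf", SVC(probability=True)),   # probability estimates are needed for AUC scoring
])

# Grid search over illustrative SVM hyperparameters, refit on the best macro F1.
search = GridSearchCV(
    pipe,
    param_grid={"clf__C": [0.1, 1, 10], "clf__kernel": ["rbf", "linear"]},
    scoring={"f1": "f1_macro", "auc": "roc_auc_ovr"},
    refit="f1",
    cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=42),
)
search.fit(X, y)
print(search.best_params_)
print("mean CV AUC:", search.cv_results_["mean_test_auc"][search.best_index_])
```

Repeating this loop over each encoder and classifier pair yields the cross-validated F1 and AUC scores that can then be compared with statistical hypothesis tests, in the spirit of the enhanced-SEMMA workflow.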
Keywords
Categorical encoding; Classification; Imbalanced; Multiclass; Performance analysis
DOI: http://doi.org/10.11591/ijeecs.v31.i3.pp1705-1715