Multi-layer perceptron hyperparameter optimization using Jaya algorithm for disease classification

Andien Dwi Novika, Abba Suganda Girsang

Abstract


This study introduces a hyperparameter optimization approach for enhancing multilayer perceptrons (MLPs) using the Jaya algorithm. Because hyperparameter tuning plays a crucial role in MLP performance, the Jaya algorithm, an optimization technique inspired by social behavior that requires no algorithm-specific parameters, emerges as a promising candidate. Applied systematically, Jaya dynamically adjusts hyperparameter values, yielding notable improvements in convergence speed and model generalization. Quantitatively, the Jaya algorithm consistently converges within the first iteration, faster than conventional methods, and achieves accuracy roughly 7% higher on several datasets. This research contributes to hyperparameter optimization by offering a practical and effective solution for tuning MLPs in diverse applications, with implications for improved computational efficiency and model performance.
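
The abstract does not detail the implementation, but the Jaya update rule is simple enough to sketch: each candidate solution moves toward the current best solution and away from the current worst, with no algorithm-specific control parameters beyond population size and iteration budget. The Python sketch below applies that rule to a small MLP hyperparameter search; the dataset, search space (hidden-layer width, learning rate, L2 penalty), population size, and iteration count are illustrative assumptions, not values taken from the paper.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset, not from the paper

# Assumed search space: [hidden units, log10(learning rate), log10(L2 alpha)]
lower = np.array([4.0, -4.0, -6.0])
upper = np.array([128.0, -1.0, -1.0])

def fitness(v):
    """Mean cross-validated accuracy of an MLP built from one Jaya candidate."""
    hidden, log_lr, log_alpha = v
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(int(round(hidden)),),
                      learning_rate_init=10 ** log_lr,
                      alpha=10 ** log_alpha,
                      max_iter=200, random_state=0))
    return cross_val_score(model, X, y, cv=3).mean()

pop_size, iterations = 5, 5                       # small, illustrative budget
pop = rng.uniform(lower, upper, size=(pop_size, lower.size))
scores = np.array([fitness(p) for p in pop])

for _ in range(iterations):
    best, worst = pop[scores.argmax()], pop[scores.argmin()]
    for k in range(pop_size):
        r1, r2 = rng.random(lower.size), rng.random(lower.size)
        # Jaya update: move toward the best candidate and away from the worst.
        cand = pop[k] + r1 * (best - np.abs(pop[k])) - r2 * (worst - np.abs(pop[k]))
        cand = np.clip(cand, lower, upper)
        cand_score = fitness(cand)
        if cand_score > scores[k]:                # greedy replacement
            pop[k], scores[k] = cand, cand_score

print("best CV accuracy:", scores.max())
print("best hyperparameters:", pop[scores.argmax()])

The greedy replacement step (a candidate is overwritten only when its mutated version scores better) makes the best fitness non-decreasing across iterations, which is the behavior the paper reports as fast convergence.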

Keywords


Classification; Hyperparameter optimization; Jaya algorithm; Metaheuristic algorithm; Multilayer perceptrons



DOI: http://doi.org/10.11591/ijeecs.v35.i1.pp620-630



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).
