Near Optimal Convergence of Back-Propagation Method using Harmony Search Algorithm
Abstract
Training Artificial Neural Networks (ANNs) is a significant and difficult task in the field of supervised learning, as performance depends on both the underlying training algorithm and the outcome of the training process. In this paper, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of Multi-Layer Perceptron (MLP) feed-forward Neural Networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for the NNs is implemented for BPHSA-MLP, HSA-MLP, and BP-MLP. The proposed method is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine, and Thyroid datasets. The MSE, training time, and classification accuracy of the hybrid BPHSA are compared with those of the standard BP and the meta-heuristic HSA. The experiments showed that the proposed method achieves better convergence error and classification accuracy than BP-MLP and HSA-MLP, making BPHSA-MLP a promising algorithm for neural network training.
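The abstract does not give the details of the hybrid scheme, but a common way to combine a global meta-heuristic with back-propagation is to let harmony search explore the MLP weight space first and then refine the best harmony with gradient descent. The sketch below illustrates that two-phase idea on a tiny XOR-style dataset; the network size, the HSA parameters (HMCR, PAR, bandwidth, harmony memory size), and the dataset are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny XOR-style dataset (illustrative stand-in for the paper's benchmarks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # flat weight-vector length


def unpack(w):
    """Split a flat weight vector into layer matrices and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:]
    return W1, b1, W2, b2


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def mse(w):
    """Mean squared error of the MLP encoded by flat vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return np.mean((out - y) ** 2)


# --- Phase 1: Harmony Search over the flat weight vector (global search) ---
HMS, HMCR, PAR, BW, ITERS = 20, 0.9, 0.3, 0.2, 2000  # assumed parameters
hm = rng.uniform(-1, 1, (HMS, DIM))          # harmony memory
fit = np.array([mse(w) for w in hm])
for _ in range(ITERS):
    new = np.empty(DIM)
    for d in range(DIM):
        if rng.random() < HMCR:              # memory consideration
            new[d] = hm[rng.integers(HMS), d]
            if rng.random() < PAR:           # pitch adjustment
                new[d] += rng.uniform(-BW, BW)
        else:                                # random re-initialisation
            new[d] = rng.uniform(-1, 1)
    f = mse(new)
    worst = np.argmax(fit)
    if f < fit[worst]:                       # replace the worst harmony
        hm[worst], fit[worst] = new, f

# --- Phase 2: Back-propagation refinement from the best harmony ---
w = hm[np.argmin(fit)].copy()
lr = 0.5
for _ in range(2000):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out) * (2 / len(X))  # dMSE/d(pre-activation)
    d_h = (d_out @ W2.T) * h * (1 - h)
    grad = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0),
                           (h.T @ d_out).ravel(), d_out.sum(0)])
    w -= lr * grad

print(f"final MSE after HSA + BP refinement: {mse(w):.4f}")
```

The intuition behind such a hybrid is that HSA is less prone to the poor local minima that gradient descent can fall into from a random initialisation, while BP converges much faster once it starts near a good basin, which is consistent with the improved convergence error the abstract reports for BPHSA-MLP.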
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).