A new three-term conjugate gradient method for training neural networks with global convergence
Abstract
Conjugate gradient (CG) methods are excellent neural network training methods owing to their simplicity, flexibility, numerical efficiency, and low memory requirements. In this paper, we introduce a new three-term conjugate gradient (NTTCG) method for solving unconstrained optimization problems and apply it to training feed-forward artificial neural networks (ANNs). The new method satisfies both the descent condition and the sufficient descent condition, and the global convergence of the NTTCG method is established. Numerical experiments on some well-known test functions show that the new modified method is very effective in terms of the number of function evaluations and the number of iterations; we also report numerical results for training feed-forward neural networks in comparison with other well-known methods in this field.
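The abstract does not give the paper's NTTCG update formula, but the general shape of a three-term CG iteration can be sketched. The example below is a minimal illustrative implementation, assuming a Zhang-Zhou-Li-style third term (d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k, which guarantees d_k^T g_k = -||g_k||^2) together with an Armijo backtracking line search; the scalars `beta` and `theta` here are NOT the paper's proposed formulas.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Illustrative three-term CG minimizer (Zhang-Zhou-Li-style third
    term; not the paper's NTTCG parameter choices)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # steepest-descent start
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (d satisfies g @ d = -||g||^2 < 0)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                # gradient difference y_k
        gg = g @ g
        beta = (g_new @ y) / gg      # Polak-Ribiere-style scalar
        theta = (g_new @ d) / gg     # third-term scalar
        # three-term direction: the extra -theta*y term enforces
        # sufficient descent exactly: d_new @ g_new = -||g_new||^2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Quadratic test problem: f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

On this convex quadratic the iterates converge to the solution of A x = b; in the neural-network setting of the paper, `f` and `grad` would instead be the training loss and its gradient with respect to the network weights.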
Keywords
Artificial neural networks; Descent condition; Global convergent property; Sufficient descent condition; Three-term conjugate gradient; Unconstrained optimization
Full Text:
PDF

DOI: http://doi.org/10.11591/ijeecs.v28.i1.pp551-558
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).