A modified type of Fletcher-Reeves conjugate gradient method with its global convergence
Abstract
Conjugate gradient methods are among the most important techniques for solving minimization and maximization problems, especially unconstrained nonlinear optimization problems, because of their simplicity and low memory requirements. They are applied in many areas, such as economics, engineering, neural networks, image restoration, machine learning, and deep learning. The convergence of the Fletcher-Reeves (FR) conjugate gradient method has been established under both exact and strong Wolfe line searches; however, its practical performance is poor. In this paper, a small modification is made to the FR method to improve its numerical performance. The global convergence of the modified version is established for general nonlinear functions. Preliminary numerical results show that the modified method is very efficient in terms of the number of iterations and CPU time.
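For context, a minimal sketch of the classical FR conjugate gradient method is shown below (not the modified method proposed in this paper, whose formula is not given in the abstract). The FR coefficient is the standard beta_k = ||g_{k+1}||^2 / ||g_k||^2; a simple backtracking Armijo line search is used here for illustration, whereas the convergence theory discussed above assumes exact or strong Wolfe line searches.

```python
import numpy as np

def fr_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Fletcher-Reeves nonlinear conjugate gradient (sketch).

    Note: uses a backtracking Armijo line search for simplicity; the
    FR convergence results referenced in the abstract assume exact or
    strong Wolfe line searches.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:         # stop when the gradient is small
            break
        # Backtracking line search satisfying the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d               # new conjugate direction
        x, g = x_new, g_new
    return x

# Usage: minimize a simple convex quadratic f(x) = 0.5 * x^T D x
D = np.array([[1.0, 0.0], [0.0, 10.0]])
f = lambda x: 0.5 * x @ D @ x
grad = lambda x: D @ x
x_star = fr_conjugate_gradient(f, grad, np.array([1.0, 1.0]))
```

On this quadratic the iterates converge to the minimizer at the origin; on ill-conditioned nonconvex problems the unmodified FR method is known to take small, inefficient steps, which is the practical weakness the paper's modification targets.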
DOI: http://doi.org/10.11591/ijeecs.v33.i1.pp425-432
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).