A modified type of Fletcher-Reeves conjugate gradient method with its global convergence

Amna Weis Mohammed Ahmad Idress, Osman Omer Osman Yousif, Abdulgader Zaid Almaymuni, Awad Abdelrahman Abdalla Mohammed, Mohammed A. Saleh, Nafisa A. Ali

Abstract


Conjugate gradient methods are among the most important techniques for solving minimization and maximization problems, especially unconstrained nonlinear optimization problems, because of their simplicity and low memory requirements. They are applied in many areas, such as economics, engineering, neural networks, image restoration, machine learning, and deep learning. The global convergence of the Fletcher-Reeves (FR) conjugate gradient method has been established under both exact and strong Wolfe line searches; however, its practical performance is poor. In this paper, a slight modification of the FR method is proposed to improve its numerical performance. The global convergence of the modified version is established for general nonlinear functions. Preliminary numerical results show that the modified method is very efficient in terms of the number of iterations and CPU time.
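For readers unfamiliar with the baseline the paper modifies, the sketch below shows the classical FR method with the standard coefficient β_k = ||g_k||² / ||g_{k-1}||² and a strong Wolfe line search (here via SciPy's line_search routine). It is a minimal illustration only: the paper's modified FR formula is not reproduced, and the test function, starting point, and line-search parameters are illustrative assumptions.

```python
# Minimal sketch of the classical Fletcher-Reeves (FR) conjugate gradient
# method for unconstrained minimization. The modified formula proposed in
# the paper is NOT implemented here; this is the standard FR baseline.
import numpy as np
from scipy.optimize import line_search


def fr_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Standard FR method: d_k = -g_k + beta_k * d_{k-1},
    with beta_k = ||g_k||^2 / ||g_{k-1}||^2 and d_0 = -g_0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    k = 0
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Line search aiming at the strong Wolfe conditions
        # (c1, c2 values are common illustrative choices).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:               # fall back to a small fixed step
            alpha = 1e-4                # if the line search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = np.dot(g_new, g_new) / np.dot(g, g)   # FR coefficient
        d = -g_new + beta_fr * d
        x, g = x_new, g_new
    return x, k


if __name__ == "__main__":
    # Illustrative test problem: 2-D Rosenbrock function.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    x_star, iters = fr_conjugate_gradient(f, grad, np.array([-1.2, 1.0]))
    print("minimizer:", x_star, "iterations:", iters)
```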


Keywords


Conjugate gradient method; Exact line search; Fletcher-Reeves method; Global convergence; Unconstrained optimization



DOI: http://doi.org/10.11591/ijeecs.v33.i1.pp425-432




