A new modification of the nonlinear conjugate gradient method with strong Wolfe-Powell line search
Abstract
The conjugate gradient (CG) method plays a special role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients that satisfies the sufficient descent condition and possesses global convergence properties; the proposed method is similar to that of Wei et al. [7]. The global convergence result is established under the strong Wolfe-Powell line search. Numerical results on a set of test functions show that the new formula outperforms the FR, PRP, DY, and WYL methods in CPU time, number of iterations, and number of gradient evaluations.
Keywords
unconstrained optimization, conjugate gradient method, global convergence, strong Wolfe-Powell line search
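The abstract does not reproduce the paper's new CG coefficient, so the following is only a minimal Python sketch of the generic framework it fits into: a nonlinear CG loop with a strong Wolfe-Powell line search (here via scipy.optimize.line_search), combined with the classical FR and PRP coefficients and the WYL coefficient of Wei et al., which the abstract cites as the closest relative of the proposed formula. Names such as nonlinear_cg and beta_wyl are illustrative, not from the paper, and the WYL formula is written as it is usually stated in the literature.

```python
# Sketch of a nonlinear conjugate gradient loop with a strong Wolfe-Powell
# line search. This is NOT the paper's new formula; it only shows the
# framework into which a new beta coefficient would be plugged.
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def beta_fr(g_new, g_old):
    # Fletcher-Reeves coefficient
    return (g_new @ g_new) / (g_old @ g_old)

def beta_prp(g_new, g_old):
    # Polak-Ribiere-Polyak coefficient
    return g_new @ (g_new - g_old) / (g_old @ g_old)

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu (WYL) coefficient, as usually stated in the literature
    ratio = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - ratio * g_old) / (g_old @ g_old)

def nonlinear_cg(f, grad, x0, beta=beta_wyl, tol=1e-6, max_iter=1000,
                 c1=1e-4, c2=0.1):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # strong Wolfe-Powell step: sufficient decrease plus a bound on
        # the magnitude of the new directional derivative
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)
        if alpha is None:               # line search failed: restart
            d = -g
            continue
        x = x + alpha * d
        g_new = grad(x)
        d = -g_new + beta(g_new, g) * d
        if g_new @ d >= 0:              # safeguard: keep a descent direction
            d = -g_new
        g = g_new
    return x, k

if __name__ == "__main__":
    # Example: minimise the Rosenbrock function
    from scipy.optimize import rosen, rosen_der
    x_star, iters = nonlinear_cg(rosen, rosen_der, np.zeros(4))
    print(iters, x_star)
```

Swapping in any other coefficient, including the paper's proposed one, only requires passing a different function as the beta argument.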
DOI: http://doi.org/10.11591/ijeecs.v18.i1.pp525-532
Indonesian Journal of Electrical Engineering and Computer Science (IJEECS)
p-ISSN: 2502-4752, e-ISSN: 2502-4760