A new memoryless self-scaling quasi-Newton strategy for large-scale unconstrained optimization problems
Abstract
In this work we employ the memoryless quasi-Newton procedure to construct a new conjugacy coefficient for conjugate gradient methods in unconstrained optimization. The updating formula is obtained by scaling the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) formula by a self-scaling factor; the resulting conjugacy coefficient produces a descent direction and retains global convergence when the proposed method is compared with the standard Hestenes-Stiefel (HS) conjugate gradient approach. The supporting theorems are studied in detail, and the numerical results, obtained with a Fortran implementation, show that the method is extremely stable.
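For context, a minimal sketch of the kind of construction the abstract describes, assuming the standard notation s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and a classical Oren-Luenberger-type scaling factor; the paper's specific self-scaling factor and the new conjugacy coefficient are defined only in the full text:

\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}
\qquad \text{(standard Hestenes-Stiefel coefficient)}
\]

\[
H_{k+1} = \theta_k I
- \theta_k \,\frac{s_k y_k^{T} + y_k s_k^{T}}{s_k^{T} y_k}
+ \left( 1 + \theta_k \,\frac{y_k^{T} y_k}{s_k^{T} y_k} \right) \frac{s_k s_k^{T}}{s_k^{T} y_k},
\qquad
d_{k+1} = -H_{k+1} g_{k+1},
\]

where the memoryless update replaces the previous inverse-Hessian approximation by the scaled identity \(\theta_k I\), with, for example, \(\theta_k = s_k^{T} y_k / y_k^{T} y_k\). This is only an illustrative instance of a scaled memoryless BFGS direction, not the authors' exact formula.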
Keywords
Conjugacy coefficient; Conjugate gradient; Descent direction; Global convergence; Memoryless
DOI: http://doi.org/10.11591/ijeecs.v28.i1.pp339-345