Research Repository

Multi-step nonlinear conjugate gradient methods for unconstrained minimization

Ford, John A and Narushima, Yasushi and Yabe, Hiroshi (2008) 'Multi-step nonlinear conjugate gradient methods for unconstrained minimization.' Computational Optimization and Applications, 40 (2). pp. 191-216. ISSN 0926-6003

Full text not available from this repository.


Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they require no matrix storage. Recently, seeking fast convergence of these methods, Dai and Liao (Appl. Math. Optim. 43:87–101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput. Optim. Appl. 28:203–225, 2004) proposed another conjugate gradient method based on a modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim. Methods Softw. 2:357–370, 1993; J. Comput. Appl. Math. 50:305–323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
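To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a nonlinear conjugate gradient iteration with a Dai–Liao-type update, β_k = g_{k+1}ᵀ(y_k − t·s_k) / (d_kᵀy_k), where s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. This is not the paper's multi-step method: the function name, the Armijo backtracking line search (the paper assumes Wolfe-type conditions), the parameter t, and the restart safeguard are all illustrative assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Sketch of a Dai-Liao-type nonlinear CG method (illustrative, not the paper's)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Crude Armijo backtracking line search (the paper uses Wolfe conditions)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d.dot(y)
        # Dai-Liao beta: g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k)
        beta = g_new.dot(y - t * s) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic, only the current gradient and direction vectors are stored, which is the storage advantage the abstract highlights.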

Item Type: Article
Uncontrolled Keywords: Unconstrained optimization; Conjugate gradient method; Line search; Global convergence; Multi-step secant condition
Subjects: Q Science > QA Mathematics
Divisions: Faculty of Science and Health > Mathematical Sciences, Department of
Depositing User: Jim Jamieson
Date Deposited: 09 Dec 2011 14:54
Last Modified: 09 Dec 2011 14:54
