Different Types of Three-Term CG-Methods with Sufficient Descent and Conjugacy Conditions


1 Abbas Y. Al-Bayati, 2 Hawraz N. Al-Khayat
1 College of Telafer Basic Education, Mathematics, Mosul University, 2 College of Computer Sciences and Mathematics, Mathematics, Mosul University




Abstract:
Generating a descent search direction independently of the line search is essential for establishing the global convergence of conjugate gradient (CG) methods. Recently, Zhang et al. proposed three-term versions of the PR method (TTPR) and the HS method (TTHS), both of which satisfy the sufficient descent condition. In this paper, we treat two subjects: we first propose a new unified formula for three-term CG algorithms; second, we suggest a new scaled three-term algorithm, based on the Birgin-Martínez algorithm, that satisfies both the descent and conjugacy conditions. These algorithms are modifications of the Hestenes-Stiefel and Birgin-Martínez algorithms, and they can also be viewed as modifications of the memoryless BFGS quasi-Newton method. We prove the global convergence of our algorithms, and the numerical results show that they are more efficient than the HS and BM algorithms.

Keywords: Three-Term Conjugate Gradient, Global Convergence, Unconstrained Optimization,
Descent Direction, Conjugacy Condition, Memoryless BFGS.
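As a minimal illustration of the sufficient descent property mentioned above, the sketch below implements the TTHS direction of Zhang et al. [13] in the standard notation, d_{k+1} = -g_{k+1} + (g_{k+1}^T y_k / d_k^T y_k) d_k - (g_{k+1}^T d_k / d_k^T y_k) y_k with y_k = g_{k+1} - g_k, and verifies numerically that g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds independently of the line search. The helper name `tths_direction` is ours, not from the paper.

```python
import numpy as np

def tths_direction(g_new, d, y):
    """Three-term HS (TTHS) direction of Zhang et al. [13].

    g_new : gradient at the new iterate, g_{k+1}
    d     : previous search direction, d_k
    y     : gradient difference, y_k = g_{k+1} - g_k
    """
    dy = d @ y  # the common HS denominator d_k^T y_k
    return -g_new + (g_new @ y) / dy * d - (g_new @ d) / dy * y

# Check the descent identity g^T d = -||g||^2 on random data:
# the two extra terms cancel exactly in the inner product with g_new.
rng = np.random.default_rng(0)
g_new, d, y = rng.normal(size=(3, 5))
d_new = tths_direction(g_new, d, y)
print(np.isclose(g_new @ d_new, -g_new @ g_new))  # True
```

This identity is what makes the direction a sufficient descent direction regardless of how the step length is chosen, which is the starting point for the unified three-term formula proposed in the paper.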



References

[1] A.Y. Al-Bayati, W.H. Sharif. A new three-term conjugate gradient method for
unconstrained optimization, Can. J. Sci. Eng. Math. 1 (5) (2010), 108–124.
[2] D.C. Liu, J. Nocedal. On the limited memory BFGS method for large scale optimization,
Math. Program. 45 (1989), 503–528.
[3] E. Birgin and J.M. Martínez. A spectral conjugate gradient method for unconstrained
optimization, Applied Mathematics and Optimization, 43 (2001), 117-128.
[4] E. Polak, G. Ribière. Note sur la convergence de méthodes de directions conjuguées, Rev.
Française Informat. Recherche Opérationnelle, 3e Année, 16 (1969), 35–43.
[5] E.D. Dolan, J.J. Moré, Benchmarking optimization software with performance profiles,
Math. Program. 91 (2002), 201–213.
[6] E.M.L. Beale. A derivative of conjugate gradients, in: F.A. Lootsma (Ed.), Numerical
Methods for Nonlinear Optimization, Academic Press, London, (1972), 39–43.
[7] J. Nocedal. Conjugate gradient methods and nonlinear optimization, In: L. Adams, J.L.
Nazareth (eds.), Linear and Nonlinear Conjugate Gradient-Related Methods, (1996),
9–23. SIAM, Philadelphia.
[8] J. Zhang, Y. Xiao, Z. Wei. Nonlinear conjugate gradient methods with sufficient descent
condition for large-scale unconstrained optimization, Math. Prob. Eng. (2009), Article ID
243290, 16 pages. DOI: 10.1155/2009/243290.
[9] J.C. Gilbert, J. Nocedal. Global convergence properties of conjugate gradient methods for
optimization, SIAM J. Optim. 2 (1992), 21–42.
[10] L. Grippo, S. Lucidi. A globally convergent version of the Polak–Ribière conjugate
gradient method, Math. Program. 78 (1997), 375–391.
[11] L. Nazareth. A conjugate direction algorithm without line search, Journal of
Optimization Theory and Applications 23 (1977) 373–387.
[12] L. Zhang, W.J. Zhou, D.H. Li. A descent modified Polak–Ribière–Polyak conjugate
gradient method and its global convergence, IMA Journal of Numerical Analysis, 26
(2006), 629–640.
[13] L. Zhang, W.J. Zhou, D.H. Li. Some descent three-term conjugate gradient methods and
their global convergence, Optimization Methods and Software, 22 (2007), 697–711.
[14] M. Al-Baali, Descent property and global convergence of the Fletcher–Reeves method
with inexact line search, IMA J. Numer. Anal. 5 (1985), 121–124.
[15] M.R. Hestenes, E. Stiefel. Methods of conjugate gradients for solving linear systems,
Journal of Research of the National Bureau of Standards, 49 (1952), 409–436.
[16] N. Andrei. Acceleration of conjugate gradient algorithms for unconstrained optimization,
Applied Mathematics and Computation, 213 (2009), 361–369.
[17] N. Andrei. Performance profiles of conjugate gradient algorithms for unconstrained
optimization, In: Floudas, C.A., Pardalos, P.M. (eds.) Encyclopedia of Optimization,
2nd edition. (2009), 2938–2953. Springer, New York.
[18] R. Fletcher, C. Reeves. Function minimization by conjugate gradients, Computer Journal,
7 (1964), 149–154.
[19] R. Fletcher. Practical Methods of Optimization, vol. 1: Unconstrained Optimization, 2nd
edition. Wiley, New York, (1987).
[20] Y. Hu, C. Storey. Global convergence results for conjugate gradient methods, J. Optim.
Theory Appl. 71 (1991), 399–405.
[21] Y.H. Dai, Y.X. Yuan. A nonlinear conjugate gradient method with a strong global
convergence property. SIAM Journal on Optimization, 10 (1999), 177–182.
[22] Y.H. Dai, L.Z. Liao. New conjugacy conditions and related nonlinear conjugate
gradient methods, Applied Mathematics and Optimization, 43 (2001), 87–101.
[23] A.Y. Al-Bayati. A new family of self-scaling variable metric algorithms of unconstrained
optimization, J. of Education and Science, University of Mosul, 12 (1991), 25–54.