Different Types of Three-Term CG-Methods with Sufficient Descent and Conjugacy Conditions


  • Abbas Y. Al-Bayati, College of Telafer Basic Education, Mathematics, Mosul University, Kurdistan Region, Iraq.
  • Hawraz N. Al-Khayat, College of Computer Sciences and Mathematics, Mathematics, Mosul University, Kurdistan Region, Iraq.




Keywords: Three-Term Conjugate Gradient, Global Convergence, Unconstrained Optimization, Descent Direction, Conjugacy Condition, Memoryless BFGS


Generating a descent search direction independently of the line search is crucial for establishing the global convergence of conjugate gradient methods. Recently, Zhang et al. proposed three-term variants of the PR method (TTPR) and the HS method (TTHS), both of which produce directions satisfying the sufficient descent condition. In this paper we treat two subjects: first, we consider a new unified formula for a three-term CG algorithm; second, we propose a new scaled three-term algorithm, based on the Birgin-Martínez algorithm, that satisfies both the descent and conjugacy conditions. These algorithms are modifications of the Hestenes-Stiefel and Birgin-Martínez algorithms, and they can also be viewed as modifications of the memoryless BFGS quasi-Newton method. We prove the global convergence of our algorithms, and our numerical results show that they are more efficient than the HS and BM algorithms.
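The abstract centers on three-term CG directions that satisfy the sufficient descent condition by construction, independently of the line search. As a minimal illustrative sketch (not the paper's own algorithms), the following Python implements the three-term PRP direction of Zhang, Zhou and Li (cited in the references), for which g_{k+1}ᵀd_{k+1} = -‖g_{k+1}‖² holds identically; the Armijo backtracking rule and the quadratic test problem are our own assumptions, not taken from the paper.

```python
import numpy as np

def three_term_prp(f, grad, x0, tol=1e-6, max_iter=500):
    """Three-term PRP method (Zhang-Zhou-Li style), illustrative sketch.
    Direction: d = -g_new + beta*d_old - theta*y, which guarantees
    g_new^T d = -||g_new||^2 regardless of the step length."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (an assumption here)
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference
        gnorm2 = g.dot(g)
        beta = g_new.dot(y) / gnorm2       # PRP coefficient
        theta = g_new.dot(d) / gnorm2      # third-term coefficient
        # Three-term direction: the beta- and theta-terms cancel in
        # g_new^T d, leaving exactly -||g_new||^2 (sufficient descent).
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_prp(f, grad, np.zeros(2))  # ≈ [0.2, 0.4]
```

Substituting the scaled HS-type coefficients discussed in the paper into `beta` and `theta` would yield the corresponding scaled three-term variants; the descent identity above is what makes the global convergence analysis independent of the line search.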


A.Y. Al-Bayati, W.H. Sharif. A new three-term conjugate gradient method for unconstrained optimization, Can. J. Sci. Eng. Math. 1 (5) (2010), 108–124.

D.C. Liu, J. Nocedal. On the limited memory BFGS method for large scale optimization, Math. Program. 45 (1989), 503–528. DOI: https://doi.org/10.1007/BF01589116

E. Birgin, J.M. Martínez. A spectral conjugate gradient method for unconstrained optimization, Applied Mathematics and Optimization, 43 (2001), 117–128. DOI: https://doi.org/10.1007/s00245-001-0003-0

E. Polak, G. Ribière. Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e Année, 16 (1969), 35–43. DOI: https://doi.org/10.1051/m2an/196903R100351

E.D. Dolan, J.J. Moré. Benchmarking optimization software with performance profiles, Math. Program. 91 (2002), 201–213. DOI: https://doi.org/10.1007/s101070100263

E.M.L. Beale. A derivation of conjugate gradients, in: F.A. Lootsma (Ed.), Numerical Methods for Nonlinear Optimization, Academic Press, London, (1972), 39–43.

J. Nocedal. Conjugate gradient methods and nonlinear optimization, in: L. Adams, J.L. Nazareth (Eds.), Linear and Nonlinear Conjugate Gradient-Related Methods, SIAM, Philadelphia, (1996), –23.

J. Zhang, Y. Xiao, Z. Wei. Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization, Math. Prob. Eng. (2009), Article ID 243290, 16 pages. DOI: https://doi.org/10.1155/2009/243290

J.C. Gilbert, J. Nocedal. Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim. 2 (1992), 21–42. DOI: https://doi.org/10.1137/0802003

L. Grippo, S. Lucidi. A globally convergent version of the Polak–Ribière conjugate gradient method, Math. Program. 78 (1997), 375–391. DOI: https://doi.org/10.1007/BF02614362

L. Nazareth. A conjugate direction algorithm without line search, Journal of Optimization Theory and Applications, 23 (1977), 373–387. DOI: https://doi.org/10.1007/BF00933447

L. Zhang, W.J. Zhou, D.H. Li. A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence, IMA Journal of Numerical Analysis, 26 (2006), 629–640.

L. Zhang, W.J. Zhou, D.H. Li. Some descent three-term conjugate gradient methods and their global convergence, Optimization Methods and Software, 22 (2007), 697–711. DOI: https://doi.org/10.1080/10556780701223293

M. Al-Baali. Descent property and global convergence of the Fletcher–Reeves method with inexact line search, IMA J. Numer. Anal. 5 (1985), 121–124. DOI: https://doi.org/10.1093/imanum/5.1.121

M.R. Hestenes, E. Stiefel. Methods of conjugate gradients for solving linear systems, Journal of Research of the National Bureau of Standards, 49 (1952), 409–436. DOI: https://doi.org/10.6028/jres.049.044

N. Andrei. Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied Mathematics and Computation, 213 (2009), 361–369. DOI: https://doi.org/10.1016/j.amc.2009.03.020

N. Andrei. Performance profiles of conjugate gradient algorithms for unconstrained optimization, in: C.A. Floudas, P.M. Pardalos (Eds.), Encyclopedia of Optimization, 2nd edition, Springer, New York, (2009), 2938–2953.

R. Fletcher, C. Reeves. Function minimization by conjugate gradients, Computer Journal, 7 (1964), 149–154.

R. Fletcher. Practical Methods of Optimization, vol. I: Unconstrained Optimization, 2nd edition, Wiley, New York, (1987).

Y. Hu, C. Storey. Global convergence results for conjugate gradient methods, J. Optim. Theory Appl. 71 (1991), 399–405. DOI: https://doi.org/10.1007/BF00939927

Y.H. Dai, Y.X. Yuan. A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization, 10 (1999), 177–182. DOI: https://doi.org/10.1137/S1052623497318992

Y.H. Dai, L.Z. Liao. New conjugacy conditions and related nonlinear conjugate gradient methods, Applied Mathematics and Optimization, 43 (1) (2001), 87–101. DOI: https://doi.org/10.1007/s002450010019

A.Y. Al-Bayati. A new family of self-scaling variable metric algorithms for unconstrained optimization, J. of Education and Science, University of Mosul, Vol. 12 (1991), pp.




How to Cite

Different Types of Three-Term CG-Methods with Sufficient Descent and Conjugacy Conditions. (2014). Journal of Zankoy Sulaimani - Part A, 16(2), 27-45. https://doi.org/10.17656/jzs.10291
