
Gradient Methods with Adaptive Step-Sizes

  • Regular Papers
  • Published in Computational Optimization and Applications

Abstract

Motivated by the superlinear behavior of the Barzilai-Borwein (BB) method for two-dimensional quadratics, we propose two gradient methods which adaptively choose a small step-size or a large step-size at each iteration. The small step-size is primarily used to induce a favorable descent direction for the next iteration, while the large step-size is primarily used to produce a sufficient reduction. Although the new algorithms are still linearly convergent in the quadratic case, numerical experiments on some typical test problems indicate that they compare favorably with the BB method and some other efficient gradient methods.
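For context, the Barzilai-Borwein step size that motivates the paper is easy to state for a convex quadratic f(x) = ½xᵀAx − bᵀx: at each iteration the step α is chosen from the last two iterates as α = sᵀs / sᵀy, with s = xₖ − xₖ₋₁ and y = gₖ − gₖ₋₁. The sketch below is an illustrative implementation of this classical BB method only, not of the adaptive small/large step-size algorithms proposed in the paper; the 2×2 test problem (A, b) is hypothetical data chosen for the example.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Classical Barzilai-Borwein gradient method for f(x) = 0.5*x'Ax - b'x.

    Illustrative sketch only; assumes A is symmetric positive definite
    and that the initial gradient is nonzero.
    """
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple choice for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:    # stop once the gradient is small
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB step size: alpha = s's / s'y, with a crude safeguard
        alpha = (s @ s) / sy if sy > 0 else 1.0
        x, g = x_new, g_new
    return x

# Hypothetical 2x2 SPD test problem: minimizer solves Ax = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(2))
print(np.allclose(A @ x_star, b))  # True
```

Note that the iteration is nonmonotone: the BB step can temporarily increase f, which is exactly the behavior the paper's adaptive small/large step-size rules are designed to exploit and control.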


References

  1. H. Akaike, "On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method," Ann. Inst. Statist. Math. Tokyo, vol. 11, pp. 1–17, 1959.

  2. R. Andreani, E.G. Birgin, J.M. Martínez, and J. Yuan, "Spectral projected gradient and variable metric methods for optimization with linear inequalities," IMA J. Numer. Anal., vol. 25, pp. 221–252, 2005.

  3. J. Barzilai and J.M. Borwein, "Two-point step size gradient methods," IMA J. Numer. Anal., vol. 8, pp. 141–148, 1988.

  4. L. Bello and M. Raydan, "Preconditioned spectral projected gradient methods on convex sets," Journal of Computational Mathematics, vol. 23, pp. 225–232, 2005.

  5. E.G. Birgin, J.M. Martínez, and M. Raydan, "Nonmonotone spectral projected gradient methods on convex sets," SIAM J. Optim., vol. 10, pp. 1196–1211, 2000.

  6. E.G. Birgin, J.M. Martínez, and M. Raydan, "Algorithm 813: SPG—software for convex-constrained optimization," ACM Trans. Math. Software, vol. 27, pp. 340–349, 2001.

  7. E.G. Birgin, J.M. Martínez, and M. Raydan, "Inexact spectral projected gradient methods on convex sets," IMA J. Numer. Anal., vol. 23, pp. 539–559, 2003.

  8. A. Cauchy, "Méthode générale pour la résolution des systèmes d'équations simultanées," Comp. Rend. Sci. Paris, vol. 25, pp. 536–538, 1847.

  9. Y.H. Dai, "Alternate step gradient method," Optimization, vol. 52, pp. 395–415, 2003.

  10. Y.H. Dai and L.Z. Liao, "R-linear convergence of the Barzilai and Borwein gradient method," IMA J. Numer. Anal., vol. 22, pp. 1–10, 2002.

  11. Y.H. Dai and Y. Yuan, "Alternate minimization gradient method," IMA J. Numer. Anal., vol. 23, pp. 377–393, 2003.

  12. R. Fletcher, "Low storage methods for unconstrained optimization," Lectures in Applied Mathematics (AMS), vol. 26, pp. 165–179, 1990.

  13. R. Fletcher, "On the Barzilai-Borwein method," Numerical Analysis Report NA/207, Department of Mathematics, University of Dundee, 2001.

  14. A. Friedlander, J.M. Martínez, B. Molina, and M. Raydan, "Gradient method with retards and generalizations," SIAM J. Numer. Anal., vol. 36, pp. 275–289, 1999.

  15. L. Grippo, F. Lampariello, and S. Lucidi, "A nonmonotone line search technique for Newton's method," SIAM J. Numer. Anal., vol. 23, pp. 707–716, 1986.

  16. M.R. Hestenes and E.L. Stiefel, "Methods of conjugate gradients for solving linear systems," J. Research National Bureau of Standards, vol. B49, pp. 409–436, 1952.

  17. W. La Cruz, J.M. Martínez, and M. Raydan, "Spectral residual method without gradient information for solving large-scale nonlinear systems of equations," Mathematics of Computation, to appear.

  18. W. La Cruz and M. Raydan, "Nonmonotone spectral methods for large-scale nonlinear systems," Optimization Methods and Software, vol. 18, pp. 583–599, 2003.

  19. J.-L. Lamotte, B. Molina, and M. Raydan, "Smooth and adaptive gradient method with retards," Mathematical and Computer Modelling, vol. 36, pp. 1161–1168, 2002.

  20. F. Luengo and M. Raydan, "Gradient method with dynamical retards for large-scale optimization problems," Electronic Transactions on Numerical Analysis, vol. 16, pp. 186–193, 2003.

  21. M. Raydan, "On the Barzilai and Borwein choice of steplength for the gradient method," IMA J. Numer. Anal., vol. 13, pp. 321–326, 1993.

  22. M. Raydan, "The Barzilai and Borwein method for the large scale unconstrained minimization problem," SIAM J. Optim., vol. 7, pp. 26–33, 1997.

  23. M. Raydan and B.F. Svaiter, "Relaxed steepest descent and Cauchy-Barzilai-Borwein method," Computational Optimization and Applications, vol. 21, pp. 155–167, 2002.

  24. T. Serafini, G. Zanghirati, and L. Zanni, "Gradient projection methods for quadratic programs and applications in training support vector machines," Tech. Rep. 48, University of Modena and Reggio Emilia, Italy, 2003.

  25. H. Zhang and W.W. Hager, "A nonmonotone line search technique and its application to unconstrained optimization," SIAM J. Optim., vol. 14, pp. 1043–1056, 2004.

Author information

Correspondence to Bin Zhou.

About this article

Cite this article

Zhou, B., Gao, L. & Dai, YH. Gradient Methods with Adaptive Step-Sizes. Comput Optim Applic 35, 69–86 (2006). https://doi.org/10.1007/s10589-006-6446-0
