
A primal dual modified subgradient algorithm with sharp Lagrangian

Journal of Global Optimization

Abstract

We apply a modified subgradient algorithm (MSG) to the dual of a nonlinear, nonconvex optimization problem, where the dual scheme is built from the sharp augmented Lagrangian. A desirable feature of this method is primal convergence: every accumulation point of the primal sequence, which is generated automatically during the process, is a primal solution. This property does not hold in general for existing variants of MSG. We propose two new variants of MSG that enjoy both primal and dual convergence whenever the dual optimal set is nonempty, and that use a very simple stepsize rule. Moreover, we also establish primal convergence when the dual optimal set is empty. Finally, our second variant converges in a finite number of steps.
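
A short, hedged sketch may help fix the ideas in the abstract. For a problem min f(x) subject to h(x) = 0, a standard form of the sharp augmented Lagrangian is L(x, u, c) = f(x) + c ||h(x)|| - <u, h(x)>, and an MSG-type iteration alternates a global minimization of L(., u_k, c_k) in x with a subgradient step in the dual pair (u, c), strictly increasing the penalty c at every infeasible iterate. The Python sketch below is illustrative only: the toy objective, the constraint, the brute-force inner oracle, and the constant stepsize are our assumptions, not the stepsize rules analyzed in the paper.

    import numpy as np

    # Illustrative sketch of an MSG-type dual scheme with the sharp
    # augmented Lagrangian L(x, u, c) = f(x) + c*|h(x)| - u*h(x)
    # for min f(x) s.t. h(x) = 0 (scalar constraint for simplicity).
    # The problem data and the constant stepsize are assumptions,
    # not the variants analyzed in the paper.

    def f(x):
        # Toy nonconvex objective (illustrative choice)
        return np.cos(3.0 * x) + 0.1 * x**2

    def h(x):
        # Toy equality constraint h(x) = 0 (illustrative choice)
        return x - 1.0

    def sharp_lagrangian(x, u, c):
        hx = h(x)
        return f(x) + c * np.abs(hx) - u * hx

    def inner_oracle(u, c, grid=np.linspace(-5.0, 5.0, 20001)):
        # Global inner minimization of L(., u, c); brute force over a grid
        # stands in for whatever global solver the dual scheme assumes.
        values = sharp_lagrangian(grid, u, c)
        return grid[np.argmin(values)]

    def msg(u=0.0, c=0.0, step=0.5, eps=0.1, max_iter=50, tol=1e-8):
        for _ in range(max_iter):
            xk = inner_oracle(u, c)         # primal iterate: attains the dual value q(u, c)
            hk = h(xk)
            if abs(hk) <= tol:              # feasible inner minimizer found
                return xk, u, c
            u = u - step * hk               # subgradient step in the multiplier
            c = c + (step + eps) * abs(hk)  # penalty strictly increases while infeasible
        return xk, u, c

    x_star, u_star, c_star = msg()
    print(f"x = {x_star:.4f}, h(x) = {h(x_star):.2e}, c = {c_star:.3f}")

The primal sequence in this sketch is the sequence of inner minimizers x_k; it is accumulation points of such a sequence that the abstract's primal convergence statements concern.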

Author information

Correspondence to Jefferson G. Melo.

Cite this article

Burachik, R.S., Iusem, A.N. & Melo, J.G. A primal dual modified subgradient algorithm with sharp Lagrangian. J Glob Optim 46, 347–361 (2010). https://doi.org/10.1007/s10898-009-9429-8
