Abstract
We apply a modified subgradient algorithm (MSG) to solve the dual of a nonlinear, nonconvex optimization problem. The dual scheme we consider uses the sharp augmented Lagrangian. A desirable feature of this method is primal convergence: every accumulation point of the primal sequence (which is generated automatically during the process) is a primal solution. This property does not hold in general for existing variants of MSG. We propose two new variants of MSG that enjoy both primal and dual convergence whenever the dual optimal set is nonempty, and that use a very simple stepsize rule. Moreover, we also establish primal convergence when the dual optimal set is empty. Finally, our second variant of MSG converges in a finite number of steps.
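The flavor of the dual scheme can be illustrated on a toy one-dimensional problem. The sketch below is not the paper's algorithm; it is a minimal MSG-style iteration under stated assumptions: the sharp augmented Lagrangian L(x, u, c) = f(x) + c|h(x)| − u·h(x) for a single equality constraint h(x) = 0, an inner global minimization done by brute-force grid search (a stand-in for a real global solver), a hypothetical constant stepsize, and the standard dual update along the subgradient (−h(x_k), |h(x_k)|).

```python
import numpy as np

# Toy problem: minimize f(x) = x^4 - 3x^2 + x  subject to  h(x) = x - 1 = 0.
# f is nonconvex and its unconstrained global minimizer (x ~ -1.30) is
# infeasible, so the dual updates have real work to do.
def f(x):
    return x**4 - 3*x**2 + x

def h(x):
    return x - 1.0

def sharp_lagrangian(x, u, c):
    # Sharp augmented Lagrangian: f(x) + c|h(x)| - u h(x)
    return f(x) + c*np.abs(h(x)) - u*h(x)

grid = np.linspace(-2.0, 2.0, 401)   # crude global inner "solver"
u, c = 0.0, 0.0                      # dual pair: multiplier and penalty
step, eps = 1.0, 1.0                 # hypothetical constant stepsize choice

for k in range(50):
    # Inner step: global minimizer of the sharp Lagrangian (on the grid).
    xk = grid[np.argmin(sharp_lagrangian(grid, u, c))]
    if abs(h(xk)) < 1e-6:            # feasible primal point: stop
        break
    u -= step*h(xk)                  # subgradient step in the multiplier
    c += (step + eps)*abs(h(xk))     # enlarge the penalty parameter

print(k, xk, f(xk))
```

On this example the iteration stops at the constrained solution x = 1 after very few dual updates, echoing (informally) the finite-termination behavior the abstract claims for the second variant.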
Cite this article
Burachik, R.S., Iusem, A.N. & Melo, J.G. A primal dual modified subgradient algorithm with sharp Lagrangian. J Glob Optim 46, 347–361 (2010). https://doi.org/10.1007/s10898-009-9429-8
Keywords
- Nonsmooth optimization
- Nonconvex optimization
- Duality scheme
- Sharp Lagrangian
- Modified subgradient algorithm