
A convergent decomposition method for box-constrained optimization problems

  • Original Paper
  • Published:
Optimization Letters

Abstract

In this work we consider the problem of minimizing a continuously differentiable function over a feasible set defined by box constraints. We present a decomposition method based on the solution of a sequence of subproblems. In particular, we state conditions on the rule for selecting the subproblem variables that are sufficient to ensure the global convergence of the generated sequence without convexity assumptions. The conditions require selecting suitable variables (related to the violation of the optimality conditions) to guarantee the theoretical convergence properties, while leaving the freedom to select any other group of variables in order to accelerate the convergence.
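A minimal sketch of how such a scheme might look is given below, assuming componentwise box constraints l ≤ x ≤ u. It builds the working set from the coordinates that most violate the projected-gradient stationarity conditions, optionally enlarges it with arbitrary user-chosen coordinates (the degree of freedom mentioned above), and then approximately solves the subproblem in those variables while the others stay fixed. All identifiers (grad_f, block_size, extra_indices, the fixed-step inner loop) are illustrative assumptions, not the selection rule actually analyzed in the paper.

```python
import numpy as np

def kkt_violation(x, g, lower, upper):
    # Componentwise violation of stationarity for min f(x) s.t. lower <= x <= upper:
    # x is stationary iff x - proj_[lower, upper](x - grad f(x)) = 0.
    return np.abs(x - np.clip(x - g, lower, upper))

def decomposition_method(grad_f, x0, lower, upper,
                         block_size=2, extra_indices=None,
                         max_iter=200, tol=1e-6):
    # lower, upper: numpy arrays of componentwise bounds.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    for _ in range(max_iter):
        viol = kkt_violation(x, grad_f(x), lower, upper)
        if viol.max() <= tol:                  # approximate stationarity reached
            break
        # Working set: coordinates with the largest optimality violation ...
        work = list(np.argsort(-viol)[:block_size])
        # ... freely enlarged with any other coordinates (the "degree of freedom").
        if extra_indices is not None:
            work = sorted(set(work) | set(extra_indices))
        # Approximately solve the subproblem in the working-set variables only,
        # here with a few fixed-step projected-gradient steps (placeholder solver).
        for _ in range(10):
            g_w = grad_f(x)[work]
            x[work] = np.clip(x[work] - 1e-2 * g_w, lower[work], upper[work])
    return x

# Example: minimize 0.5*||x - c||^2 over the box [0, 1]^5.
c = np.array([2.0, -1.0, 0.3, 0.7, 5.0])
lower, upper = np.zeros(5), np.ones(5)
x_star = decomposition_method(lambda x: x - c, np.full(5, 0.5), lower, upper)
print(x_star)   # -> approximately [1, 0, 0.3, 0.7, 1]
```

In an actual implementation the placeholder inner loop would be replaced by any subproblem solver guaranteeing sufficient decrease on the working-set variables, for instance a projected line search or a Newton-type step restricted to the selected block.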




Author information


Corresponding author

Correspondence to Marco Sciandrone.


About this article

Cite this article

Cassioli, A., Sciandrone, M. A convergent decomposition method for box-constrained optimization problems. Optim Lett 3, 397–409 (2009). https://doi.org/10.1007/s11590-009-0119-8

