A smoothing Newton method for second-order cone optimization based on a new smoothing function

https://doi.org/10.1016/j.amc.2011.06.015

Abstract

A new smoothing function is given in this paper by smoothing the symmetric perturbed Fischer–Burmeister function. Based on this new smoothing function, we present a smoothing Newton method for solving the second-order cone optimization (SOCO) problem. The method solves only one linear system of equations and performs only one line search at each iteration. Without requiring the strict complementarity assumption at the SOCO solution, the proposed algorithm is shown to be globally and locally quadratically convergent. Numerical results demonstrate that our algorithm is promising and comparable to interior-point methods.

Introduction

The second-order cone optimization (SOCO) problem is a convex optimization problem in which a linear function is minimized over the intersection of an affine linear manifold with the Cartesian product of second-order cones. In this paper we consider SOCO in the standard format

(P)  min { c^T x : Ax = b, x ∈ K },

and the dual problem of (P) is given by

(D)  max { b^T y : A^T y + s = c, s ∈ K },

where A ∈ R^{m×n}, c ∈ R^n and b ∈ R^m, and K ⊆ R^n is the Cartesian product of second-order cones, i.e.,

K = K^{n_1} × K^{n_2} × ⋯ × K^{n_r},

with n_1 + n_2 + ⋯ + n_r = n, and the n_i-dimensional second-order cone (SOC) K^{n_i} is defined by

K^{n_i} ≔ { (x_{i1}; x̄_i) ∈ R × R^{n_i−1} : x_{i1} ≥ ∥x̄_i∥ },

where ∥·∥ refers to the Euclidean norm of vectors. The interior of the SOC K^{n_i} is given by

K^{n_i}_+ ≔ { (x_{i1}; x̄_i) ∈ R × R^{n_i−1} : x_{i1} > ∥x̄_i∥ }.

For simplicity, we use “;” to join vectors in a column. Thus, for instance, for vectors x, y, and z we use (x; y; z) to represent (x^T, y^T, z^T)^T.
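For concreteness, here is a minimal membership check in a single three-dimensional SOC (an illustrative example of the definition above, not taken from the paper):

    \[
    x = (2;\,1,\,1) \in K^{3} \quad\text{since}\quad 2 \ge \|(1,1)\| = \sqrt{2},
    \qquad
    x' = (1;\,1,\,1) \notin K^{3} \quad\text{since}\quad 1 < \sqrt{2}.
    \]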

Without loss of generality, in this paper we assume that r = 1 and n_1 = n in the subsequent analysis, since our analysis can be easily extended to the general case.

Throughout the paper, we make the following assumptions:

Assumption 1.1

Both (P) and its dual (D) are strictly feasible.

Under Assumption 1.1, it is well known that both (P) and (D) have optimal solutions and their optimal values coincide [1], and the SOCO is equivalent to its optimality conditions

(3)  Ax = b,  A^T y + s = c,  x ∘ s = 0,  x, s ∈ K,  y ∈ R^m,

where x ∘ s = 0 is usually referred to as the complementarity condition.

SOCO problems include linear optimization problems, convex quadratic optimization problems and quadratically constrained convex quadratic optimization problems as special cases [1]. In recent years, the SOCO problem has received considerable attention from researchers because of its wide range of applications in many fields, such as engineering, optimal control and design, machine learning, robust optimization and combinatorial optimization (see, e.g., [8], [16], [21], [28], [30]). There is an extensive literature on interior-point methods (IPMs) for solving SOCO (see, e.g., [2], [10], [11], [15], [20], [22], [29], [34] and the references therein). IPMs typically deal with the following perturbation of the optimality conditions (3):

(4)  Ax = b,  A^T y + s = c,  x ∘ s = μe,  x, s ∈ K,  y ∈ R^m,

where μ > 0 and e ≔ (1; 0) ∈ R × R^{n−1}. These conditions are called the central path conditions, since they define a trajectory approaching the solution set as μ → 0. Conventional IPMs usually apply a Newton-type method to the equations in (4), with a suitable line search dealing with the constraints x ∈ K and s ∈ K explicitly.
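To make the last statement concrete, a generic Newton-type step for (4) at a current point (x, y, s) solves a linearized system of the following form (a sketch for illustration only; practical IPMs additionally apply a scaling of the complementarity equation, which we omit here):

    \[
    A\,\Delta x = b - Ax,\qquad
    A^{T}\Delta y + \Delta s = c - A^{T}y - s,\qquad
    s \circ \Delta x + x \circ \Delta s = \mu e - x \circ s .
    \]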

Recently, great attention has been paid to smoothing Newton methods, partly due to their encouraging convergence properties and numerical results (e.g., [3], [4], [12], [13], [17], [18], [23], [24], [25]). However, some algorithms [3], [24] strongly depend on the assumptions of uniform nonsingularity and strict complementarity to obtain global convergence and local superlinear (or quadratic) convergence. Without the uniform nonsingularity assumption, the algorithm given in [33] usually needs to solve two linear systems of equations and to perform at least two line searches per iteration. Qi et al. [25] proposed a class of new smoothing Newton methods for nonlinear complementarity problems and box constrained variational inequalities under a nonsingularity assumption, which was shown to be locally superlinearly/quadratically convergent without strict complementarity. Jiang [14] gave another technique to reformulate the nonlinear complementarity problem as a square system of equations; the main feature of the method in [14] is the introduction of the additional function e^μ − 1. Lastly, by modifying and extending the method of Qi et al. [25], Chi and Liu [6] proposed a one-step smoothing Newton method for the SOCO. The method in [6] can start from an arbitrary point and is quadratically convergent without strict complementarity. Chi and Liu [7] also presented a non-interior continuation method for the SOCO based on a new smoothing function. Fang et al. [9] gave another smoothing Newton-type method for the SOCO and proved the quadratic convergence of the algorithm without strict complementarity.

Motivated by their work, in this paper we introduce a new smoothing function by smoothing the symmetric perturbed Fischer–Burmeister function. Based on this smoothing function, we propose a smoothing Newton method for solving the SOCO. The new smoothing algorithm is based on the perturbed optimality conditions (4), and the main difference from IPMs is that we reformulate (4) as a nonlinear system of equations and then apply Newton’s method to this system. It is shown that our algorithm has the following good properties:

  • (i)

    The algorithm is well-defined and a solution of SOCO can be obtained from any accumulation point of the iteration sequence generated by the algorithm.

  • (ii)

    The algorithm can start from an arbitrary point, and does not require the initial point and iteration points to be in the sets of strictly feasible solutions.

  • (iii)

    It needs to solve only one system of linear equations and to perform only one line search at each iteration.

  • (iv)

    The global and local quadratic convergence of the algorithm are obtained without strict complementarity.

The paper is organized as follows. In the next section, we briefly introduce the Euclidean Jordan algebra associated with the SOC, which will be used in the subsequent sections. Based on the Fischer–Burmeister function, a new smoothing function and its properties are given in Section 3. In Section 4, we present a smoothing Newton method for solving the SOCO and state some preliminary results. The global convergence and local quadratic convergence of the algorithm are investigated in Section 5. Preliminary numerical results are reported in Section 6. The conclusions are given in Section 7.

Some notations used throughout the paper are as follows. R^n, R^n_+ and R^n_{++} denote the set of vectors with n components, the set of nonnegative vectors and the set of positive vectors, respectively. R^n × R^m is identified with R^{n+m}. I represents the identity matrix of suitable dimension. ∥·∥ denotes the 2-norm of a vector x, defined by ∥x∥ = √(x^T x). For any α, β > 0, α = O(β) (respectively, α = o(β)) means that α/β is uniformly bounded (respectively, tends to zero) as β → 0. For any x, y ∈ R^n, we write x ⪰_K y or y ⪯_K x (respectively, x ≻_K y or y ≺_K x) if x − y ∈ K or y − x ∈ K (respectively, x − y ∈ K_+ or y − x ∈ K_+). For any square matrix A ∈ R^{n×n}, we write A ⪰ 0 (respectively, A ≻ 0) if the symmetric part of A is positive semi-definite (respectively, positive definite).


Euclidean Jordan algebra associated with the SOC

Smoothing Newton methods for the SOCO are based on the Euclidean Jordan algebra associated with the SOC [1], [11]. For any vectors x = (x_1; x̄), s = (s_1; s̄) ∈ R × R^{n−1}, their Jordan product associated with the SOC K is defined by

x ∘ s ≔ (x^T s; x_1 s̄ + s_1 x̄).

One easily checks that (R^n, ∘) is a Euclidean Jordan algebra with the vector

e ≔ (1; 0; …; 0)

as identity element. We write x^2 to mean x ∘ x and write x + y to mean the usual componentwise addition of vectors. Then, we have the following basic properties.
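As a quick illustration (our own minimal sketch, not part of the original text), the Jordan product can be coded directly from this definition in MATLAB; the handle name jordan_prod is ours:

    % Jordan product associated with the SOC: x o s = (x^T s; x1*sbar + s1*xbar)
    jordan_prod = @(x, s) [x.' * s; x(1) * s(2:end) + s(1) * x(2:end)];

    % e = (1; 0; ...; 0) acts as the identity element: x o e = x
    n = 4;
    x = randn(n, 1);
    e = [1; zeros(n - 1, 1)];
    norm(jordan_prod(x, e) - x)   % equals 0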

Property 2.1

[1]

One has

  • (a)

    x ∘ e = x, ∀x ∈ R^n;

  • (b)

    x

A new smoothing function and its properties

In this section, we give a new smoothing function and its properties. One popular choice of SOC complementarity function is the Fischer–Burmeister (FB) function ϕ_FB(x, s): R^n × R^n → R^n defined by

ϕ_FB(x, s) = x + s − √(x^2 + s^2),

where the square and the square root are taken with respect to the Jordan product. In [12], it has been shown that ϕ_FB(x, s) satisfies the following important property:

ϕ_FB(x, s) = 0  ⟺  x ⪰_K 0,  s ⪰_K 0,  x ∘ s = 0.

The FB function has many interesting properties. In [23], Pan and Chen showed that ϕ_FB is semismooth, which leads to a semismooth Newton method for second-order cone complementarity problems.
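To make the formula concrete, the following MATLAB sketch (ours, not from the paper) evaluates ϕ_FB for a single SOC block using the standard spectral decomposition w = λ_1 u^{(1)} + λ_2 u^{(2)} of w = x^2 + s^2, with λ_{1,2} = w_1 ∓ ∥w̄∥ and u^{(1,2)} = ½(1; ∓w̄/∥w̄∥), so that √w = √λ_1 u^{(1)} + √λ_2 u^{(2)}:

    % Jordan square of x: x o x = (x^T x; 2*x1*xbar)
    jsq = @(x) [x.' * x; 2 * x(1) * x(2:end)];

    % Spectral square root of w = (w1; wbar) in K; max(.,eps) guards wbar = 0
    jsqrt = @(w) 0.5 * [ sqrt(w(1) - norm(w(2:end))) + sqrt(w(1) + norm(w(2:end))); ...
                         (sqrt(w(1) + norm(w(2:end))) - sqrt(w(1) - norm(w(2:end)))) ...
                         * w(2:end) / max(norm(w(2:end)), eps) ];

    % Fischer-Burmeister function for one SOC block
    phi_fb = @(x, s) x + s - jsqrt(jsq(x) + jsq(s));

    % Sanity check: x = (1;1,0) and s = (1;-1,0) satisfy x o s = 0, x, s in K,
    % so phi_fb(x, s) vanishes
    x = [1; 1; 0];  s = [1; -1; 0];
    norm(phi_fb(x, s))   % equals 0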

Algorithm description

Based on the smoothing function defined by (11), we propose a smoothing Newton method for the SOCO and subsequently show that the algorithm is well defined.

Let z ≔ (μ, x, y) ∈ R_{++} × R^n × R^m. By using the smoothing function (11), we define the function H(z): R_{++} × R^n × R^m → R_{++} × R^m × R^n by

H(z) = ( e^μ − 1;  b − Ax;  ϕ(μ, x, c − A^T y) ).
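A minimal sketch of assembling H(z) in MATLAB (our illustration; A, b, c are the problem data, and phi stands for the smoothing function (11), supplied as a function handle since its explicit form is given in Section 3):

    % Residual map H(z) for z = (mu, x, y); assumes A, b, c and a handle
    % phi(mu, x, s) for the smoothing function (11) exist in the workspace
    H = @(mu, x, y) [ exp(mu) - 1;                 % drives mu -> 0 at a root
                      b - A * x;                   % primal feasibility residual
                      phi(mu, x, c - A.' * y) ];   % smoothed complementarity

A Newton step in Algorithm 4.1 then amounts to solving one linear system with the Jacobian of this map, followed by one line search, as described in Section 4.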

In view of (10) and (19), z = (μ, x, y) is a solution of the system H(z) = 0 if and only if (x, y, c − A^T y) satisfies the optimality conditions (3). Therefore, z is a solution of H(z) = 0 if and only

Convergence analysis

In this section, we show that any accumulation point of the iteration sequence {z^k ≔ (μ_k, x_k, y_k)} is a solution of the system H(z) = 0. If an accumulation point z* satisfies a nonsingularity assumption, then the iteration sequence converges to z* locally quadratically, without strict complementarity. First, we prove the global convergence of Algorithm 4.1. We have the following results.

Theorem 5.1 Global convergence

Suppose that A has full row rank and that {zk} is the iteration sequence generated by Algorithm 4.1. Then any

Numerical results

In order to evaluate the efficiency of Algorithm 4.1, we conducted some numerical experiments. All experiments were performed on a personal computer (IBM R40e) with 512 MB of memory and an Intel(R) Pentium(R) 4 CPU at 2.00 GHz. The operating system was Windows XP (SP2) and the implementations were done in MATLAB 7.0.1.

In all these experiments, we chose x^0 = e ∈ R^n, y^0 = 0 ∈ R^m as the starting point. The parameters used in Algorithm 4.1 were μ_0 = 0.01, σ = 0.35, δ = 0.75, γ = 0.65. We used ∥H(z^k)∥ ≤ 10^{−6} as the

Conclusions

In this paper, we propose a new smoothing function by smoothing the symmetric perturbed Fischer–Burmeister function. Based on the new smoothing function, we present a smoothing Newton algorithm for solving the SOCO. The global convergence and local quadratic convergence of the algorithm are proved without strict complementarity. Numerical results show that our algorithm performs well.

It is obvious that the function e^μ − 1 plays an important role in analyzing our algorithm. One may find other

Acknowledgements

This work was supported by National Natural Science Foundation of China (10571109, 10971122), Natural Science Foundation of Shandong Province (Y2008A01) and Specialized Research Foundation for the Doctoral Program of Higher Education (20093718110005).

The authors thank the anonymous referees for their valuable comments and suggestions on the paper, which have considerably improved the paper.

References (34)

  • F.H. Clarke, Optimization and Nonsmooth Analysis, Wiley, New York, 1983. Reprinted by SIAM, Philadelphia,...
  • X.N. Chi et al., A non-interior continuation method for second-order cone programming, Optimization (2009).
  • R. Debnath et al., An efficient support vector machine learning method with second-order cone programming for large-scale problems, Applied Intelligence (2005).
  • J. Faraut et al., Analysis on Symmetric Cones (1994).
  • L. Faybusovich, Euclidean Jordan algebras and interior-point algorithms, Positivity (1997).
  • M. Fukushima et al., Smoothing functions for second-order-cone complementarity problems, SIAM Journal on Optimization (2002).
  • S. Hayashi et al., A combined smoothing and regularization method for monotone second-order cone complementarity problems, SIAM Journal on Optimization (2005).