A smoothing conjugate gradient method for solving systems of nonsmooth equations
Introduction
Suppose that $F:\mathbb{R}^n \to \mathbb{R}^n$ is continuous but not necessarily continuously differentiable. In this paper, we consider the following system of nonsmooth equations:
$$F(x) = 0. \tag{1}$$
Throughout this paper, we assume that (1) has at least one solution. Systems of nonsmooth equations have many applications; for example, variational inequality and complementarity problems, including many equilibrium problems in economics, can be cast into the form (1). Accordingly, numerical methods for solving (1) have been studied by many researchers (see [2], [8], [10], [11], [12], [13], for example).
Usually, to solve systems of equations, derivative-based methods such as Newton's method or Newton-like methods are widely used. However, such methods cannot be applied directly to problem (1) because $F$ is not differentiable, and thus smoothing methods are often used. Smoothing methods are based on a smoothing function, defined as follows.

Definition 1.1. A function $\tilde{F}: \mathbb{R}^n \times \mathbb{R}_{++} \to \mathbb{R}^n$ is said to be a smoothing function of $F$ when $\tilde{F}(\cdot,\mu)$ is continuously differentiable on $\mathbb{R}^n$ for any $\mu > 0$ and satisfies
$$\lim_{\mu \downarrow 0} \tilde{F}(x,\mu) = F(x)$$
for any $x$.
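As a concrete illustration of Definition 1.1 (not necessarily the smoothing function used in this paper), the Chen–Harker–Kanzow–Smale-type function $\sqrt{x_i^2 + 4\mu^2}$ smooths the componentwise absolute value $F(x) = |x|$: it is continuously differentiable in $x$ for any $\mu > 0$ and tends to $|x|$ as $\mu \downarrow 0$. A minimal Python sketch (all names are ours):

```python
import numpy as np

def F(x):
    """The nonsmooth mapping F(x) = |x| (componentwise), used only for illustration."""
    return np.abs(x)

def F_smooth(x, mu):
    """CHKS-type smoothing: smooth in x for mu > 0, tends to |x| as mu -> 0."""
    return np.sqrt(x**2 + 4.0 * mu**2)

x = np.array([-1.0, 0.0, 0.5])
for mu in (1.0, 1e-2, 1e-6):
    gap = np.max(np.abs(F_smooth(x, mu) - F(x)))
    print(f"mu = {mu:8.0e}, max gap = {gap:.2e}")   # gap -> 0 as mu -> 0
```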
Defining $z = (x^\top, \mu)^\top \in \mathbb{R}^{n+1}$ and a function $H: \mathbb{R}^{n+1} \to \mathbb{R}^{n+1}$ by
$$H(z) = \begin{pmatrix} \tilde{F}(x,\mu) \\ \mu \end{pmatrix}, \tag{2}$$
we have only to solve the system of equations $H(z) = 0$, instead of (1). Moreover, we define the merit function $\Psi: \mathbb{R}^{n+1} \to \mathbb{R}$ by
$$\Psi(z) = \|H(z)\|^2, \tag{3}$$
where $\|\cdot\|$ denotes the $\ell_2$-norm. Then (1) is equivalent to finding a global minimizer of the unconstrained optimization problem
$$\min_{z \in \mathbb{R}^{n+1}} \Psi(z). \tag{4}$$
Note that $\Psi$ is continuously differentiable on $\mathbb{R}^n \times \mathbb{R}_{++}$, but not necessarily continuously differentiable on the other region (namely, where $\mu \le 0$). Many researchers have proposed Newton's method or Newton-like methods based on (3), (4), and those are reviewed in [13].
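In code, $H$ and $\Psi$ are direct transcriptions of (2) and (3). The sketch below assumes the illustrative CHKS-type smoothing from the previous example; the function names are hypothetical:

```python
import numpy as np

def F_smooth(x, mu):
    """CHKS-type smoothing of |x| (an illustrative choice, not the paper's)."""
    return np.sqrt(x**2 + 4.0 * mu**2)

def H(z):
    """H(z) = (F~(x, mu), mu) with z = (x, mu); zero iff F(x) = 0 and mu = 0."""
    x, mu = z[:-1], z[-1]
    return np.append(F_smooth(x, mu), mu)

def Psi(z):
    """Merit function (3): Psi(z) = ||H(z)||^2."""
    h = H(z)
    return float(h @ h)

z = np.array([0.5, -0.25, 1e-3])   # x = (0.5, -0.25), mu = 1e-3
print(Psi(z))                       # positive here, since x is not a zero of F
```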
On the other hand, the (nonlinear) conjugate gradient method is an iterative method for solving the unconstrained optimization problem
$$\min_{x \in \mathbb{R}^n} f(x),$$
where $f: \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function whose gradient is available. Conjugate gradient methods are of the form
$$x_{k+1} = x_k + \alpha_k d_k,$$
where $x_k$ is the $k$th approximation to a solution, $\alpha_k$ is a positive step size and $d_k$ is a search direction generated by
$$d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1} \quad (k \ge 1),$$
with $g_k = \nabla f(x_k)$ and a parameter $\beta_k \in \mathbb{R}$. Since conjugate gradient methods do not need to store any matrices, they have attracted attention as effective methods for solving large-scale unconstrained optimization problems. It is known that the choice of $\beta_k$ affects the numerical performance of the method, and hence many researchers have studied effective choices of $\beta_k$ (see [1], [3], [4], [5], [6], for example). Recent developments of conjugate gradient methods and their global convergence properties are reviewed by Hager and Zhang [7]. There is a weakness of conjugate gradient methods, namely, most of them do not necessarily satisfy the descent condition $g_k^\top d_k < 0$. Recently, to overcome this weakness, some researchers proposed three-term conjugate gradient methods which always generate descent search directions [9], [14], [15].
As mentioned above, many researchers have studied smoothing Newton methods. However, these methods need to store matrices, and hence they cannot necessarily be applied to large-scale problems. In this paper, to develop an algorithm for solving large-scale problems, we incorporate the smoothing technique into the three-term conjugate gradient method given by Zhang et al. [14], and propose a smoothing conjugate gradient method which can solve (4), and hence (1), without storing any matrices. To this end, we recall here the search direction of the three-term conjugate gradient method of Zhang et al. [14]:
$$d_0 = -g_0, \qquad d_k = -g_k + \beta_k^{PRP} d_{k-1} - \theta_k y_{k-1} \quad (k \ge 1), \tag{5}$$
where
$$\beta_k^{PRP} = \frac{g_k^\top y_{k-1}}{\|g_{k-1}\|^2}, \qquad \theta_k = \frac{g_k^\top d_{k-1}}{\|g_{k-1}\|^2}, \qquad y_{k-1} = g_k - g_{k-1}.$$
It can easily be verified that this search direction always satisfies the sufficient descent condition in the sense that $g_k^\top d_k = -\|g_k\|^2$ whenever $g_{k-1} \neq 0$.
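The identity $g_k^\top d_k = -\|g_k\|^2$ holds because the $\beta_k^{PRP}$- and $\theta_k$-terms cancel in the inner product with $g_k$. This can be checked directly in a few lines of Python (function names are ours):

```python
import numpy as np

def three_term_direction(g, g_prev, d_prev):
    """Three-term direction of the form (5): d = -g + beta*d_prev - theta*y,
    with the PRP parameter beta; assumes g_prev != 0."""
    y = g - g_prev
    denom = g_prev @ g_prev            # ||g_{k-1}||^2
    beta = (g @ y) / denom             # beta_k^{PRP}
    theta = (g @ d_prev) / denom       # theta_k
    return -g + beta * d_prev - theta * y

rng = np.random.default_rng(0)
g, g_prev, d_prev = rng.standard_normal((3, 5))
d = three_term_direction(g, g_prev, d_prev)
# sufficient descent holds by construction: g'd = -||g||^2
print(np.isclose(g @ d, -(g @ g)))     # True
```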
This paper is organized as follows. In Section 2, we propose a smoothing conjugate gradient method for solving (1), based on (5). In Section 3, we prove the global convergence property of the proposed method. Finally, in Section 4, some preliminary numerical results are presented.
Now we give some notations and relations which will be used in the subsequent sections. We denote by $\|\cdot\|$ the matrix $\ell_2$-norm corresponding to the vector $\ell_2$-norm. For any real matrix $A$, we can express $\|A\| = \sigma_{\max}(A)$, where $\sigma_{\max}(A)$ and $\sigma_{\min}(A)$ are the largest and smallest singular values of $A$. Especially, when $A$ is nonsingular, $\|A^{-1}\| = 1/\sigma_{\min}(A)$. For any Fréchet-differentiable mapping $G: \mathbb{R}^n \to \mathbb{R}^m$, we denote its transposed Jacobian at $x$ by $\nabla G(x) \in \mathbb{R}^{n \times m}$; when $m = 1$, $\nabla G(x)$ means the gradient vector of $G$ at $x$. The gradient of a smoothing function $\tilde{F}$ of $F$ is given by
$$\nabla \tilde{F}(z) = \begin{pmatrix} \nabla_x \tilde{F}(z) \\ \nabla_\mu \tilde{F}(z) \end{pmatrix} \in \mathbb{R}^{(n+1) \times n}.$$
Also, by (3), we have
$$\nabla \Psi(z) = 2 \nabla H(z) H(z) = 2 \begin{pmatrix} \nabla_x \tilde{F}(z) \tilde{F}(z) \\ \nabla_\mu \tilde{F}(z) \tilde{F}(z) + \mu \end{pmatrix}.$$
Note that $\nabla_\mu \tilde{F}(z)$ is a row vector, and hence $\nabla_\mu \tilde{F}(z) \tilde{F}(z)$ is a scalar, while $\nabla_x \tilde{F}(z) \tilde{F}(z)$ is a column vector. We often write $\tilde{F}(z)$ instead of $\tilde{F}(x,\mu)$, and for simplicity, we put $g(z) = \nabla \Psi(z)$.
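The relation $\nabla\Psi(z) = 2\nabla H(z)H(z)$ is easy to sanity-check numerically with finite differences. The sketch below uses a toy smooth map in place of $H$; all names are ours:

```python
import numpy as np

# A toy smooth map on R^3 standing in for H(z) = (F~(x, mu), mu) with n = 2.
H = lambda z: np.array([np.sin(z[0]) + z[2] * z[1], z[1] ** 2 - z[0], z[2]])
Psi = lambda z: float(H(z) @ H(z))

def fd_grad(f, z, h=1e-6):
    """Central-difference gradient of a scalar function f."""
    g = np.zeros_like(z)
    for i in range(z.size):
        e = np.zeros_like(z); e[i] = h
        g[i] = (f(z + e) - f(z - e)) / (2.0 * h)
    return g

def fd_jacT(z, h=1e-6):
    """Central-difference transposed Jacobian of H (rows indexed by z-components)."""
    n = z.size
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        J[i, :] = (H(z + e) - H(z - e)) / (2.0 * h)
    return J

z = np.array([0.3, -1.2, 0.05])
print(np.allclose(fd_grad(Psi, z), 2.0 * fd_jacT(z) @ H(z), atol=1e-4))  # True
```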
Algorithm
In this section, we propose a smoothing conjugate gradient method for solving (4). The smoothing conjugate gradient method is an iterative method of the form
$$z_{k+1} = z_k + \alpha_k d_k, \tag{6}$$
where $z_k$ is the $k$th approximation to a solution of (4), $\alpha_k$ is a positive step size and $d_k$ is a search direction. Similarly to Section 1, we use the symbol $g(z) = \nabla \Psi(z)$. In addition, we often use the conventional abbreviation
$$\Psi_k = \Psi(z_k),$$
and we adopt the same manner for the other functions (e.g., $g_k = g(z_k)$).
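Before stating the algorithm in full, the overall shape of the iteration can be sketched in code. The following Python outline is our own simplification, not Algorithm SCG itself: it fixes a backtracking (Armijo) line search, omits the control of the smoothing parameter $\mu$ and the safeguards specified in this section, and uses hypothetical names throughout:

```python
import numpy as np

def smoothing_cg(Psi, grad_Psi, z0, tol=1e-8, max_iter=10_000, rho=1e-4, sigma=0.5):
    """Schematic loop of the form (6): z_{k+1} = z_k + alpha_k * d_k, with the
    three-term direction (5) and a simple Armijo backtracking line search.
    The smoothing parameter mu is the last component of z; here it is only
    driven toward zero implicitly through the merit function Psi."""
    z = np.asarray(z0, dtype=float)
    g = grad_Psi(z)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # backtracking line search: shrink alpha until the Armijo condition holds
        alpha, slope = 1.0, g @ d          # slope = -||g||^2 < 0 by construction
        while alpha > 1e-16 and Psi(z + alpha * d) > Psi(z) + rho * alpha * slope:
            alpha *= sigma
        z_new = z + alpha * d
        g_new = grad_Psi(z_new)
        y, denom = g_new - g, g @ g
        d = -g_new + ((g_new @ y) / denom) * d - ((g_new @ d) / denom) * y
        z, g = z_new, g_new
    return z
```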
As mentioned in Section 1, $\Psi$ is not necessarily continuously differentiable outside $\mathbb{R}^n \times \mathbb{R}_{++}$.
Global convergence
In this section, we prove the global convergence of Algorithm SCG. In the rest of this section, we assume that $g_k \neq 0$ for all $k$, without loss of generality. We first prove the following lemma.

Lemma 3.1. Suppose that Assumption 2.1 holds. If …, then the following holds: …

Proof. First, note from Remark 2 that $\{z_k\}$ is bounded and there exists at least one accumulation point. We also note that, since $\{\Psi(z_k)\}$ is bounded below and decreasing and satisfies $\Psi(z_k) > 0$, it has the positive limit $\Psi^*$.
Numerical results
In this section, we present some preliminary numerical results of Algorithm SCG. The program was coded in MATLAB R2009b, and computations were carried out on a FUJITSU FMV Esprimo (Intel Core2 Duo, 2.40 GHz × 2) with 4.0 GB RAM. The methods we tested are as follows:

SCG: Algorithm SCG.
SCG-q: Algorithm SCG with quadratic interpolation.
SNewton: the smoothing Newton method by Qi et al. [12], [13].
Hybrid: a hybrid method of SCG and SNewton.
Hybrid-q: a hybrid method of SCG-q and SNewton.
We compared the
Conclusion
In this paper, we have proposed a smoothing method for solving systems of nonsmooth equations, based on the three-term conjugate gradient method (5). This method treats the smoothing parameter $\mu$ as a variable. We have proven the global convergence of the method under some standard assumptions. Finally, we have given some numerical results of our method, and have confirmed that the proposed method is efficient for solving systems of nonsmooth equations.
Acknowledgments
The author would like to express appreciation to Dr. Hideho Ogasawara of Tokyo University of Science for his valuable comments, which helped to accomplish this paper. The author is supported in part by the Grant-in-Aid for Scientific Research (C) 21510164 of the Japan Society for the Promotion of Science.
References (15)
- Y.H. Dai, Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization (1999)
- F. Facchinei, J.-S. Pang, Finite-Dimensional Variational Inequalities and Complementarity Problems, Springer, New York (2003)
- R. Fletcher, C.M. Reeves, Function minimization by conjugate gradients, The Computer Journal (1964)
- J.A. Ford, Y. Narushima, H. Yabe, Multi-step nonlinear conjugate gradient methods for unconstrained minimization, Computational Optimization and Applications (2008)
- J.C. Gilbert, J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM Journal on Optimization (1992)
- W.W. Hager, H. Zhang, A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization (2005)
- W.W. Hager, H. Zhang, A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization (2006)