Global dissipativity of neural networks with both variable and unbounded delays
Introduction
In recent years, various neural network models have been extensively investigated and successfully applied to signal processing, pattern recognition, associative memory, and optimization problems. In such applications, it is of prime importance to ensure that the designed neural networks are stable. In hardware implementations, time delays occur due to the finite switching speed of amplifiers and to communication time. These delays may lead to oscillation, divergence, or instability, which can be harmful to a system [1]. On the other hand, it has also been shown that processing moving images requires the introduction of delay in the signals transmitted through the network [2]. Therefore, the study of the stability of neural networks with delays is of practical importance, and it has been studied extensively. Constant fixed time delays in models of delayed feedback systems serve as a good approximation in simple circuits with a small number of cells. Although delays arise frequently in practical applications, they are difficult to measure precisely. In most situations, delays are variable and, in fact, unbounded; that is, the entire history affects the present [8]. Such delay terms, more suitable for practical neural networks, are called unbounded delays. Therefore, studies of neural networks with time-varying delays and unbounded time delays are more important and relevant than those with constant delays [13], and the stability of neural networks with variable and/or unbounded delays has received much attention in the literature [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [22], [23], [24], [25], [26], [27], [28], [29], [30]. As pointed out in [21], global dissipativity is also an important concept for dynamical neural networks.
The concept of global dissipativity in dynamical systems is a more general one, and it has found applications in areas such as stability theory, chaos and synchronization theory, system norm estimation, and robust control [21]. The global dissipativity of several classes of neural networks was discussed in [21], where some sufficient conditions for the global dissipativity of neural networks with constant delays were derived. To the best of our knowledge, few authors have considered the global dissipativity of neural networks with both variable and unbounded delays. In this paper, we obtain some sufficient conditions ensuring the global dissipativity of neural networks with both variable and unbounded delays, and we compare the results with those presented in [21].
Section snippets
Preliminaries
In this paper, we consider the neural network model with both variable and unbounded delays [13]

\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}\, g_j\big(x_j(t - \tau_{ij}(t))\big) + \sum_{j=1}^{n} c_{ij} \int_{-\infty}^{t} k_{ij}(t - s)\, g_j(x_j(s))\, \mathrm{d}s + u_i, \qquad (1)

for i = 1, 2, … , n, where n denotes the number of neurons in the network; xi(t) is the state of the ith neuron at time t; di > 0 is the self-inhibition rate of the ith neuron; τij(t) ≥ 0 are the variable delays and kij(·) are the delay kernels of the unbounded delays; g(x(t)) = (g1(x1(t)), g2(x2(t)), … , gn(xn(t)))T is a vector-valued activation function; A = (aij)n×n, B = (bij)n×n and C = (cij)n×n are connection matrices; U = (u1, u2, … , un)T is the constant external input vector.
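For reference, the notion of dissipativity used below can be stated as follows. This phrasing is a sketch following the standard definition in [21]; it is not quoted from the present paper.

```latex
% Model (1) is globally dissipative if there exists a compact set
% S \subset \mathbb{R}^n that every trajectory eventually enters and stays in:
\forall x_0 \in \mathbb{R}^n \ \exists\, T(x_0) > 0 : \quad
x(t; x_0) \in S \quad \text{for all } t \ge T(x_0).
% S is then called a globally attractive set; S is positively invariant if
% x_0 \in S \ \Rightarrow\ x(t; x_0) \in S \ \text{for all } t \ge 0.
```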
Main results
In this section, we present the following results. Theorem 1. Let … hold; then neural network model (1) is a dissipative system and the set S = S1 ∩ S2 is a positively invariant and globally attractive set, where … Proof. First, we employ a radially unbounded and positive definite Lyapunov function … Computing its derivative along the positive half-trajectory of (1),
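The proof is truncated in this snippet. A schematic version of the standard argument, under assumptions not stated here (bounded activations |g_j(·)| ≤ M_j and normalized kernels ∫₀^∞ k_ij(s) ds = 1), runs as follows:

```latex
V(x) = \tfrac{1}{2}\sum_{i=1}^{n} x_i^2, \qquad
\dot V = \sum_{i=1}^{n} x_i \dot x_i
  \le -\sum_{i=1}^{n} d_i x_i^2
      + \sum_{i=1}^{n} |x_i|\, \omega_i, \qquad
\omega_i = \sum_{j=1}^{n} \big(|a_{ij}| + |b_{ij}| + |c_{ij}|\big) M_j + |u_i|.
```

Thus V̇ < 0 whenever |x_i| > ω_i/d_i for some i, which suggests a candidate attractive set of the form S = {x : |x_i| ≤ ω_i/d_i, i = 1, … , n}; the theorem's set S = S1 ∩ S2 refines this.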
Comparisons
Some well-known neural network models are special cases of model (1). For example, if di = 1, … , i = 1, 2, … , n, model (1) reduces to the model studied by Zhang [8], where some stability criteria were given; if cij = 0 and τij(t) = τij (τij constant), i, j = 1, 2, … , n, the dissipativity of model (1) was studied by Liao and Wang [21], where some dissipativity criteria were given. However, the following example shows a case whose dissipativity cannot be determined by Theorem 3 of [21] but can be determined by
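To make the dissipativity property concrete, here is a small numerical sketch. The two-neuron setup, all parameters, and the delay function are invented for illustration (they are not the example from this section), and the unbounded-delay term is omitted (cij = 0):

```python
import math

def simulate(T=30.0, dt=0.01):
    # Illustrative 2-neuron instance of a delayed network: all parameters
    # below are invented for this sketch, not taken from the paper.
    d = [1.0, 1.0]                           # self-inhibition rates d_i
    A = [[0.2, -0.1], [0.1, 0.2]]            # instantaneous weights a_ij
    B = [[-0.3, 0.1], [0.2, -0.2]]           # delayed weights b_ij
    u = [0.5, -0.5]                          # constant external inputs u_i
    g = math.tanh                            # bounded activation, |g| <= 1
    tau = lambda t: 1.0 + 0.5 * math.sin(t)  # variable delay, tau(t) <= 1.5
    steps = round(T / dt)
    hist = [[0.8, -0.6]]                     # constant initial history
    for k in range(steps):
        t = k * dt
        x = hist[-1]                         # current state x(t)
        # delayed state x(t - tau(t)), clamped to the stored history
        xd = hist[max(0, k - round(tau(t) / dt))]
        xn = []
        for i in range(2):
            rhs = -d[i] * x[i] + u[i]
            for j in range(2):
                rhs += A[i][j] * g(x[j]) + B[i][j] * g(xd[j])
            xn.append(x[i] + dt * rhs)
        hist.append(xn)
    return hist

traj = simulate()
# crude dissipativity check: every state stays inside a fixed ball
assert all(abs(s[0]) < 2.0 and abs(s[1]) < 2.0 for s in traj)
```

The bound 2 here is crude; the theorems of the paper give explicit attractive sets in terms of di, the connection weights, and the activation bounds.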
Conclusions
In this paper, global dissipativity is investigated for neural networks with both variable and unbounded delays. Five theorems are presented that characterize global dissipativity and global exponential dissipativity together with their sets of attraction. The theorems herein imply that the equilibrium of a neural network lies only in the positively invariant and globally attractive set, and that global asymptotic stability is equivalent to asymptotic stability within the attractive set. Therefore, our
Acknowledgments
This work was jointly supported by the National Natural Science Foundation of China under Grant 10475026; the Key Project of the Ministry of Education of China under Grant 03051; the Natural Science Foundation of Zhejiang Province, China, under Grant M103087; the Department of Education of Zhejiang Province under Grant 20020304; and the Natural Science Foundation of Huzhou City, Zhejiang Province, China, under Grant 2004SZX0703.
References (30)
- et al., Stability in asymmetric Hopfield nets with transmission delays, Physica D (1994)
- Global stability of neural networks with distributed delays, Neural Networks (2002)
- et al., On the stability analysis of delayed neural networks system, Neural Networks (2001)
- et al., Global exponential convergence analysis of delayed neural networks with time-varying delays, Phys. Lett. A (2003)
- et al., Global exponential stability of Hopfield neural networks with continuously distributed delays, Phys. Lett. A (2003)
- Absolute stability analysis in cellular neural networks with variable delays and unbounded delay, Comput. Math. Appl. (2004)
- et al., Absolutely exponential stability of a class of neural networks with unbounded delay, Neural Networks (2004)
- et al., Globally exponentially robust stability and periodicity of delayed neural networks, Chaos, Solitons & Fractals (2004)
- et al., Global exponential stability and existence of periodic solutions in BAM networks with delays and reaction-diffusion terms, Chaos, Solitons & Fractals (2005)
- Global stability of bidirectional associative memory neural networks with distributed delays, Phys. Lett. A (2002)
- Convergence dynamics of hybrid bidirectional associative memory neural networks with distributed delays, Phys. Lett. A
- Global robust stability of delayed recurrent neural networks, Chaos, Solitons & Fractals
- Global robust stability of interval cellular neural networks with time-varying delays, Chaos, Solitons & Fractals
- Absolute exponential stability of recurrent neural networks with time delays and Lipschitz-continuous activation functions, Neural Networks
- An estimation of the domain of attraction and convergence rate for Hopfield continuous feedback neural networks, Phys. Lett. A