Global dissipativity of neural networks with both variable and unbounded delays

https://doi.org/10.1016/j.chaos.2004.11.035

Abstract

In this paper, the dissipativity of neural networks with both variable and unbounded delays is investigated. By constructing proper Lyapunov functions and using some analytic techniques, several sufficient conditions are given to ensure the dissipativity of such networks. The results extend and improve upon earlier published results. An example is given to show the effectiveness of the obtained results.

Introduction

In recent years, various neural network models have been extensively investigated and successfully applied to signal processing, pattern recognition, associative memory, and optimization problems. In such applications, it is of prime importance to ensure that the designed neural networks are stable. In hardware implementations, time delays occur due to the finite switching speed of amplifiers and to communication time. These delays may lead to oscillation, divergence, or instability, which can be harmful to a system [1]. On the other hand, it has also been shown that processing moving images requires the introduction of delay into the signals transmitted through the network [2]. The study of the stability of delayed neural networks is therefore of practical importance, and it has been pursued extensively. Constant, fixed time delays in models of delayed feedback systems serve as a good approximation in simple circuits having a small number of cells. However, although delays arise frequently in practical applications, they are difficult to measure precisely; in most situations delays are variable and, in fact, unbounded, in the sense that the entire history affects the present [8]. Such delay terms, which are better suited to practical neural networks, are called unbounded delays. The study of neural networks with time-varying delays and unbounded delays is thus more important and more realistic than that of networks with constant delays [13], and the stability of neural networks with variable and/or unbounded delays has received much attention in the literature [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18], [19], [20], [22], [23], [24], [25], [26], [27], [28], [29], [30].

As pointed out in [21], global dissipativity is also an important concept in dynamical neural networks. Global dissipativity is a more general concept than stability and has found applications in areas such as stability theory, chaos and synchronization theory, system norm estimation, and robust control [21]. The global dissipativity of several classes of neural networks was discussed in [21], where some sufficient conditions for the global dissipativity of neural networks with constant delays were derived. To the best of our knowledge, however, few authors have considered the global dissipativity of neural networks with both variable and unbounded delays. In this paper, we obtain sufficient conditions ensuring the global dissipativity of neural networks with both variable and unbounded delays, and we compare the results with those presented in [21].

Preliminaries

In this paper, we consider the neural network model with both variable and unbounded delays [13]:

$$\frac{dx_i(t)}{dt} = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, g_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}\, g_j\big(x_j(t-\tau_{ij}(t))\big) + \sum_{j=1}^{n} c_{ij} \int_{-\infty}^{t} K_{ij}(t-s)\, g_j(x_j(s))\, ds + u_i, \quad i = 1, 2, \ldots, n, \qquad (1)$$

where $n$ denotes the number of neurons in the network; $x_i(t)$ is the state of the $i$th neuron at time $t$; $g(x(t)) = (g_1(x_1(t)), g_2(x_2(t)), \ldots, g_n(x_n(t)))^T$ is a vector-valued activation function; $A = (a_{ij})_{n\times n}$, $B = (b_{ij})_{n\times n}$ and $C = (c_{ij})_{n\times n}$ are connection matrices; and $U = (u_1, u_2, \ldots, u_n)^T$ is the constant input vector.
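To make the model concrete, the following Python sketch integrates (1) by forward Euler for $n = 2$. All parameter values, the tanh activation, the sinusoidal variable delay $\tau_{ij}(t)$, and the exponential kernel $K_{ij}(s) = e^{-s}$ are illustrative assumptions, not taken from the paper; with this particular kernel the distributed-delay term satisfies the auxiliary ODE $\dot z_j = -z_j + g_j(x_j)$, which avoids storing the full history for that term.

```python
# Forward-Euler simulation of model (1) for n = 2 -- an illustrative sketch.
# Assumptions (not from the paper): tanh activations (bounded, k_j = 1),
# tau_ij(t) = 0.5*(1 + sin t) as a hypothetical variable delay, and
# exponential kernel K_ij(s) = exp(-s), so the unbounded-delay term
# reduces to the memory trace z_j' = -z_j + g_j(x_j).
import numpy as np

n, dt, T = 2, 0.001, 50.0
steps = int(T / dt)

d = np.array([1.0, 1.0])                 # self-decay rates d_i
A = np.array([[0.1, -0.2], [0.3, 0.1]])  # instantaneous weights a_ij
B = np.array([[-0.2, 0.1], [0.1, -0.3]]) # delayed weights b_ij
C = np.array([[0.1, 0.0], [0.0, 0.1]])   # distributed-delay weights c_ij
u = np.array([0.5, -0.4])                # constant input u_i
g = np.tanh                              # bounded activation, |g_j| <= 1

tau_max = 1.0
hist_len = int(tau_max / dt) + 1
x_hist = np.zeros((hist_len, n))         # zero initial history; newest at index -1
x = np.zeros(n)
z = np.zeros(n)                          # z_j(t) = int_{-inf}^t e^{-(t-s)} g_j(x_j(s)) ds

for k in range(steps):
    t = k * dt
    tau = 0.5 * (1.0 + np.sin(t))        # tau_ij(t), taken identical for all i, j here
    lag = min(int(tau / dt), hist_len - 1)
    x_delayed = x_hist[-1 - lag]         # x_j(t - tau(t)) from the history buffer

    dx = -d * x + A @ g(x) + B @ g(x_delayed) + C @ z + u
    z += dt * (-z + g(x))                # equivalent ODE for the exponential-kernel term
    x += dt * dx
    x_hist = np.roll(x_hist, -1, axis=0)
    x_hist[-1] = x

print("final state x(T) =", x)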

Main results

In this section, we present the following results:

Theorem 1

Let $g(x) \in \mathcal{B}$. Then neural network model (1) is a dissipative system and the set $S = S_1 \cap S_2$ is a positive invariant and globally attractive set, where

$$S_1 = \left\{ x : \sum_{i=1}^{n} d_i \left( |x_i| - \frac{1}{2d_i}\left[ \sum_{j=1}^{n} \big(|a_{ij}| + |b_{ij}| + |c_{ij}|\big) k_j + |u_i| \right] \right)^{2} \le \sum_{i=1}^{n} \frac{1}{4d_i} \left( \sum_{j=1}^{n} \big(|a_{ij}| + |b_{ij}| + |c_{ij}|\big) k_j + |u_i| \right)^{2} \right\},$$

$$S_2 = \left\{ x : |x_i| \le \frac{1}{d_i} \left( \sum_{j=1}^{n} \big(|a_{ij}| + |b_{ij}| + |c_{ij}|\big) k_j + |u_i| \right),\ i = 1, 2, \ldots, n \right\}.$$
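As an illustration, the componentwise radius of $S_2$ can be computed directly from the network data. The sketch below reuses the illustrative parameters from the simulation sketch above (with $k_j = 1$ for the tanh activation); these values are assumptions, not taken from the paper's example.

```python
# Componentwise bound of the attractive set S2 from Theorem 1:
# |x_i| <= (1/d_i) * ( sum_j (|a_ij| + |b_ij| + |c_ij|) * k_j + |u_i| ).
# Parameters are the illustrative ones from the simulation sketch above.
import numpy as np

d = np.array([1.0, 1.0])
A = np.array([[0.1, -0.2], [0.3, 0.1]])
B = np.array([[-0.2, 0.1], [0.1, -0.3]])
C = np.array([[0.1, 0.0], [0.0, 0.1]])
u = np.array([0.5, -0.4])
k = np.ones(2)   # k_j: assumed bounds on |g_j|, equal to 1 for tanh

r = ((np.abs(A) + np.abs(B) + np.abs(C)) @ k + np.abs(u)) / d
print("S2: |x_i| <=", r)
```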

Proof

First, we employ the radially unbounded and positive definite Lyapunov function $V(x) = \frac{1}{2}\sum_{i=1}^{n} x_i^{2}$.

Computing $dV/dt$ along the positive half-trajectory of (1) yields the following estimate.
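A sketch of how the estimate typically proceeds, assuming the standard hypotheses $|g_j(\cdot)| \le k_j$ and $\int_0^{\infty} K_{ij}(s)\,ds = 1$ for the kernels (our assumptions here, writing $\alpha_i$ for the bracketed constant of Theorem 1):

```latex
\begin{aligned}
\frac{dV}{dt}
  &= \sum_{i=1}^{n} x_i(t)\,\frac{dx_i(t)}{dt} \\
  &\le \sum_{i=1}^{n} \Big[ -d_i x_i^2(t)
      + |x_i(t)| \Big( \sum_{j=1}^{n} \big(|a_{ij}|+|b_{ij}|+|c_{ij}|\big) k_j + |u_i| \Big) \Big] \\
  &= -\sum_{i=1}^{n} d_i \Big( |x_i(t)| - \frac{\alpha_i}{2 d_i} \Big)^2
      + \sum_{i=1}^{n} \frac{\alpha_i^2}{4 d_i},
  \qquad \alpha_i := \sum_{j=1}^{n} \big(|a_{ij}|+|b_{ij}|+|c_{ij}|\big) k_j + |u_i|.
\end{aligned}
```

The right-hand side is negative whenever $x$ lies outside $S_1$, which gives the positive invariance and global attractivity of $S_1$; a separate componentwise estimate of the form $d|x_i|/dt \le -d_i|x_i| + \alpha_i$ yields the bound defining $S_2$.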

Comparisons

Several well-known neural network models arise as special cases of model (1). For example, when $d_i = 1$ and $g_i(\theta) = \frac{1}{2}(|\theta+1| - |\theta-1|)$, $i = 1, 2, \ldots, n$, model (1) reduces to the model studied by Zhang [8], where several stability criteria were given; when $c_{ij} = 0$ and $\tau_{ij}(t) = \tau_{ij}$ (with $\tau_{ij}$ constant), $i, j = 1, 2, \ldots, n$, the dissipativity of model (1) was studied by Liao and Wang [21], where several dissipativity criteria were derived. The following example, however, exhibits a network whose dissipativity cannot be determined by Theorem 3 of [21] but can be determined by the results of this paper.

Conclusions

In this paper, the global dissipativity of neural networks with both variable and unbounded delays is investigated. Five theorems are presented that characterize global dissipativity and global exponential dissipativity together with their sets of attraction. These theorems imply that the equilibrium of such a neural network can lie only in the positive invariant and globally attractive set, so global asymptotic stability is equivalent to asymptotic stability within the attractive set. Therefore, our results extend and improve upon those in the earlier literature.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China under Grant 10475026, the Key Project of the Ministry of Education of China under Grant 03051, the Natural Science Foundation of Zhejiang Province, China, under Grant M103087, the Department of Education of Zhejiang Province under Grant 20020304, and the Natural Science Foundation of Huzhou City, Zhejiang Province, China, under Grant 2004SZX0703.
