
ISA Transactions

Volume 59, November 2015, Pages 133-148

Research Article
New passivity criteria for memristive uncertain neural networks with leakage and time-varying delays

https://doi.org/10.1016/j.isatra.2015.09.008

Highlights

  • The studied system contains not only leakage delay but also parameter uncertainties.

  • The Wirtinger-based double integral inequality is applied to the studied system.

  • A triple quadratic integral is introduced and the restriction on some matrices is relaxed.

  • The obtained sufficient conditions require neither the differentiability of the time-varying delays nor the boundedness or monotonicity of the activation functions.

Abstract

In this paper, the problem of passivity analysis is studied for memristor-based uncertain neural networks with leakage and time-varying delays. By combining differential inclusions with set-valued maps, the memristive neural network is transformed into a conventional system. By adding a triple quadratic integral and relaxing the requirement for the positive definiteness of some matrices, a proper Lyapunov–Krasovskii functional is constructed. Based on this novel Lyapunov–Krasovskii functional, new passivity criteria are derived mainly by applying the Wirtinger-based double integral inequality, the S-procedure and related techniques, so that the conservatism of the passivity conditions is reduced. Finally, four numerical examples are given to show the effectiveness and reduced conservatism of the proposed criteria.

Introduction

It is well known that neural networks have been widely applied in various areas such as signal processing, pattern recognition and optimization. Much of the recent work concentrates on improving stability conditions for neural networks, because time delays cannot be eliminated and often cause instability. Thus, in order to apply neural networks reliably, stability analysis has become increasingly important, and many relevant results have been reported in the literature [1], [2], [3], [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15]. Meanwhile, many researchers have investigated related dynamic behaviors such as H∞ control [25], [37], [38], state estimation [29], [32], [33], [36] and so on.

On one hand, as mentioned in [10], most recent results on such dynamic systems are built on the Lyapunov–Krasovskii functional method, and within this framework the conservatism of the results is a key factor in judging whether one set of delay-dependent criteria is better than another. To derive less conservative results, one should understand where the conservatism of the Lyapunov–Krasovskii functional method comes from. Generally speaking, there are two main sources: the construction of the Lyapunov–Krasovskii functional itself, and the bounding of its derivative. Constructing a proper Lyapunov–Krasovskii functional is therefore crucial for obtaining less conservative criteria. Two ways are commonly suggested: one is to enrich the functional by adding delay-dependent terms to the state vector, and the other is to partition the delay into many segments. Besides, the derivative of the Lyapunov–Krasovskii functional must be bounded carefully; when the derivative is taken, integral terms emerge that have to be estimated. Two main techniques are usually used for such integrals: the integral inequality method and the free-weighting matrix method, where free-weighting matrices are typically introduced through zero equalities and the S-procedure. Very recently, a new integral inequality, the Wirtinger-based double integral inequality, has been proposed, which yields less conservative criteria than the Jensen inequality. How to construct a proper Lyapunov–Krasovskii functional by adding delay-dependent terms such as a triple quadratic integral is still an open question, and how to obtain less conservative results by effectively applying the S-procedure and the Wirtinger-based double integral inequality to such a functional remains an interesting challenge.
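For concreteness, the two double-integral bounds contrasted above can be sketched as follows; the normalization used here may differ from the lemma actually invoked in the paper, so the constants should be checked against the cited source. For a symmetric matrix $M>0$ and a continuous function $x:[a,b]\to\mathbb{R}^n$ with $\chi_1=\int_a^b\int_u^b x(s)\,ds\,du$, the Jensen-type double integral inequality states
$$\int_a^b\int_u^b x^T(s)\,M\,x(s)\,ds\,du \;\ge\; \frac{2}{(b-a)^2}\,\chi_1^T M \chi_1.$$
The Wirtinger-based double integral inequality tightens this bound by adding a nonnegative correction term built from a triple integral,
$$\int_a^b\int_u^b x^T(s)\,M\,x(s)\,ds\,du \;\ge\; \frac{2}{(b-a)^2}\,\chi_1^T M \chi_1 + \frac{16}{(b-a)^2}\,\chi_2^T M \chi_2, \qquad \chi_2=-\chi_1+\frac{3}{b-a}\int_a^b\int_u^b\int_s^b x(r)\,dr\,ds\,du,$$
which is why it yields less conservative criteria than the Jensen bound when the derivative of a triple-integral Lyapunov–Krasovskii term has to be estimated.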

On the other hand, it is well known that passivity theory plays an important role in the stability analysis of dynamical systems, nonlinear control and other areas; passive properties can make a system internally stable. During the past several years, the passivity problem for time-delay systems has been investigated in the literature [16], [17], [18], [19], [20], [21], [22], where the passivity of neural networks with time-varying delays was studied and corresponding criteria for checking passivity were given. It is worth noting that the passivity conditions in [16], [17], [18], [19], [20], [21], [22] were derived on the basis of quadratic Lyapunov–Krasovskii functionals in which the involved symmetric matrices were always assumed to be positive definite. The results in [16], [17], [18], [19], [20], [21], [22] may therefore still be conservative, and it is natural to ask whether they can be improved by relaxing these conditions. Recently, in [23], the authors revisited the problem of passivity analysis for neural networks and gave relaxed passivity conditions.
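To fix ideas, the passivity notion commonly adopted in [16], [17], [18], [19], [20], [21], [22], [23] is the following: under zero initial conditions, a system with input $u(t)$ and output $y(t)$ is said to be passive if there exists a scalar $\gamma\ge 0$ such that
$$2\int_0^{t_p} y^T(s)\,u(s)\,ds \;\ge\; -\gamma\int_0^{t_p} u^T(s)\,u(s)\,ds \qquad \text{for all } t_p\ge 0.$$
Passivity criteria are then sufficient conditions, usually expressed as linear matrix inequalities, under which this dissipation inequality is guaranteed.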

However, it is worth pointing out that the criteria given in [16], [17], [18], [19], [20], [21], [22], [23] rest on the following assumptions: firstly, the time-varying delays are continuously differentiable; secondly, the derivative of the time-varying delay is bounded and smaller than one; thirdly, the activation functions are monotonically nondecreasing. In practice, time delays can occur in an irregular fashion and the time-varying delays are sometimes not differentiable, so these assumptions are too restrictive to be applied extensively. In addition, many researchers have recently paid close attention to a class of state-dependent switching systems, memristor-based neural networks, whose connection weights vary according to the state (see [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50]). In particular, dynamic behaviors such as stability [50], synchronization [42], [43], [46], passivity [44], [45], [49], state estimation [47] and dissipativity [48] have been studied by many scholars. However, the above-mentioned works on memristive neural networks rarely address the problems caused by both parameter uncertainties and leakage delay. It is well known that parameter uncertainties are inherent features of many physical systems and may cause instability and poor performance; such uncertainties may arise from variations in system parameters, modeling errors or ignored factors [24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35]. Moreover, since a time delay in the stabilizing negative feedback term tends to destabilize a system [24], [26], [27], [32], [33], [34], [35], the leakage delay also has a great impact on the dynamics of neural networks. Therefore, it is necessary to study the passivity problem for memristor-based neural networks with parameter uncertainties and leakage delay.

Motivated by the above discussions, relaxed passivity conditions are derived in this paper for memristive uncertain neural networks with leakage and time-varying delays. The main contribution lies in the following five aspects: firstly, the studied memristive neural networks contain not only leakage delay but also parameter uncertainties; secondly, a new Lyapunov–Krasovskii functional is constructed by introducing a triple quadratic integral and relaxing the restriction on the positive definiteness of some matrices; thirdly, to handle the time-derivative of the triple integral, the Wirtinger-based double integral inequality is applied to bound the corresponding double integral from below; fourthly, to relax the relevant conditions on the positive definiteness of some matrices, the Jensen inequality is combined with the Schur complement to prove that the corresponding Lyapunov–Krasovskii functional V(t) is positive definite; finally, the obtained sufficient conditions require neither the differentiability of the time-varying delays nor the boundedness or monotonicity of the activation functions. Four examples are given at the end to show the effectiveness and reduced conservatism of the proposed criteria.
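Two of the auxiliary tools named in the contributions can be stated briefly. The Jensen (single) integral inequality asserts that, for a symmetric matrix $R>0$, a scalar $h>0$ and an integrable function $x$,
$$h\int_{t-h}^{t} x^T(s)\,R\,x(s)\,ds \;\ge\; \Big(\int_{t-h}^{t} x(s)\,ds\Big)^T R \Big(\int_{t-h}^{t} x(s)\,ds\Big),$$
while the Schur complement lemma states that, for symmetric blocks $A$ and $C$,
$$\begin{bmatrix} A & B\\ B^T & C\end{bmatrix} > 0 \iff C>0 \ \text{and}\ A-BC^{-1}B^T>0.$$
Combining the two makes it possible to show that the constructed functional V(t) remains positive definite even though some of the individual matrices appearing in it are not required to be.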

Notations: Throughout this paper, the superscripts ‘−1’ and ‘T’ stand for the inverse and transpose of a matrix, respectively. $[\cdot,\cdot]$ represents an interval. $Q>0$ ($Q\ge 0$, $Q<0$, $Q\le 0$) means that the matrix $Q$ is symmetric positive definite (positive semi-definite, negative definite, negative semi-definite). $\|\cdot\|$ refers to the Euclidean vector norm. $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space. $C([-\rho,0],\mathbb{R}^n)$ represents the Banach space of all continuous functions from $[-\rho,0]$ to $\mathbb{R}^n$. $\mathbb{R}^{m\times n}$ is the set of $m\times n$ real matrices. $*$ denotes the symmetric block in a symmetric matrix. For matrices $S=(s_{ij})_{m\times n}$ and $T=(t_{ij})_{m\times n}$, $S\ge T$ ($S\le T$) means that $s_{ij}\ge t_{ij}$ ($s_{ij}\le t_{ij}$) for $i=1,2,\ldots,m$, $j=1,2,\ldots,n$. The interval matrix $[S,T]$ is defined for $S\le T$; $V=(v_{ij})_{m\times n}\in[S,T]$ means $S\le V\le T$, i.e., $s_{ij}\le v_{ij}\le t_{ij}$ for $i=1,2,\ldots,m$, $j=1,2,\ldots,n$. $\mathrm{co}\{\Pi_1,\Pi_2\}$ denotes the closure of the convex hull generated by the real numbers $\Pi_1$ and $\Pi_2$. Let $\bar{c}_i=\max\{\dot{c}_i,\ddot{c}_i\}$, $\underline{c}_i=\min\{\dot{c}_i,\ddot{c}_i\}$, $\bar{a}_{ij}=\max\{\dot{a}_{ij},\ddot{a}_{ij}\}$, $\underline{a}_{ij}=\min\{\dot{a}_{ij},\ddot{a}_{ij}\}$, $\bar{b}_{ij}=\max\{\dot{b}_{ij},\ddot{b}_{ij}\}$, $\underline{b}_{ij}=\min\{\dot{b}_{ij},\ddot{b}_{ij}\}$. Matrix dimensions, if not explicitly stated, are assumed to be compatible for algebraic operations.

Section snippets

Problem statement and preliminaries

In this section, a general class of memristive neural networks is introduced as follows:
$$\dot{x}_i(t) = -\tilde{c}_i(x_i(t))\,x_i(t-\delta) + \sum_{j=1}^{n}\tilde{a}_{ij}(x_i(t))\,f_j(x_j(t)) + \sum_{j=1}^{n}\tilde{b}_{ij}(x_i(t))\,f_j(x_j(t-\tau_j(t))) + u_i(t), \quad t\ge 0,\ i=1,2,\ldots,n,$$
$$y_i(t) = f_i(x_i(t)), \quad t\ge 0,\ i=1,2,\ldots,n,$$
$$x_i(t) = \phi_i(t), \quad t\in[-\rho,0],\ \rho=\max\{\delta,\tau\},$$
where $x_i(t)$ stands for the neuron state of the system, $f_i(x_i(t))\in\mathbb{R}^n$ and $f_i(x_i(t-\tau_i(t)))\in\mathbb{R}^n$ are the nonlinear activation functions without and with time-varying delay, respectively, $u_i(t)$ is the input vector, $y_i(t)\in\mathbb{R}^n$ is the output vector
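Although the snippet above is truncated, the structure of the memristive weights can be made explicit. In this class of models the connection weights typically switch between two values according to the state, e.g. (with a model-specific switching threshold $\mathrm{T}_i>0$)
$$\tilde{c}_i(x_i(t))=\begin{cases}\dot{c}_i, & |x_i(t)|\le \mathrm{T}_i,\\ \ddot{c}_i, & |x_i(t)|>\mathrm{T}_i,\end{cases}\qquad \tilde{a}_{ij}(x_i(t))=\begin{cases}\dot{a}_{ij}, & |x_i(t)|\le \mathrm{T}_i,\\ \ddot{a}_{ij}, & |x_i(t)|>\mathrm{T}_i,\end{cases}$$
and analogously for $\tilde{b}_{ij}(x_i(t))$. Using the theory of differential inclusions and set-valued maps, the discontinuous right-hand side is embedded in its convex hull,
$$\dot{x}_i(t) \in -\mathrm{co}\{\underline{c}_i,\bar{c}_i\}\,x_i(t-\delta) + \sum_{j=1}^{n}\mathrm{co}\{\underline{a}_{ij},\bar{a}_{ij}\}\,f_j(x_j(t)) + \sum_{j=1}^{n}\mathrm{co}\{\underline{b}_{ij},\bar{b}_{ij}\}\,f_j(x_j(t-\tau_j(t))) + u_i(t),$$
with $\underline{c}_i,\bar{c}_i,\underline{a}_{ij},\bar{a}_{ij},\underline{b}_{ij},\bar{b}_{ij}$ as in the Notations; this is the sense in which the memristive system is converted into a conventional uncertain one.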

Main results

In this paper, the new stability criteria and passivity conditions are investigated for memristive uncertain neural networks with both leakage and time-varying delays by constructing a novel quadratic functional. For the sake of simplicity of matrix representation, $e_i$ $(i=1,\ldots,12)$ are defined as block entry matrices (for example, $e_1^T=[I,0,0,0,0,0,0,0,0,0,0,0]$). The notations for some matrices are defined in the Appendix.
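As a small numerical illustration (not taken from the paper; the block size and dimension here are hypothetical), the block entry matrices can be built so that $e_i^T\xi$ extracts the $i$-th sub-block of an augmented vector $\xi$; a minimal Python sketch:

    import numpy as np

    def block_entry(i, num_blocks=12, n=2):
        """Return e_i: a (num_blocks*n) x n matrix whose i-th n x n block is the
        identity, so that e_i.T @ xi picks the i-th sub-vector of a stacked xi."""
        E = np.zeros((num_blocks * n, n))
        E[(i - 1) * n : i * n, :] = np.eye(n)
        return E

    e1 = block_entry(1)
    print(e1.T)   # rows [1, 0, ..., 0] and [0, 1, 0, ..., 0], i.e. e1^T = [I, 0, ..., 0]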

Numerical examples

In this section, four examples are presented to demonstrate the effectiveness of the corresponding results, as follows.

Example 4.1

Consider a three-neuron memristive neural network (49):
$$\dot{x}_1(t) = -\breve{c}_1(x_1(t))\,x_1(t-\delta) + \breve{a}_{11}(x_1(t))f(x_1(t)) + \breve{a}_{12}(x_1(t))f(x_2(t)) + \breve{a}_{13}(x_1(t))f(x_3(t)) + \breve{b}_{11}(x_1(t))f(x_1(t-\tau(t))) + \breve{b}_{12}(x_1(t))f(x_2(t-\tau(t))) + \breve{b}_{13}(x_1(t))f(x_3(t-\tau(t))) + u_1(t),$$
$$\dot{x}_2(t) = -\breve{c}_2(x_2(t))\,x_2(t-\delta) + \breve{a}_{21}(x_2(t))f(x_1(t)) + \breve{a}_{22}(x_2(t))f(x_2(t)) + \breve{a}_{23}(x_2(t))f(x_3(t)) + \breve{b}_{21}(x_2(t))f(x_1(t-\tau(t))) + \breve{b}_{22}(x_2(t))f(x_2(t-\tau(t))) + \breve{b}_{23}(x_2(t))f(x_3(\ldots$$
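Since the snippet truncates the concrete switching values of Example 4.1, the following is only a hypothetical simulation sketch: the parameter values, threshold and input below are invented for illustration and are not those of the paper. It shows how such a memristive network with leakage and transmission delays can be integrated with a simple Euler scheme and a history buffer:

    import numpy as np

    # Hypothetical switching values, for illustration only.
    C_hat, C_check = np.diag([1.0, 1.1, 0.9]), np.diag([1.2, 1.0, 1.1])
    A_hat = np.array([[0.2, -0.1,  0.0],
                      [0.1,  0.3, -0.2],
                      [0.0,  0.1,  0.2]])
    A_check = 0.8 * A_hat
    B_hat, B_check = 0.5 * A_hat, 0.4 * A_hat

    delta, tau = 0.1, 0.5            # leakage delay and transmission delay
    dt, t_end = 0.001, 10.0          # Euler step and horizon
    f = np.tanh                      # activation function
    u = lambda t: 0.1 * np.sin(t) * np.ones(3)   # bounded input

    k_delta, k_tau = round(delta / dt), round(tau / dt)
    n_hist = max(k_delta, k_tau)
    steps = round(t_end / dt)
    x = np.zeros((n_hist + steps, 3))
    x[:n_hist + 1] = 0.2             # constant initial history phi(t)

    def switched(xi, hat, check, threshold=1.0):
        # Memristive weight: one value when |x_i| <= threshold, the other otherwise.
        return np.where(np.abs(xi) <= threshold, hat, check)

    for k in range(n_hist, n_hist + steps - 1):
        t = (k - n_hist) * dt
        xk = x[k]
        C = np.diag([switched(xk[i], C_hat[i, i], C_check[i, i]) for i in range(3)])
        A = np.vstack([switched(xk[i], A_hat[i], A_check[i]) for i in range(3)])
        B = np.vstack([switched(xk[i], B_hat[i], B_check[i]) for i in range(3)])
        dx = -C @ x[k - k_delta] + A @ f(xk) + B @ f(x[k - k_tau]) + u(t)
        x[k + 1] = xk + dt * dx

    print("final state:", x[-1])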

Conclusions

In this paper, the problem of passivity has been investigated for memristive uncertain neural networks with leakage and time-varying delays. More relaxed conditions are obtained on the basis of a novel Lyapunov–Krasovskii functional, in particular by including a triple integral, relaxing the restriction on the positive definiteness of some matrices and employing milder activation function conditions. The corresponding simulations of four numerical examples are given to demonstrate the usefulness

References (50)

  • K. Mathiyalagan et al.

    Robust mixed H∞ and passive filtering for networked Markov jump systems with impulses

    Signal Process

    (2014)
  • H. Li et al.

    Passivity criteria for continuous-time neural networks with mixed time-varying delays

    Appl Math Comput

    (2012)
  • B. Chen et al.

    Passivity analysis for uncertain neural networks with discrete and distributed time-varying delays

    Phys Lett A

    (2009)
  • R. Sakthivel et al.

    Design of state estimator for bidirectional associative memory neural networks with leakage delays

    Inf Sci

    (2015)
  • S. Zhu et al.

    Exponential passivity of neural networks with time-varying delay and uncertainty

    Phys Lett A

    (2010)
  • O.M. Kwon et al.

    New delay-dependent robust stability criterion for uncertain neural networks with time-varying delays

    Appl Math Comput

    (2008)
  • P. Balasubramaniam et al.

    State estimation for fuzzy cellular neural networks with time delay in the leakage term, discrete and unbounded distributed delays

    Comput Math Appl

    (2011)
  • S. Lakshmanan et al.

    Design of state estimator for neural networks with leakage, discrete and distributed delays

    Appl Math Comput

    (2012)
  • Z.J. Zhao et al.

    Passivity analysis of stochastic neural networks with time-varying delays and leakage delay

    Neurocomputing

    (2014)
  • J.J. Ren et al.

    State estimation for neural networks with multiple time delays

    Neurocomputing

    (2015)
  • A. Arunkumar et al.

    Robust reliable H∞ control for stochastic neural networks with randomly occurring delays

    Neurocomputing

    (2015)
  • J. Cheng et al.

    Finite-time H∞ estimation for discrete-time Markov jump systems with time-varying transition probabilities subject to average dwell time switching

    Commun Nonlinear Sci Numer Simul

    (2015)
  • A.L. Wu et al.

    Exponential synchronization of memristor-based recurrent neural networks with time delays

    Neurocomputing

    (2011)
  • M.H. Jiang et al.

    Finite-time synchronization control of a class of memristor-based recurrent neural networks

    Neural Netw

    (2015)
  • S. Wen et al.

    Passivity analysis of memristor-based recurrent neural networks with time-varying delays

    J Frankl Inst

    (2013)

    This work was supported by National Natural Science Foundation of China (61273015).
