
Evolving Takagi–Sugeno model based on online Gustafson-Kessel algorithm and kernel recursive least square method

Original Paper

Abstract

In this paper, we introduce an evolving system that uses sparse weighted kernel least squares models as local models and an online Gustafson-Kessel clustering algorithm for structure identification. The proposed online clustering algorithm forms elliptical clusters with arbitrary orientation, which yields fewer but more flexibly shaped clusters than spherical ones. Moreover, the clustering algorithm determines the number of required clusters by adding new clusters over time and reduces model redundancy by merging similar clusters. Additionally, we propose a weighted kernel recursive least squares method with a new sparsification procedure based on the instantaneous prediction error, together with an adaptive gradient-based rule for tuning the kernel size. The sparsification procedure and the adaptive kernel size significantly improve the performance of kernel recursive least squares. To illustrate the methodology, we apply the introduced model to online identification of a time-varying nonlinear system. Finally, to show the superiority of our approach over several known online approaches, two different time series are considered: Mackey–Glass as a benchmark and electrical load as a real-world time series.
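For intuition, here is a minimal Python sketch of the resulting prediction step: each fuzzy rule carries a local kernel model over its own dictionary, and the rule outputs are blended by the fuzzy memberships. The Gaussian kernel and the normalized weighting are assumptions made for illustration; the rule structure, dictionaries, and memberships follow the notation defined below, and nothing here reproduces the paper's exact update equations.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    """Assumed Gaussian kernel kappa(x, c) with kernel size sigma."""
    return np.exp(-np.linalg.norm(x - c) ** 2 / (2.0 * sigma ** 2))

def ts_predict(x, rules):
    """Takagi-Sugeno style output: membership-weighted blend of the
    local kernel models y_j = a_j^T g_j(x), one model per rule/cluster.

    Each rule is a dict with keys:
      'centers' : stored dictionary points c_j^1, ..., c_j^N
      'a'       : consequence parameter vector a_j
      'sigma'   : kernel size sigma_j
      'psi'     : fuzzy membership Psi_j(x) evaluated for this x
    """
    num, den = 0.0, 0.0
    for r in rules:
        g = np.array([gaussian_kernel(x, c, r['sigma']) for c in r['centers']])
        y_j = float(np.dot(r['a'], g))   # local model output of rule j
        num += r['psi'] * y_j            # membership-weighted contribution
        den += r['psi']
    return num / den if den > 0 else 0.0
```

If the memberships Ψ_j are already normalized, the denominator reduces to one and the output is simply the weighted sum of the local outputs.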


Abbreviations

N: Dictionary capacity
n: Number of features/regressors
M: Number of clusters/fuzzy rules
m: Fuzzification degree
\(c_{j}^{i}\): ith stored data point in the jth local dictionary
\(a_{j}^{i}\): ith consequence parameter of the jth dictionary
κ(·,·): Kernel function
x, y: Input–output pair
Q: Number of input–output pairs
\(\hat{y}_{j}\): Output estimated by the jth fuzzy rule
\(\hat{y}\): Total estimated output
\(\varPsi_{j}(\mathbf{x})\): Fuzzy membership of x in the jth fuzzy rule
\(\mathbf{x}^{k}\): The kth input vector
\(y^{k}\): The kth output value
\(\mu_{j}^{k}\): Cluster center of the jth cluster at step k
\(A_{j}^{k}\): Norm-inducing matrix of the jth cluster at step k
\(F_{j}^{k}\): Fuzzy covariance matrix of the jth cluster at step k
\(\rho_{j}\): Cluster volume parameter of the jth cluster
\(N_{j}^{k}\): Membership sum of the jth cluster at step k
\(S_{j}(\mathbf{x}^{k})\): Similarity value of the kth input and the jth cluster
\(\eta_{1}\): Threshold value for creating a new cluster
\(\mathrm{sim}_{pq}\): Similarity value of the pth and qth clusters
\(\eta_{2}\): Threshold value for merging two clusters
λ: Regularization parameter
ω(k): Weight vector in feature space
\(\varvec{\varphi}(k)\): Feature vector of \(\mathbf{x}^{k}\)
G(k): Kernel/Gram matrix
Y(k): Output vector
U(k): Weight/membership matrix
Φ(k): Feature matrix
a(k): Consequence parameter at step k
g(k): Kernel vector
\({\mathcal{D}}_{j}\): jth local dictionary
Q, \(z_{j}\), \(r_{j}\): Slack variables
\(e_{j}^{k}\): Prediction error of the jth fuzzy rule at step k
\({\tilde{\mathbf{Q}}}_{j}(k)\): Reduced matrix of Q(k)
\(C_{j}(k)\): Set of all input data stored in \({\mathcal{D}}_{j}\) at step k
\(Y_{j}(k)\): Set of all output data stored in \({\mathcal{D}}_{j}\) at step k
\(\sigma_{j}^{k}\): Kernel size of the jth fuzzy rule at step k
\(P_{j}^{k}\): Hessian matrix
\(J_{j}^{k}\): Gradient vector


Author information

Correspondence to Soroosh Shafieezadeh-Abadeh.

Appendix A

According to the recursive Gauss–Newton algorithm, \({\mathbf{J}}_{j}^{k + 1} \, = \, - \frac{{\partial e^{k + 1} }}{{\partial \sigma_{j}^{k} }}\) can be written as follows:

$${\mathbf{J}}_{j}^{k + 1} \, = \,\frac{{\partial \hat{y}_{j}^{k + 1} }}{{\partial \sigma_{j}^{k} }}\varPsi_{j} \left( {{\mathbf{x}}^{k + 1} } \right)\, = \,{\mathbf{a}}_{j}^{T} \left( k \right)\frac{{\partial {\mathbf{g}}_{j} \left( {k + 1} \right)}}{{\partial \sigma_{j}^{k} }}\varPsi_{j} \left( {{\mathbf{x}}^{k + 1} } \right)$$
(A.1)

where \(\frac{{\partial {\mathbf{g}}_{j} \left( {k + 1} \right)}}{{\partial \sigma_{j}^{k} }}\) is given by:

$$\frac{{\partial {\mathbf{g}}_{j} \left( {k + 1} \right)}}{{\partial \sigma_{j}^{k} }}\, = \,\left[ {\begin{array}{*{20}c} {\frac{{\left| {\left| {{\mathbf{x}}^{k + 1} - {\mathbf{c}}_{j}^{1} } \right|} \right|^{2} }}{{\left( {\sigma_{j}^{k} } \right)^{3} }}\kappa \left( {{\mathbf{x}}^{k + 1} ,{\mathbf{c}}_{j}^{1} } \right)} \\ \vdots \\ {\frac{{\left| {\left| {{\mathbf{x}}^{k + 1} - {\mathbf{c}}_{j}^{N} } \right|} \right|^{2} }}{{\left( {\sigma_{j}^{k} } \right)^{3} }}\kappa \left( {{\mathbf{x}}^{k + 1} ,{\mathbf{c}}_{j}^{N} } \right)} \\ \end{array} } \right]$$
(A.2)

Combining (A.1) and (A.2) yields (47).
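For concreteness, the following is a minimal Python sketch of (A.2) and (A.1), assuming the Gaussian kernel \(\kappa(\mathbf{x},\mathbf{c}) = \exp(-\left|\left|\mathbf{x}-\mathbf{c}\right|\right|^{2}/2\sigma^{2})\), for which \(\partial\kappa/\partial\sigma = \left|\left|\mathbf{x}-\mathbf{c}\right|\right|^{2}\kappa/\sigma^{3}\). The function names and the argument interface are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    """Assumed Gaussian kernel kappa(x, c) with kernel size sigma."""
    return np.exp(-np.linalg.norm(x - c) ** 2 / (2.0 * sigma ** 2))

def kernel_size_gradient(x_new, centers, a_j, sigma_j, psi_j):
    """Evaluate (A.2) and then (A.1) for the jth rule.

    x_new   : new input x^{k+1}
    centers : stored dictionary points c_j^1, ..., c_j^N
    a_j     : consequence parameter vector a_j(k)
    sigma_j : current kernel size sigma_j^k
    psi_j   : fuzzy membership Psi_j(x^{k+1})
    """
    dists_sq = np.array([np.linalg.norm(x_new - c) ** 2 for c in centers])
    kappas = np.array([gaussian_kernel(x_new, c, sigma_j) for c in centers])
    dg_dsigma = (dists_sq / sigma_j ** 3) * kappas      # Eq. (A.2), element-wise
    # Eq. (A.1): J_j^{k+1} = a_j(k)^T * dg/dsigma * Psi_j(x^{k+1})
    return float(np.dot(a_j, dg_dsigma)) * psi_j
```

This gradient, together with the Hessian \(P_{j}^{k}\), would then drive the recursive Gauss–Newton update of the kernel size \(\sigma_{j}^{k}\) in (47).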


Cite this article

Shafieezadeh-Abadeh, S., Kalhor, A. Evolving Takagi–Sugeno model based on online Gustafson-Kessel algorithm and kernel recursive least square method. Evolving Systems 7, 1–14 (2016). https://doi.org/10.1007/s12530-015-9129-1
