Evolving fuzzy optimally pruned extreme learning machine for regression problems

  • Original Paper
  • Published in: Evolving Systems (2010)

Abstract

This paper proposes an approach to the identification of evolving fuzzy Takagi–Sugeno systems based on the optimally pruned extreme learning machine (OP-ELM) methodology. First, we describe ELM, a simple yet accurate learning algorithm for training single-hidden-layer feed-forward artificial neural networks with random hidden neurons. We then describe the OP-ELM methodology for building ELM models in a robust and simplified manner suitable for evolving approaches. Based on the ELM method and the OP-ELM methodology, we propose an identification method for self-developing or evolving neuro-fuzzy systems applicable to regression problems. This method, evolving fuzzy optimally pruned extreme learning machine (eF-OP-ELM), follows a random-projection-based approach to extracting evolving fuzzy rulebases. In this approach, systems not only evolve but also have their structure defined on the basis of randomly generated fuzzy basis functions. A comparative analysis of eF-OP-ELM is performed on a diverse collection of benchmark datasets against well-known evolving neuro-fuzzy methods, namely eTS and DENFIS. Results show that the proposed method yields compact rulebases and is robust and competitive in terms of accuracy.
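
To make the ELM building block concrete, the following is a minimal sketch, not the authors' implementation, of training a single-hidden-layer feed-forward network with random hidden neurons for regression: the input weights and biases are drawn at random and never trained, and the output weights are obtained by a linear least-squares fit. Function names such as elm_fit and elm_predict, the sigmoid activation and the NumPy-based solver are illustrative choices; OP-ELM additionally ranks and prunes the hidden neurons, which is not shown here.

```python
import numpy as np

def elm_fit(X, y, n_hidden=100, seed=0):
    """Basic ELM regressor: random hidden layer, least-squares output weights.

    Illustrative sketch only; OP-ELM (and eF-OP-ELM) add neuron ranking
    and pruning on top of this step.
    """
    rng = np.random.default_rng(seed)
    # Random input weights and biases, drawn once and never trained.
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    # Hidden-layer output matrix H (sigmoid activation).
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Output weights beta: linear least-squares solution of H @ beta = y.
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

With n_hidden=100 (the default initial number of neurons in the OP-ELM Toolbox, see Note 1 below), this corresponds to the initial, unpruned model from which OP-ELM then selects neurons.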

Notes

  1. By default the initial number of neurons used in the OP-ELM Toolbox (Lendasse et al. 2010; Miche et al. 2008) is 100.

  2. The series is available online from http://www.ngdc.noaa.gov/stp/SOLAR/. The International Sunspot Number is produced by the Solar Influence Data Analysis Center (SIDC) at the Royal Observatory of Belgium (Van der Linden and the SIDC Team 2008).

References

  • Achlioptas D (2003) Database-friendly random projections: Johnson-Lindenstrauss with binary coins. J Comput Syst Sci 66(4):671–687

  • Angelov P, Filev D (2004) Flexible models with evolving structure. Int J Intell Syst 19(4):327–340

  • Angelov P, Filev D, Kasabov N (2008) Evolving fuzzy systems—preface to the special section. IEEE Trans Fuzzy Syst 16(6):1390–1392

  • Angelov P, Filev D, Kasabov N (eds) (2010) Evolving intelligent systems: methodology and applications. Wiley/IEEE Press

  • Angelov PP, Filev DP (2004) An approach to online identification of Takagi–Sugeno fuzzy models. IEEE Trans Syst Man Cybern B 34(1):484–498

  • Angelov PP, Filev DP (2005) Simpl_eTS: a simplified method for learning evolving Takagi–Sugeno fuzzy models. In: Proceedings of the IEEE International Conference on fuzzy systems, Reno, NV, USA, pp 1068–1073

  • Asuncion A, Newman DJ (2010) UCI machine learning repository. University of California, Irvine, Center for Machine Learning and Intelligent Systems. http://archive.ics.uci.edu/ml/

  • Birattari M, Bontempi G, Bersini H (1999) Lazy learning meets the recursive least squares algorithm. In: Advances in Neural Information Processing Systems (NIPS), vol 11. MIT Press, Cambridge, pp 375–381

  • Dourado A, Aires L, Victor J (2009) eFSLab: Developing evolving fuzzy systems from data in a friendly environment. In: Proc 10th Eur Control Conf, Prague, Czech Republic, pp 922–927

  • Efron B, Hastie T, Johnstone I, Tibshirani R (2004) Least angle regression. Ann Stat 32(2):407–499

  • ESTSP07 (2010) ESTSP: European Symposium on Time Series Prediction. http://www.estsp.org

  • Feng G, Huang GB, Lin Q, Gay R (2009) Error minimized extreme learning machine with growth of hidden nodes and incremental learning. IEEE Trans Neural Netw 20(8):1352–1357

  • Fradkin D, Madigan D (2003) Experiments with random projections for machine learning. In: Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining, New York, NY, USA, pp 517–522

  • Haykin SS (1998) Neural networks: a comprehensive foundation, 2nd edn. Prentice-Hall, Englewood Cliffs. ISBN: 978-0132733502

  • Higham NJ (2002) Accuracy and stability of numerical algorithms, 2nd edn. Society for Industrial and Applied Mathematics, Philadelphia

  • Huang GB (2008) Reply to “comment on the extreme learning machine”. IEEE Trans Neural Netw 19(8):1495–1496

  • Huang GB, Chen L, Siew CK (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17(4):879–892

  • Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1–3):489–501

  • Hurrell JW, Deser C (2009) North Atlantic climate variability: the role of the North Atlantic Oscillation. J Mar Syst 78(1):28–41

  • Hyndman RJ (2010) Time series data library. http://www.robjhyndman.com/TSDL

  • Internet2Observatory (2008) The Internet2 Observatory. http://www.internet2.edu/observatory/

  • Kasabov N (2007) Evolving connectionist systems: the knowledge engineering approach, 2nd edn. Springer, Berlin

  • Kasabov NK, Song Q (2002) DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction. IEEE Trans Fuzzy Syst 10(2):144–154

  • Leite D, Costa P, Gomide F (2009) Interval-based evolving modeling. In: IEEE workshop on evolving and self-developing intelligent systems, Nashville, TN, USA, pp 1–8

  • Lendasse A, Sorjamaa A, Miche Y (2010) The OP-ELM toolbox. Time Series Prediction and Chemoinformatics Group. Department of Information and Computer Science. Aalto University School of Science and Technology. http://www.cis.hut.fi/projects/tsp/index.php?page=opelm

  • Liang NY, Huang GB, Saratchandran P, Sundararajan N (2006) A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Trans Neural Netw 17(6):1411–1423

  • Lughofer ED (2008) FLEXFIS: a robust incremental learning approach for evolving Takagi–Sugeno fuzzy models. IEEE Trans Fuzzy Syst 16(6):1393–1410

  • Mackey MC, Glass L (1977) Oscillations and chaos in physiological control systems. Science 197(4300):287–289

  • Miche Y, Sorjamaa A, Lendasse A (2008) OP-ELM: Theory, experiments and a toolbox. In: Proceedings of the international conference on artificial neural networks. Lecture notes in computer science, vol 5163, Prague, Czech Republic, pp 145–154

  • Miche Y, Schrauwen B, Lendasse A (2010) Machine learning techniques based on random projections. In: 18th European symposium on artificial neural networks, computational intelligence and machine learning, pp 295–302

  • Miche Y, Sorjamaa A, Bas P, Simula O, Jutten C, Lendasse A (2010) OP-ELM: optimally pruned extreme learning machine. IEEE Trans Neural Netw 21(1):158–162

  • Mikut R, Jakel J, Groll L (2005) Interpretability issues in data-based learning of fuzzy systems. Fuzzy Sets Syst 150(2):179–197

  • Montesino Pouzols F, Lendasse A, Barriga A (2008) Fuzzy inference based autoregressors for time series prediction using nonparametric residual variance estimation. In: Proceedings of the IEEE international conference on fuzzy systems, Hong Kong, China, pp 613–618

  • Montesino Pouzols F, Lendasse A, Barriga A (2008) xftsp: a tool for time series prediction by means of fuzzy inference systems. In: Proceedings of the IEEE international conference on intelligent systems, Varna, Bulgaria, pp 2-2–2-7

  • Montesino Pouzols F, Lendasse A, Barriga A (2010) Autoregressive time series prediction by means of fuzzy inference systems using nonparametric residual variance estimation. Fuzzy Sets Syst 161(4):471–497

  • Moreno-Velo FJ, Baturone I, Barriga A, Sánchez-Solano S (2007) Automatic tuning of complex fuzzy systems with Xfuzzy. Fuzzy Sets Syst 158(18):2026–2038

  • Myers RH (2000) Classical and modern regression with applications, 2nd edn. Duxbury Press, North Scituate

  • Pedrycz W (2005) Knowledge-based clustering: from data to information granules. Wiley, New York

  • Platt J (1991) A resource-allocating network for function interpolation. Neural Comput 3(2):213–225

  • Poggio T, Girosi F (1989) A theory of networks for approximation and learning, vol 1140. MIT Press, Cambridge

  • Ramos JV, Dourado A (2006) Pruning for interpretability of large spanned eTS. In: Proceedings of the 2006 international symposium on evolving fuzzy systems, EFS’06, IEEE Press, Ambelside, Lake District, UK, pp 55–60

  • Rong HJ, Sundararajan N, Huang GB, Saratchandran P (2006) Sequential adaptive fuzzy inference system (SAFIS) for nonlinear system identification and prediction. Fuzzy Sets Syst 157(9):1260–1275

  • Rong HJ, Huang GB, Sundararajan N, Saratchandran P (2009) Online sequential fuzzy extreme learning machine for function approximation and classification problems. IEEE Trans Syst Man Cybern B 39(4):1067–1072

  • SantaFeLaser (2010) The Santa Fe Time Series Competition Data. Data Set A: Laser generated data. http://www-psych.stanford.edu/~andreas/Time-Series/SantaFe.html

  • Schölkopf B, Smola AJ (2002) Learning with kernels. Support vector machines, regularization, optimization, and beyond. MIT Press, Cambridge. ISBN: 0262194759

  • Similä T, Tikka J (2005) Multiresponse sparse regression with application to multidimensional scaling. In: Proceedings of the international conference on artificial neural networks, Warsaw, Poland, vol 3697, pp 97–102

  • Sorjamaa A, Miche Y, Weiss R, Lendasse A (2008) Long-term prediction of time series using NNE-based projection and OP-ELM. In: Proceedings of the international joint conference on neural networks, Hong Kong, China, pp 2675–2681

  • StatLib (2010) Department of Statistics, Carnegie Mellon University. http://lib.stat.cmu.edu/datasets/

  • Suykens JAK, Van Gestel T, De Brabanter J, De Moor B, Vandewalle J (2002) Least squares support vector machines. World Scientific, Singapore

  • Tsonis AA, Swanson K, Kravtsov S (2007) A new dynamical mechanism for major climate shifts. Geophys Res Lett 34:L13705, 5 pp

  • Van der Linden RAM, the SIDC Team (2008) Online catalogue of the Sunspot Index. RWC Belgium, World Data Center for the Sunspot Index, Royal Observatory of Belgium, years 1748–2007, http://sidc.oma.be/html/sunspot.html

  • Weigend A, Gershenfeld N (1994) Time series prediction: forecasting the future and understanding the past. Addison-Wesley, Reading

  • Wu Z, Huang NE, Long SR, Peng CK (2007) On the trend, detrending, and variability of nonlinear and nonstationary time series. Proc Nat Acad Sci 104(38):14889–14894

  • Yager R (2008) Measures of specificity over continuous spaces under similarity relations. Fuzzy Sets Syst 159(17):2193–2210

  • Zadeh LA (1997) Toward a theory of fuzzy information granulation and its centrality in human reasoning and fuzzy logic. Fuzzy Sets Syst 90(2):111–127

  • Zeng XJ, Singh MG (1995) Approximation theory of fuzzy systems—MIMO case. IEEE Trans Fuzzy Syst 3(2):219–235

  • Zhou SM, Gan JQ (2008) Low-level interpretability and high-level interpretability: a unified view of data-driven interpretable fuzzy system modelling. Fuzzy Sets Syst 159(23):3091–3131

Acknowledgments

The respective authors of the software tools used in this work (see references and links in previous sections) are acknowledged for making their software publicly available. FMP is supported by a Marie Curie Intra-European Fellowship for Career Development (grant agreement PIEF-GA-2009-237450) within the European Community’s Seventh Framework Programme (FP7/2007–2013).

Author information

Corresponding author

Correspondence to Federico Montesino Pouzols.

Appendix: Incremental initialization of eF-OP-ELM

This appendix describes an incremental procedure for the initialization of eF-OP-ELM models. It was kept out of the previous sections for the sake of clarity, and it may not be required in many practical setups, since the number of samples needed for batch initialization (as explained in Sect. 3) is small.

The incremental algorithm presented here starts with the first data sample and continues until the minimum number of initialization samples has been observed. More specifically, the initialization continues until the number of observations reaches M, the maximum number of antecedent parameters generated for H_0(j). The procedure consists of the following steps, which can replace steps 1, 2 and 3 in Algorithm 1:

(Algorithm steps shown as an image, "figure b", in the original article: for each incoming sample, a new randomly parameterized fuzzy basis function is generated and added to H_0(j), until M samples have been observed.)

This way, during step 2, a total of j fuzzy basis functions are available once j input-output samples have been observed. That is, H_0(j) has j rows (observations) and j columns (fuzzy basis functions), which are then subject to ranking and selection in the subsequent steps. In step 2.1, the centers and radii are generated from a uniform random distribution following the same scheme as described in Sect. 3.
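
As a rough illustration of this incremental build-up, the sketch below (an assumption-laden illustration, not the authors' code) adds one randomly parameterized Gaussian fuzzy basis function per observed sample, so that after j samples the matrix H_0(j) has j rows and j columns. The uniform sampling ranges for the centers and radii, the Gaussian membership functions and the row normalization are illustrative stand-ins for the exact scheme of Sect. 3.

```python
import numpy as np

def incremental_init(stream, M, seed=0):
    """Sketch of the incremental initialization of H_0(j).

    One randomly generated fuzzy basis function is added per observed
    sample, so H has j rows (observations) and j columns (basis functions)
    after j samples; the loop stops once M samples have been observed.
    Gaussian memberships and the uniform ranges are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    samples, centers, radii = [], [], []
    H = np.empty((0, 0))
    for x in stream:
        x = np.asarray(x, dtype=float)
        samples.append(x)
        # Step 2.1 (sketch): draw a random center and radius for the
        # fuzzy basis function associated with this sample.
        centers.append(rng.uniform(0.0, 1.0, size=x.shape[0]))
        radii.append(rng.uniform(0.1, 1.0))
        # Step 2 (sketch): after j samples, H is a j x j matrix of
        # row-normalized Gaussian activations (fuzzy basis functions).
        S, C, R = np.vstack(samples), np.vstack(centers), np.asarray(radii)
        sq_dist = ((S[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
        H = np.exp(-sq_dist / (2.0 * R ** 2))
        H /= H.sum(axis=1, keepdims=True)
        if len(samples) >= M:
            break
    # H, together with the centers and radii, is then handed over to the
    # ranking/selection and linear solution steps of the main algorithm.
    return H, np.vstack(centers), np.asarray(radii)
```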

Cite this article

Pouzols, F.M., Lendasse, A. Evolving fuzzy optimally pruned extreme learning machine for regression problems. Evolving Systems 1, 43–58 (2010). https://doi.org/10.1007/s12530-010-9005-y
