Supervised Classifier Combination through Generalized Additive Multi-model

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1857))

Abstract

In the framework of supervised classification and prediction modeling, this paper introduces a methodology based on a general formulation of combined model integration that improves the fit to the data. Unlike Generalized Additive Models (GAM), our approach combines not only estimates derived from smoothing functions, but also those provided by parametric or nonparametric models. Because it rests on multiple classifier combination, we call this general class of models Generalized Additive Multi-Models (GAM-M). The estimation procedure iterates an inner algorithm, a variant of the backfitting algorithm, within an outer algorithm, a standard local scoring algorithm, until convergence. The performance of the GAM-M approach relative to alternative approaches is shown in applications to real data sets. The stability of the model estimates is evaluated by means of bootstrap and cross-validation. As a result, our methodology improves the goodness of fit of the model to the data while also providing stable estimates.
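The inner loop described in the abstract, backfitting an additive combination of component models against partial residuals, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the `fit_component` interface and all names are hypothetical, each component is here a simple least-squares line (in GAM-M it could be any smoother or parametric/nonparametric model), and the outer local scoring loop used for non-Gaussian responses is omitted.

```python
import numpy as np

def backfit(X, y, fit_component, n_iter=50, tol=1e-6):
    """Backfitting for an additive model y ~ alpha + sum_j f_j(x_j).

    fit_component(x, r) returns the fitted values of one component
    model trained on the partial residuals r. In the multi-model
    setting, each component may be a different kind of estimator.
    """
    n, p = X.shape
    alpha = y.mean()                 # intercept: overall mean response
    f = np.zeros((n, p))             # current fitted component values
    for _ in range(n_iter):
        f_old = f.copy()
        for j in range(p):
            # Partial residuals: remove all components except the j-th.
            r = y - alpha - f.sum(axis=1) + f[:, j]
            f[:, j] = fit_component(X[:, j], r)
            f[:, j] -= f[:, j].mean()  # center for identifiability
        if np.abs(f - f_old).max() < tol:
            break                      # converged
    return alpha, f

def linear_fit(x, r):
    """A simple parametric component: least-squares line on (x, r)."""
    A = np.vstack([np.ones_like(x), x]).T
    beta, *_ = np.linalg.lstsq(A, r, rcond=None)
    return A @ beta

# Synthetic example: additive linear signal plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 1.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + 0.1 * rng.normal(size=200)
alpha, f = backfit(X, y, linear_fit)
resid = y - alpha - f.sum(axis=1)
```

With a Gaussian response this inner loop alone suffices; for classification, a local scoring (IRLS-style) outer loop would repeatedly form an adjusted dependent variable and weights, calling the backfitting step at each outer iteration.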




Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Conversano, C., Siciliano, R., Mola, F. (2000). Supervised Classifier Combination through Generalized Additive Multi-model. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_16

  • DOI: https://doi.org/10.1007/3-540-45014-9_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67704-8

  • Online ISBN: 978-3-540-45014-6

  • eBook Packages: Springer Book Archive
