
Model Selection in Kernel Methods Based on a Spectral Analysis of Label Information

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4174)

Abstract

We propose a novel method for addressing the model selection problem in the context of kernel methods. In contrast to existing methods, which rely on hold-out testing or try to compensate for the optimism of the generalization error, our method is based on a structural analysis of the label information using the eigenstructure of the kernel matrix. In this setting, the label vector can be transformed into a representation in which the smooth information is easily discernible from the noise. This makes it possible to estimate a cut-off dimension such that the leading coefficients in that representation contain the learnable information while the noise is discarded. Based on this cut-off dimension, the regularization parameter for kernel ridge regression is estimated.
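The abstract's pipeline can be sketched in a few lines of numpy: diagonalize the kernel matrix, project the labels onto the eigenbasis, estimate where the spectral coefficients drop to noise level, and pick a ridge parameter accordingly. Note that the cut-off estimator and the cut-off-to-lambda mapping below are illustrative assumptions for this sketch, not the authors' exact algorithm.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian RBF kernel matrix for rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(n)  # smooth target + noise

# Eigendecomposition of the kernel matrix, eigenvalues in descending order.
K = rbf_kernel(X)
eigvals, eigvecs = np.linalg.eigh(K)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Spectral coefficients of the label vector: s_i = u_i^T y. For a smooth
# target, the leading coefficients carry the signal; the tail looks like noise.
s = eigvecs.T @ y

# Illustrative cut-off estimate (an assumption of this sketch): estimate the
# noise scale from the tail half of the coefficients and count how many
# coefficients clearly exceed it.
noise_scale = np.sqrt(np.mean(s[n // 2:] ** 2))
cutoff = max(1, int(np.sum(np.abs(s) > 2.0 * noise_scale)))

# Heuristic lambda choice (again an assumption): damp the eigendirections
# past the cut-off by setting lambda near the eigenvalue at the cut-off.
lam = max(eigvals[min(cutoff, n - 1)], 1e-12)

# Kernel ridge regression with the selected regularization parameter.
alpha = np.linalg.solve(K + lam * np.eye(n), y)
y_hat = K @ alpha
mse = np.mean((y_hat - y) ** 2)
```

With a smooth target like this, the projected labels `s` show a handful of large leading coefficients followed by a flat noise floor, which is the structure the cut-off estimate exploits.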





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Braun, M.L., Lange, T., Buhmann, J.M. (2006). Model Selection in Kernel Methods Based on a Spectral Analysis of Label Information. In: Franke, K., Müller, K.-R., Nickolay, B., Schäfer, R. (eds) Pattern Recognition. DAGM 2006. Lecture Notes in Computer Science, vol 4174. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11861898_35


  • DOI: https://doi.org/10.1007/11861898_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44412-1

  • Online ISBN: 978-3-540-44414-5

  • eBook Packages: Computer Science, Computer Science (R0)
