
Sparse Kernel Feature Analysis

  • Conference paper
Classification, Automation, and New Media

Abstract

Kernel Principal Component Analysis (KPCA) has proven to be a versatile tool for unsupervised learning, but it comes at a high computational cost because the extracted features are dense expansions in terms of kernel functions. We overcome this problem by proposing a new class of feature extractors that employ ℓ1 norms in coefficient space rather than the norm of the Reproducing Kernel Hilbert Space in which KPCA was originally formulated. Moreover, the modified setting allows us to efficiently extract features that maximize criteria other than the variance, in a manner similar to projection pursuit.
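As a point of reference for the density problem the abstract describes, standard KPCA can be sketched in a few lines of NumPy. Every extracted feature below is an expansion with one coefficient per training point, which is exactly the dense representation the paper's ℓ1 approach seeks to sparsify. The RBF kernel, the parameter names, and the centering details are illustrative textbook choices, not taken from the paper itself:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Squared Euclidean distances via broadcasting, then the Gaussian kernel.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Center the kernel matrix, i.e. subtract the mean in feature space.
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition of the centered kernel matrix (eigh: ascending order).
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Each column of alphas holds n expansion coefficients -- a dense expansion.
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    # Projections of the training points onto the principal directions.
    return Kc @ alphas
```

Note that `alphas` has a nonzero coefficient for essentially every training point, so evaluating a feature on a new point requires kernel evaluations against the whole training set; replacing the RKHS norm with an ℓ1 norm in coefficient space drives most of these coefficients to zero.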





Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Smola, A.J., Mangasarian, O.L., Schölkopf, B. (2002). Sparse Kernel Feature Analysis. In: Gaul, W., Ritter, G. (eds) Classification, Automation, and New Media. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-55991-4_18


  • DOI: https://doi.org/10.1007/978-3-642-55991-4_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43233-3

  • Online ISBN: 978-3-642-55991-4

  • eBook Packages: Springer Book Archive
