Abstract
Kernel Principal Component Analysis (KPCA) has proven to be a versatile tool for unsupervised learning; however, it comes at a high computational cost, since its solutions are dense expansions in terms of kernel functions. We overcome this problem by proposing a new class of feature extractors that employ ℓ1 norms in coefficient space instead of the norm of the Reproducing Kernel Hilbert Space in which KPCA was originally formulated. Moreover, the modified setting allows us to efficiently extract features that maximize criteria other than the variance, in a way similar to projection pursuit.
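To make the idea concrete, the following is a minimal numerical sketch (in Python/NumPy) of how such an ℓ1-constrained feature extractor can behave; it is an illustrative assumption, not the algorithm from the paper. Since the variance of a projection is a convex quadratic in the expansion coefficients, its maximum over the ℓ1 unit ball is attained at a vertex, i.e. at a single kernel function; greedily selecting one kernel function at a time and deflating the centered kernel matrix therefore yields sparse feature extractors. The function names and the deflation scheme below are hypothetical.

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gaussian RBF kernel matrix k(x, y) = exp(-gamma * ||x - y||^2)
        d = (np.sum(X**2, axis=1)[:, None]
             + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T)
        return np.exp(-gamma * d)

    def sparse_kernel_features(X, n_features=3, gamma=1.0):
        # Greedy sketch of an l1-constrained variant of KPCA:
        # each extracted direction is a single centered kernel function.
        K = rbf_kernel(X, X, gamma)
        n = K.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n   # centering in feature space
        Kc = H @ K @ H
        chosen = []
        for _ in range(n_features):
            # With ||alpha||_1 <= 1 the variance of the projection Kc @ alpha
            # is maximized at a vertex alpha = e_j, whose variance is
            # proportional to ||Kc[:, j]||^2.
            scores = np.sum(Kc**2, axis=0)
            j = int(np.argmax(scores))
            chosen.append(j)
            # Deflate: remove the component along the chosen direction, so
            # later features are orthogonal to it in feature space.
            v = Kc[:, j] / np.sqrt(max(Kc[j, j], 1e-12))
            Kc = Kc - np.outer(v, v)
        return chosen

    # Example: three sparse feature extractors for 100 random 2-d points.
    X = np.random.RandomState(0).randn(100, 2)
    print(sparse_kernel_features(X, n_features=3))

In contrast to KPCA, where each principal component is a dense combination of all n kernel functions, each feature in this sketch costs a single kernel evaluation per test point.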
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Smola, A.J., Mangasarian, O.L., Schölkopf, B. (2002). Sparse Kernel Feature Analysis. In: Gaul, W., Ritter, G. (eds) Classification, Automation, and New Media. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-55991-4_18
DOI: https://doi.org/10.1007/978-3-642-55991-4_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43233-3
Online ISBN: 978-3-642-55991-4
eBook Packages: Springer Book Archive