Correlation Integral Decomposition for Classification

  • Conference paper

Artificial Neural Networks - ICANN 2008 (ICANN 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5164)


Abstract

In this paper we show that the correlation integral can be decomposed into functions, each related to a particular point of the data space. For these functions, one can use a polynomial approximation similar to the one used for the correlation integral. The essential difference is that the value of the exponent, which would correspond to the correlation dimension, varies with the position of the point in question. Moreover, we show that the multiplicative constant represents a probability density estimate at that point. This finding is used to construct a classifier. Tests on data sets from the UCI Machine Learning Repository show that this classifier can be very effective.
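
The full paper is behind the access wall, but the idea described in the abstract can be illustrated with a short sketch. The snippet below is an illustration under stated assumptions, not the authors' algorithm: for a query point it fits a power law C_x(r) ≈ c·r^q to the empirical distribution of distances from that point to the training points of each class (a log-log least-squares fit), treats the fitted exponent q as the local analogue of the correlation dimension, and uses the fitted constant c as a rough per-class density score for classification. The function names, the choice to fit only the k nearest distances, and the use of Euclidean distance are all assumptions made for the example.

    import numpy as np

    def local_powerlaw_fit(dists, k=20):
        # Sort the distances from the query point and keep the k nearest.
        r_all = np.sort(np.asarray(dists, dtype=float))
        k = min(k, len(r_all))
        r = r_all[:k]
        # Empirical correlation-integral values C_x(r_i) = rank(r_i) / N.
        C = np.arange(1, k + 1) / float(len(r_all))
        mask = r > 0                  # drop zero distances before taking logs
        # Least-squares fit of log C = q * log r + log c.
        q, log_c = np.polyfit(np.log(r[mask]), np.log(C[mask]), 1)
        return np.exp(log_c), q       # (constant c, local exponent q)

    def classify(x, X_train, y_train, k=20):
        # Score each class by its fitted constant c (a crude local density
        # estimate, following the abstract) and return the best-scoring class.
        scores = {}
        for label in np.unique(y_train):
            d = np.linalg.norm(X_train[y_train == label] - x, axis=1)
            c, _ = local_powerlaw_fit(d, k=k)
            scores[label] = c
        return max(scores, key=scores.get)

On an array X_train of shape (N, d) with labels y_train, classify(x, X_train, y_train) returns the predicted label for a query point x; the parameter k controls how many of the nearest distances enter the log-log fit.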


References

  1. Camastra, F.: Data dimensionality estimation methods: a survey. Pattern Recognition 36(12), 2945–2954 (2003)

  2. Camastra, F., Vinciarelli, A.: Intrinsic Dimension Estimation of Data: An Approach based on Grassberger-Procaccia’s Algorithm. Neural Processing Letters 14(1), 27–34 (2001)

  3. Cover, T.M., Hart, P.E.: Nearest Neighbor Pattern Classification. IEEE Transactions on Information Theory IT-13(1), 21–27 (1967)

  4. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern classification, 2nd edn. John Wiley and Sons, Inc., New York (2000)

  5. Dvorak, I., Klaschka, J.: Modification of the Grassberger-Procaccia algorithm for estimating the correlation exponent of chaotic systems with high embedding dimension. Physics Letters A 145(5), 225–231 (1990)

  6. Gama, J.: Iterative Bayes. Theoretical Computer Science 292, 417–430 (2003)

  7. Grassberger, P., Procaccia, I.: Measuring the strangeness of strange attractors. Physica 9D, 189–208 (1983)

  8. Guerrero, A., Smith, L.A.: Towards coherent estimation of correlation dimension. Physics Letters A 318, 373–379 (2003)

  9. Lev, N.: Hausdorff dimension. Student Seminar, Tel-Aviv University (2006), www.math.tau.ac.il/~levnir/files/hausdorff.pdf

  10. Merz, C. J., Murphy, P. M., Aha, D. W.: UCI Repository of Machine Learning Databases. Dept. of Information and Computer Science, Univ. of California, Irvine (1997), http://www.ics.uci.edu/~mlearn/MLSummary.html

  11. Osborne, A.R., Provenzale, A.: Finite correlation dimension for stochastic systems with power-law spectra. Physica D 35, 357–381 (1989)

  12. Paredes, R., Vidal, E.: Learning Weighted Metrics to Minimize Nearest-Neighbor Classification Error. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(7), 1100–1110 (2006)

  13. Takens, F.: On the Numerical Determination of the Dimension of the Attractor. In: Dynamical Systems and Bifurcations. Lecture Notes in Mathematics, vol. 1125, pp. 99–106. Springer, Berlin (1985)

  14. Weisstein, E. W.: Information Dimension. From MathWorld–A Wolfram Web Resource (2007), http://mathworld.wolfram.com/InformationDimension.html

  15. Friedman, J.H.: Flexible Metric Nearest Neighbor Classification. Technical Report, Dept. of Statistics, Stanford University, p. 32 (1994)

Editor information

Věra Kůrková, Roman Neruda, Jan Koutník

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jiřina, M., Jiřina, M. (2008). Correlation Integral Decomposition for Classification. In: Kůrková, V., Neruda, R., Koutník, J. (eds) Artificial Neural Networks - ICANN 2008. ICANN 2008. Lecture Notes in Computer Science, vol 5164. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87559-8_7

  • DOI: https://doi.org/10.1007/978-3-540-87559-8_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87558-1

  • Online ISBN: 978-3-540-87559-8

  • eBook Packages: Computer Science (R0)
