A heuristic supervised Euclidean data difference dimension reduction for KNN classifier and its application to visual place classification

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this paper, we propose a novel supervised dimension reduction algorithm based on the K-nearest neighbor (KNN) classifier. The algorithm reduces the dimensionality of the data in order to improve the accuracy of KNN classification. This heuristic finds independent dimensions that decrease the Euclidean distance between a sample and its K nearest within-class neighbors while increasing the Euclidean distance between that sample and its M nearest between-class neighbors. It is a linear dimension reduction method that produces a mapping matrix for projecting data into a low-dimensional space. The dimension reduction step is followed by a KNN classifier, which makes the method applicable to high-dimensional multiclass classification. Experiments with artificial data sets such as Helix and Twin-peaks demonstrate the algorithm's ability for data visualization. The algorithm is compared with state-of-the-art methods on the classification of eight multiclass data sets from the UCI collection, and simulation results show that it outperforms the existing algorithms. Visual place classification is an important problem for intelligent mobile robots that not only involves high-dimensional data but also requires solving a multiclass classification problem. A suitable dimension reduction method is usually needed to decrease the computation and memory complexity of algorithms in large environments; therefore, our method is well suited to this problem. We extract color histograms of omnidirectional camera images as primary features, reduce the features to a low-dimensional space, and apply a KNN classifier. Experiments on five real data sets show the superiority of the proposed algorithm over others.
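The core idea described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact procedure: the squared-distance loss, the gradient-descent update, and the QR re-orthonormalization (used here to keep the learned dimensions independent) are assumptions chosen to make the sketch self-contained. It learns a mapping matrix `A` that pulls each sample toward its K nearest within-class neighbors and pushes it away from its M nearest between-class neighbors, with neighbors fixed in the original space.

```python
import numpy as np

def learn_projection(X, y, d, K=3, M=3, lr=0.01, epochs=50, seed=0):
    """Illustrative sketch: learn a linear map A (D x d) that shrinks the
    projected Euclidean distance from each sample to its K nearest
    within-class neighbors and enlarges the distance to its M nearest
    between-class neighbors."""
    rng = np.random.default_rng(seed)
    n, D = X.shape
    A, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal start

    # K nearest within-class and M nearest between-class neighbors,
    # computed once from pairwise Euclidean distances in the input space.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    within, between = [], []
    for i in range(n):
        same = np.flatnonzero(y == y[i])
        same = same[same != i]
        other = np.flatnonzero(y != y[i])
        within.append(same[np.argsort(dists[i, same])[:K]])
        between.append(other[np.argsort(dists[i, other])[:M]])

    # Minimize sum of projected within-class distances minus
    # between-class distances; d/dA ||A^T v||^2 = 2 v v^T A.
    for _ in range(epochs):
        grad = np.zeros_like(A)
        for i in range(n):
            for j in within[i]:    # pull within-class neighbors closer
                v = (X[i] - X[j])[:, None]
                grad += 2.0 * v @ (v.T @ A)
            for j in between[i]:   # push between-class neighbors away
                v = (X[i] - X[j])[:, None]
                grad -= 2.0 * v @ (v.T @ A)
        A, _ = np.linalg.qr(A - lr * grad / n)  # keep columns independent
    return A  # project with X @ A, then run a standard KNN classifier
```

After learning, classification proceeds by projecting both training and test data with `X @ A` and applying an ordinary KNN classifier in the reduced space.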



Author information

Correspondence to Saeed Shiry Ghidary.

Cite this article

Omranpour, H., Shiry Ghidary, S. A heuristic supervised Euclidean data difference dimension reduction for KNN classifier and its application to visual place classification. Neural Comput & Applic 27, 1867–1881 (2016). https://doi.org/10.1007/s00521-015-1979-8

