k-NN-SSc: An Effective Similarity Score for k-NN Classifier

  • Conference paper
Advanced Computational and Communication Paradigms (ICACCP 2023)

Abstract

This paper proposes a new similarity score, k-NN-SSc, that is incorporated into the traditional k-NN classifier. The proposed score computes the similarity between two instances and is used to find the k nearest neighbors of an unknown instance. The effectiveness of k-NN-SSc is compared with the Euclidean, Minkowski, Manhattan, Chebyshev, and Cosine distance measures. Experimental results on 20 UCI repository datasets show that the proposed measure outperforms the competing measures in terms of Accuracy, F1 score, and Matthews Correlation Coefficient (MCC).
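
The abstract does not reproduce the k-NN-SSc formula, so the sketch below only illustrates the evaluation protocol it describes: a plain k-NN classifier with a pluggable distance function, compared across the Euclidean, Manhattan, Chebyshev, Minkowski, and Cosine measures using Accuracy, F1 score, and MCC. The dataset (Iris), the choice k = 5, and all helper names are assumptions made for illustration, not details taken from the paper.

    # Minimal sketch of the comparison protocol described in the abstract,
    # NOT the authors' k-NN-SSc implementation (its formula is not given here).
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, f1_score, matthews_corrcoef

    # Candidate distance measures; a new similarity score would be added here.
    def euclidean(a, b):
        return np.sqrt(np.sum((a - b) ** 2))

    def manhattan(a, b):
        return np.sum(np.abs(a - b))

    def chebyshev(a, b):
        return np.max(np.abs(a - b))

    def minkowski(a, b, p=3):
        return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

    def cosine(a, b):
        return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    def knn_predict(X_train, y_train, X_test, k, dist):
        """Majority vote among the k training instances closest to each test point."""
        preds = []
        for x in X_test:
            d = np.array([dist(x, xt) for xt in X_train])
            nearest = np.argsort(d)[:k]                      # indices of the k nearest neighbours
            labels, counts = np.unique(y_train[nearest], return_counts=True)
            preds.append(labels[np.argmax(counts)])          # most frequent class wins
        return np.array(preds)

    # Iris is used only as a stand-in dataset; the paper evaluates on 20 UCI datasets.
    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for name, dist in [("Euclidean", euclidean), ("Manhattan", manhattan),
                       ("Chebyshev", chebyshev), ("Minkowski p=3", minkowski),
                       ("Cosine", cosine)]:
        y_hat = knn_predict(X_tr, y_tr, X_te, k=5, dist=dist)
        print(f"{name:13s} acc={accuracy_score(y_te, y_hat):.3f}  "
              f"f1={f1_score(y_te, y_hat, average='macro'):.3f}  "
              f"mcc={matthews_corrcoef(y_te, y_hat):.3f}")

A similarity score such as k-NN-SSc would slot into the same loop, with the distance replaced by (1 − similarity) so that smaller values still mean "closer".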

Author information

Corresponding author

Correspondence to Robindro Singh Khumukcham.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Khumukcham, R.S., Takhellambam, L., Urikhimbam, B.C., Yambem, R., Hoque, N. (2023). k-NN-SSc: An Effective Similarity Score for k-NN Classifier. In: Borah, S., Gandhi, T.K., Piuri, V. (eds) Advanced Computational and Communication Paradigms. ICACCP 2023. Lecture Notes in Networks and Systems, vol 535. Springer, Singapore. https://doi.org/10.1007/978-981-99-4284-8_4
