A Weighted Similarity Measure for k-Nearest Neighbors Algorithm

Classification is one of the most important problems in machine learning and has gained increasing attention in recent years. The k-nearest neighbors (kNN) algorithm is widely used for classification because it is a simple and effective method. However, several factors affect the performance of the kNN algorithm; one of them is the choice of an appropriate proximity (distance or similarity) measure. Although the Euclidean distance is often used as the proximity measure in applications of kNN, studies show that using different proximity measures can improve its performance. In this study, we propose the Weighted Similarity k-Nearest Neighbors algorithm (WS-kNN), which uses a weighted similarity as the proximity measure in the kNN algorithm. First, it calculates the weight of each attribute and the similarity between instances in the dataset. Then, it weights the similarities by the attribute weights and builds a weighted similarity matrix to use as the proximity measure. The proposed algorithm is compared with the classical kNN method based on the Euclidean distance. To verify its performance, experiments are conducted on 10 real-life datasets from the UCI (UC Irvine) Machine Learning Repository, with classification accuracy as the evaluation criterion. Experimental results show that the proposed WS-kNN algorithm achieves competitive classification accuracy, and for some datasets it gives markedly better results. In addition, we demonstrate that the choice of proximity measure can affect the classification accuracy of the kNN algorithm.
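To make the pipeline described above concrete, the following is a minimal Python sketch of a WS-kNN-style classifier. The abstract does not specify how attribute weights or per-attribute similarities are computed, so this sketch assumes an illustrative correlation-based weighting and a per-attribute similarity of 1/(1+|a-b|); the function names (ws_knn_predict, attribute_weights, weighted_similarity_matrix) are hypothetical, not from the paper.

```python
import numpy as np
from collections import Counter

def attribute_weights(X, y):
    """Illustrative weighting (assumption, not the paper's scheme):
    absolute correlation of each attribute with the class label,
    normalized so the weights sum to 1."""
    w = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    w = np.nan_to_num(w)        # guard against constant attributes
    return w / w.sum()

def weighted_similarity_matrix(X_train, X_test, weights):
    """Per-attribute similarity s = 1 / (1 + |a - b|) (an assumed choice),
    combined as a weighted sum over attributes. Returns an
    (n_test, n_train) weighted similarity matrix."""
    diffs = np.abs(X_test[:, None, :] - X_train[None, :, :])
    per_attr_sim = 1.0 / (1.0 + diffs)
    return per_attr_sim @ weights

def ws_knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test instance by majority vote among the k
    training instances with the LARGEST weighted similarity."""
    w = attribute_weights(X_train, y_train)
    S = weighted_similarity_matrix(X_train, X_test, w)
    preds = []
    for row in S:
        nn = np.argsort(row)[-k:]   # indices of the k most similar neighbors
        preds.append(Counter(y_train[nn]).most_common(1)[0][0])
    return np.array(preds)

# Toy usage on a small synthetic dataset
rng = np.random.default_rng(0)
X = rng.random((60, 4))
y = (X[:, 0] + 0.1 * rng.random(60) > 0.5).astype(int)
print(ws_knn_predict(X[:40], y[:40], X[40:], k=5))
```

Note the key difference from distance-based kNN: because the matrix holds similarities rather than distances, the neighbors are the entries with the largest values, not the smallest.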
