A Geometrical Modification of Learning Vector Quantization Method for Solving Classification Problems

In this paper, a geometrical scheme is presented to show how to overcome a problem arising from the use of the generalized delta learning rule within the competitive learning model. A theoretical methodology is introduced for describing the quantization of data via rotating prototype vectors on hyper-spheres. The proposed learning algorithm is tested and verified on different multidimensional datasets: a binary-class dataset and two multiclass datasets from the UCI repository, and a multiclass dataset constructed by us. The proposed method is compared with some baseline learning vector quantization variants from the literature in all domains. A large number of experiments verify the performance of the proposed algorithm, with acceptable accuracy and macro F1 scores.
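The abstract describes the method only at a high level. As a minimal illustrative sketch, and not the authors' algorithm, the following Python snippet shows one way an LVQ1-style winner-take-all update could be expressed as a rotation of a unit-norm prototype on the hypersphere; the function names, the angle proportional to the angular distance, and the attract/repel rule are our assumptions for illustration.

```python
import numpy as np

def rotate_toward(w, x, angle):
    """Rotate unit vector w by `angle` radians toward unit vector x, in the plane they span."""
    u = x - np.dot(x, w) * w          # component of x orthogonal to w
    norm = np.linalg.norm(u)
    if norm < 1e-12:                  # x is (anti)parallel to w: no well-defined rotation plane
        return w
    u /= norm
    return np.cos(angle) * w + np.sin(angle) * u   # result stays on the unit hypersphere

def lvq_rotation_step(prototypes, proto_labels, x, y, lr=0.05):
    """One hypothetical LVQ1-style step where the winning prototype is updated by rotation."""
    x = x / np.linalg.norm(x)                       # data assumed normalized onto the sphere
    sims = prototypes @ x                           # cosine similarities to unit-norm prototypes
    j = int(np.argmax(sims))                        # winner = smallest angular distance
    angle = lr * np.arccos(np.clip(sims[j], -1.0, 1.0))
    sign = 1.0 if proto_labels[j] == y else -1.0    # attract on matching label, repel otherwise
    prototypes[j] = rotate_toward(prototypes[j], x, sign * angle)
    return j
```

Because the update is a rotation, the prototype remains on the unit hypersphere by construction, which is the property a rotation-based quantizer needs; the exact attraction/repulsion schedule and winner selection used in the paper may differ.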
