Learning target class eigen subspace (LTC-ES) via eigen knowledge grid

In one-class classification (OCC) tasks, only samples of the target class (the class of interest, CoI) are well defined during training, whereas samples of other classes are entirely absent. In OCC algorithms, high-dimensional data adds computational overhead on top of the intrinsic curse of dimensionality. Conventional dimensionality reduction (DR) techniques are unsuitable for target-class learning because they neglect the unique statistical properties of the CoI samples. In this context, the present research proposes a novel target-class-guided DR technique that extracts an eigen knowledge grid containing the most promising eigenvectors of the variance–covariance matrix of the CoI samples. In this process, the eigenvectors with the lowest and the highest eigenvalues are rejected via statistical analysis, because high variance may split the target class itself, whereas low variance does not contribute significant information. The identified eigen knowledge grid is then used to transform high-dimensional samples into a lower-dimensional eigen subspace. The proposed approach, named learning target class eigen subspace (LTC-ES), ensures strong separation of the target class from other classes. To show the effectiveness of the transformed lower-dimensional eigen subspace, a one-class support vector machine (OCSVM) is evaluated on a wide variety of benchmark datasets using: the original feature space, features transformed via eigenvectors of approximately 80%–90% cumulative variance, features transformed via the knowledge grid, and features transformed via eigenvectors of approximately 50% cumulative variance. Finally, a new performance measure called the stability factor is introduced to validate the robustness of the proposed approach.
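The eigenvector-selection idea described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: the parameters `n_reject_high` (number of highest-variance directions dropped) and `var_keep` (cumulative-variance cutoff for the low-variance tail) are hypothetical placeholders for the statistical analysis the paper uses to reject both ends of the spectrum.

```python
import numpy as np

def eigen_knowledge_grid(X_target, n_reject_high=1, var_keep=0.90):
    """Sketch of a mid-band eigenvector selection over CoI samples.

    Rejects the highest-eigenvalue eigenvectors (high variance may split
    the target class) and the low-variance tail (little information).
    The rejection thresholds here are illustrative assumptions.
    """
    # Center the target-class samples and eigendecompose their covariance.
    Xc = X_target - X_target.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)        # eigh returns ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]  # sort by descending variance
    # Drop the highest-variance directions...
    vals, vecs = vals[n_reject_high:], vecs[:, n_reject_high:]
    # ...then keep components until var_keep cumulative variance is reached,
    # discarding the low-variance tail.
    cum = np.cumsum(vals) / vals.sum()
    k = int(np.searchsorted(cum, var_keep)) + 1
    return vecs[:, :k]  # columns form the "eigen knowledge grid"

def to_eigen_subspace(X, X_target, grid):
    """Project samples onto the lower-dimensional eigen subspace."""
    return (X - X_target.mean(axis=0)) @ grid
```

The resulting lower-dimensional representation can then be fed to any OCC model, e.g. scikit-learn's `OneClassSVM`, as in the paper's evaluation setup.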

Turkish Journal of Electrical Engineering and Computer Sciences
  • ISSN: 1300-0632
  • Frequency: 6 issues per year
  • Publisher: TÜBİTAK
Learning target class eigen subspace (LTC-ES) via eigen knowledge grid

Sanjay Kumar Sonbhadra, Sonali Agarwal, P. Nagabhushan