A novel hybrid teaching-learning-based optimization algorithm for the classification of data by using extreme learning machines

Data classification is the process of organizing data into relevant categories so that the data can be understood and used more efficiently. Numerous methods have been proposed in the literature for the data classification problem. Nevertheless, recently introduced metaheuristics make it worthwhile to revisit this classical problem and investigate the efficiency of new techniques. Teaching-learning-based optimization (TLBO) is a recent metaheuristic that has been reported to be very effective for combinatorial optimization problems. In this study, we propose a novel hybrid TLBO algorithm with extreme learning machines (ELM) for the solution of data classification problems. The proposed algorithm (TLBO-ELM) is tested on a set of UCI benchmark datasets. The performance of TLBO-ELM is observed to be competitive with state-of-the-art algorithms on both binary and multiclass data classification problems.
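To make the ELM component concrete, the following is a minimal sketch of the standard extreme learning machine classifier: the hidden-layer weights are drawn at random and only the output weights are solved for in closed form via the Moore-Penrose pseudoinverse. This illustrates the generic ELM formulation only, not the specific TLBO-ELM hybrid of the paper; all function and variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, y, n_hidden=32):
    """Fit ELM output weights; hidden weights W, b stay random."""
    n_classes = int(y.max()) + 1
    T = np.eye(n_classes)[y]            # one-hot target matrix
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)              # hidden-layer activations
    beta = np.linalg.pinv(H) @ T        # closed-form output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = np.tanh(X @ W + b)
    return np.argmax(H @ beta, axis=1)

# Toy usage: two well-separated Gaussian clusters.
X = np.vstack([rng.normal(-2, 0.5, (50, 2)),
               rng.normal(2, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, b, beta = train_elm(X, y)
acc = (predict_elm(X, W, b, beta) == y).mean()
```

In a hybrid scheme such as the one the abstract describes, a metaheuristic like TLBO would typically search over ELM hyperparameters or feature subsets while the pseudoinverse step keeps each fitness evaluation cheap.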
