Optimal set of EEG features in infant sleep stage classification

This paper evaluates six classification algorithms to assess the importance of individual EEG rhythms for the automatic classification of infant sleep. EEG features were obtained with the Fourier transform and with a novel technique based on empirical mode decomposition and the generalized zero-crossing method. Among the six classifiers, the support vector machine gave the best results when applied to the combination of all presented features from four EEG channels. Three methods of attribute ranking were assessed: Relief, principal component analysis, and wrapper-based optimized attribute weights. The outcomes revealed that an optimal feature set contains one feature from each significant frequency band, either a spectral feature or a frequency-dynamics feature; consequently, the feature set can be reduced to such a subset with minimal loss of classification accuracy.
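
For illustration, the sketch below shows one way such a pipeline could look: relative spectral band powers per channel as features and a support vector machine as the classifier. It is a minimal sketch under stated assumptions, not the authors' implementation; the sampling rate, epoch shape, band limits, and the variables `epochs` and `stages` are assumptions introduced here.

```python
# Minimal sketch, not the authors' implementation: relative band-power
# features per EEG channel followed by an SVM classifier. Sampling rate,
# epoch shape, and band limits are assumptions made for illustration.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

FS = 128                                       # assumed sampling rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8),
         "alpha": (8, 13), "beta": (13, 30)}   # assumed band limits (Hz)

def band_powers(epoch):
    """Relative power per band for one epoch of shape (n_channels, n_samples)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 4, axis=-1)
    total = psd.sum(axis=-1, keepdims=True)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=-1, keepdims=True) / total
             for lo, hi in BANDS.values()]
    return np.concatenate(feats, axis=-1).ravel()   # n_channels * n_bands values

def extract_features(epochs):
    """epochs: array of shape (n_epochs, n_channels, n_samples) -> feature matrix."""
    return np.vstack([band_powers(e) for e in epochs])

# Hypothetical usage: `epochs` and `stages` would come from scored recordings.
# X = extract_features(epochs)
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# print(cross_val_score(clf, X, stages, cv=10).mean())
```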
