Performance comparison of a new nonparametric independent component analysis algorithm for different entropic indexes


Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only entropy measure in the literature. In this paper, Tsallis entropy is used in place of Shannon entropy, and a novel ICA algorithm that uses kernel density estimation (KDE) to estimate the source distributions is proposed. Because the KDE is evaluated directly from the original data samples, it resolves an important problem in ICA: how to choose the nonlinear functions that serve as estimates of the probability density functions (pdfs) of the sources.
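
The paper itself gives no code, but the approach described in the abstract can be illustrated with a small sketch. The Python example below is not the authors' implementation; all function names, the grid-search strategy, the entropic index q, and the Silverman bandwidth are illustrative assumptions. It whitens a two-channel mixture, then searches the one remaining rotation angle for the minimum of the summed marginal Tsallis entropies, each estimated by a Gaussian KDE evaluated directly at the data samples.

```python
import numpy as np

def kde_pdf(samples, x, h):
    """Gaussian kernel density estimate of the sample pdf, evaluated at x."""
    z = (x[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2.0 * np.pi))

def tsallis_entropy(y, q, h=None):
    """Resubstitution estimate of S_q = (1 - E[p(y)^(q-1)]) / (q - 1)."""
    n = len(y)
    if h is None:
        h = 1.06 * y.std() * n ** (-1 / 5)   # Silverman's rule-of-thumb bandwidth
    p_hat = kde_pdf(y, y, h)                  # KDE evaluated at the data samples themselves
    return (1.0 - np.mean(p_hat ** (q - 1.0))) / (q - 1.0)

def whiten(X):
    """Zero-mean, unit-covariance (ZCA) whitening of the observed mixtures."""
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    return E @ np.diag(d ** -0.5) @ E.T @ Xc

def separate(X, q=2.0, n_angles=90):
    """Two-source demixing: after whitening, the demixing matrix reduces to a
    rotation, so grid-search the angle that minimizes the sum of the marginal
    Tsallis entropies. Angles in [0, pi/2) cover all separations up to
    permutation and sign for symmetric sources."""
    Z = whiten(X)
    best_cost, best_W = np.inf, np.eye(2)
    for theta in np.linspace(0.0, np.pi / 2.0, n_angles):
        c, s = np.cos(theta), np.sin(theta)
        W = np.array([[c, -s], [s, c]])
        cost = sum(tsallis_entropy(y, q) for y in W @ Z)
        if cost < best_cost:
            best_cost, best_W = cost, W
    return best_W @ Z

# Demo (hypothetical data): unmix a sine wave and uniform noise with q = 2.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0 * np.pi, 1000)
S = np.vstack([np.sin(t), rng.uniform(-1.0, 1.0, t.size)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # unknown mixing matrix
Y = separate(A @ S, q=2.0)
```

The design choice mirrors the standard MI argument: for whitened data, the joint Tsallis entropy is invariant under rotation (the Jacobian determinant is 1), so minimizing the sum of the marginal Tsallis entropies minimizes the rotation-dependent part of the Tsallis mutual information.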
