Rényi entropy in the continuous case is not the limit of the discrete case

Shannon entropy was introduced by Shannon (1948), drawing on his work at Bell Laboratories during and after the Second World War. Rényi (1961) later generalized it to a one-parameter family of entropies. For discrete random variables this entropy is non-negative, but in the continuous case it can be negative. In this paper, we show that the Rényi entropy of a continuous random variable is not the limit of the Rényi entropies of its discretized versions. We also derive some remarks concerning various versions of these entropy criteria.
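To make the gap concrete, here is a minimal heuristic sketch of the quantization argument, patterned on the Shannon-entropy case in Cover and Thomas [3]; the notation $X^\Delta$ for the quantized variable, $h_\alpha$ for the differential Rényi entropy, and the mean-value approximation below are our illustrative assumptions, not results of the paper. Let $X$ have density $f$ and fix $\alpha > 0$, $\alpha \neq 1$. Partitioning the line into bins of width $\Delta$ and letting $X^\Delta$ take the representative value $x_i$ of the $i$-th bin with probability $p_i = \int_{i\Delta}^{(i+1)\Delta} f(x)\,dx \approx f(x_i)\,\Delta$ gives

$$
H_\alpha(X^\Delta)
= \frac{1}{1-\alpha}\log\sum_i p_i^{\alpha}
\approx \frac{1}{1-\alpha}\log\!\Big(\Delta^{\alpha-1}\sum_i f(x_i)^{\alpha}\,\Delta\Big)
= -\log\Delta + \frac{1}{1-\alpha}\log\!\int f(x)^{\alpha}\,dx .
$$

So $H_\alpha(X^\Delta) + \log\Delta \to h_\alpha(X)$ while $H_\alpha(X^\Delta)$ itself diverges as $\Delta \to 0$: the differential Rényi entropy emerges only after subtracting the divergent term $-\log\Delta$, which is also why it can be negative even though every discrete Rényi entropy is non-negative.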

___

  • [1] Aczél, J., Forte, B. and Ng, C. T., Why the Shannon and Hartley entropies are "natural". Adv. Appl. Probab. (1974), 6, 131-146.
  • [2] Campbell, L. L., Exponential entropy as a measure of extent of a distribution. Zeitschr. für Wahrsch. und verw. Geb. (1966), 5, 217-225.
  • [3] Cover, T. M. and Thomas, J. A., Elements of Information Theory. Second Edition. Wiley Interscience. (2006).
  • [4] Daróczy, Z. and Maksa, Gy., Non-negative information functions. In: Analytic Function Methods in Probability and Statistics (Gyires, B., Ed.), Colloq. Math. Soc. J. Bolyai 21, North-Holland: Amsterdam. (1979), 65-76.
  • [5] Diderrich, G., The role of boundedness in characterizing Shannon entropy. Information and Control. (1975), 29, 140-161.
  • [6] Faddeev, D. K., On the concept of entropy of a finite probability scheme (in Russian). Uspehi Mat. Nauk. (1956), 11, 227-231.
  • [7] Hartley, R. V. L., Transmission of information. Bell System Technical Journal. (1928), 7, 535-563.
  • [8] Havrda, J. and Charvát, F., Quantification method of classification processes: Concept of structural α-entropy. Kybernetika. (1967), 3, 30-35.
  • [9] Koski, T. and Persson, L.E., Some properties of generalized exponential entropies with applications to data compression. Information Theory. 50 (1992), 6, 1220-1228.
  • [10] Lee, P. M., On the axioms of information theory. Ann. Math. Statist. (1964), 35, 415-418.
  • [11] Rényi, A., On measures of entropy and information. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability. (1961), 1, 547-561.
  • [12] Shannon, C. E., A mathematical theory of communication. Bell System Technical Journal. (1948), 27, 379-423.
  • [13] Tsallis, C., Possible generalizations of Boltzmann-Gibbs statistics. Journal of Statistical Physics. (1988), 52, 479-487.
  • [14] Tverberg, H., A new derivation of the information function. Math. Scand. (1958), 6, 297-298.