Journal of Applied Science and Engineering

Published by Tamkang University Press


Satish Kumar1, Arun Choudhary2 and Rajesh Kumar3

1Department of Mathematics, College of Natural Sciences, Arba-Minch University, Arba-Minch, Ethiopia
2Department of Mathematics, Geeta Institute of Management & Technology, Kanipla-136131, Kurukshetra, Haryana, India
3Department of Mathematics, Hindu College, University of Delhi, Delhi-7, India


 

Received: April 12, 2012
Accepted: November 12, 2014
Publication Date: December 1, 2014

DOI: https://doi.org/10.6180/jase.2014.17.4.12


ABSTRACT


A parametric mean length Lα is defined as the mean length of code words, where R > 0, 0 < α < 2, R + α ≠ 2, β > 0, and Σ pi = 1. Lower and upper bounds for Lα are derived in terms of the generalized R-norm information measure of type α.
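The two ingredients of such bounds can be sketched numerically: the Kraft inequality, which characterizes the codeword lengths admissible for a prefix code, and the R-norm information measure of Boekee and Van der Lubbe. The sketch below uses the original (un-generalized) R-norm measure H_R(P) = R/(R−1)·(1 − (Σ pᵢ^R)^(1/R)); the type-α generalization and the parameter β from the abstract are not reproduced here, and the function names are illustrative only.

```python
import math

def r_norm_entropy(p, R):
    """Boekee-Van der Lubbe R-norm information measure,
    H_R(P) = R/(R-1) * (1 - (sum p_i^R)^(1/R)), for R > 0, R != 1.
    The paper's generalized type-alpha measure extends this form."""
    assert R > 0 and R != 1
    return R / (R - 1) * (1.0 - sum(pi ** R for pi in p) ** (1.0 / R))

def satisfies_kraft(lengths, D=2):
    """Kraft inequality: a D-ary prefix code with codeword lengths
    l_1, ..., l_n exists iff sum D^(-l_i) <= 1."""
    return sum(D ** (-l) for l in lengths) <= 1.0

# Dyadic example: lengths 1, 2, 3, 3 realize codewords 0, 10, 110, 111.
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]
print(satisfies_kraft(lengths))        # Kraft sum is exactly 1
print(r_norm_entropy(p, R=2.0))        # R-norm entropy at R = 2
```

For this dyadic distribution the Kraft sum equals 1, so the lengths are those of a complete prefix code; comparing the mean codeword length Σ pᵢ lᵢ against the entropy value is the shape of the bounds the paper establishes.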


Keywords: Codeword Length, Kraft Inequality, Hölder's Inequality, Optimal Code Length, R-Norm Information Measure.

