Pseudo-inequality application in coding theory using δ-norm inaccuracy measure
Journal of Hyperstructures
Volume 8, Issue 2, Esfand 2019, Pages 94-103 | Full Text (86.73 K)
Article Type: Research Paper
DOI: 10.22098/jhs.2019.2616
Author
Litegebe Wondie Alamirew*
Department of Mathematics, University of Gondar, P.O. Box +251196, Gondar, Ethiopia
Abstract
In this paper we prove two pseudo-generalizations of the Shannon inequality for the δ-norm inaccuracy measure and the δ-norm entropy. Further, we establish a noiseless coding theorem for the proposed mean code length in terms of the generalized inaccuracy measure.
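As background for this record (a sketch not drawn from the paper itself, whose δ-norm generalizations are defined only in the full text), the classical counterparts of the quantities named in the abstract are Kerridge's inaccuracy [10] and Shannon's entropy [15]:

\[
H(P;Q) = -\sum_{i=1}^{n} p_i \log_D q_i, \qquad H(P) = -\sum_{i=1}^{n} p_i \log_D p_i,
\]

for a true distribution \(P=(p_1,\dots,p_n)\) and an asserted distribution \(Q=(q_1,\dots,q_n)\). The Shannon inequality states \(H(P) \le H(P;Q)\), with equality iff \(P=Q\). The classical noiseless coding theorem says that the mean codeword length \(L=\sum_i p_i l_i\) of any uniquely decodable code over a \(D\)-symbol alphabet, whose lengths \(l_i\) satisfy Kraft's inequality \(\sum_i D^{-l_i} \le 1\), obeys \(L \ge H(P)\), and that lengths can always be chosen so that \(L < H(P)+1\). The paper's theorems replace these quantities by their δ-norm analogues.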
Keywords
Shannon inequality; δ-norm inaccuracy; δ-norm entropy; Codeword length; Kerridge's inaccuracy
References
[1] J. Aczel and Z. Daroczy, On Measures of Information and Their Characterizations, Academic Press, New York (1975).
[2] C. Arndt, Information Measures: Information and its Description in Science and Engineering, Springer, Berlin (2001).
[3] E. F. Beckenbach and R. Bellman, Inequalities, Springer, Berlin (1961).
[4] D. E. Boekee and J. C. A. Van der Lubbe, The R-norm information measure, Information and Control, 45 (1980), 136-155.
[5] L. L. Campbell, A coding theorem and Renyi's entropy, Information and Control, 8 (1965), 423-429.
[6] D. Markechova et al., R-norm entropy and R-norm divergence in fuzzy probability spaces, Entropy, 20(4) (2018), 272.
[7] A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York (1956).
[8] Gur Dial, On a coding theorem connected with entropy of order α and type β, Information Sciences, 30 (1983), 55-65.
[9] D. S. Hooda and D. K. Sharma, Generalized R-norm information measures, J. Appl. Math. Stat. Inform., 4 (2008), 153-168.
[10] D. F. Kerridge, Inaccuracy and inference, J. Roy. Statist. Soc. Ser. B, 23 (1961), 184-196.
[11] L. Wondie and S. Kumar, Some inequalities in information theory using Tsallis entropy, International Journal of Mathematics and Mathematical Sciences, 2018, 4 pages.
[12] Om Parkash and Priyanka Kakkar, Development of two mean codeword lengths, Information Sciences, 207 (2012), 92-97.
[13] Satish Kumar and Arun Choudhary, Some more noiseless coding theorems on generalized R-norm entropy, Journal of Mathematics Research, 3(1) (2011), 125-130.
[14] Satish Kumar, Arun Choudhary and Arvind Kumar, Some source coding theorems and 1:1 coding based on generalized inaccuracy measure of order α and type β, Commun. Math. Stat., 2 (2014), 125-138, DOI 10.1007/s40304-014-0032-z.
[15] C. E. Shannon, A mathematical theory of communication, Bell System Tech. J., 27 (1948), 379-423, 623-656.
[16] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys., 52 (1988), 479.
[17] J. C. A. Van der Lubbe, On certain coding theorems for the information of order α and type β, in: Information Theory, Statistical Decision Functions, Random Processes, Transactions of the Eighth Prague Conference, Prague (1978), 253-266.
Statistics: Article views: 45 | Full-text downloads: 52