Some Source Coding Theorems and 1:1 Coding Based on Generalized Inaccuracy Measure of Order $\alpha $ and Type $\beta $

Satish Kumar, Arun Choudhary, Arvind Kumar

Communications in Mathematics and Statistics, 2014, Vol. 2, Issue 2: 125–138. DOI: 10.1007/s40304-014-0032-z

Abstract

In this paper, we establish noiseless coding theorems for a generalized parametric ‘useful’ inaccuracy measure of order $\alpha $ and type $\beta $ together with a generalized mean codeword length. Further, lower bounds on the exponentiated ‘useful’ codeword length of the best 1:1 code are obtained in terms of the useful inaccuracy of order $\alpha $ and type $\beta $ and the generalized average useful codeword length.
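The paper's specific 'useful' measures additionally weight each probability by a utility, and their exact parametric form is not reproduced on this page. As background for the abstract, a minimal sketch of the classical ingredients it builds on — Kerridge's inaccuracy, a Rényi-type order-$\alpha$ generalization, and Campbell's exponentiated mean codeword length — might look as follows (function names are illustrative, not from the paper):

```python
import math


def kerridge_inaccuracy(p, q, base=2):
    """Kerridge (1961) inaccuracy: H(P; Q) = -sum_i p_i * log(q_i).

    Equals the Shannon entropy of P when Q = P.
    """
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)


def inaccuracy_order_alpha(p, q, alpha, base=2):
    """Renyi-type inaccuracy of order alpha (alpha != 1):

        H_alpha(P; Q) = (1 / (1 - alpha)) * log sum_i p_i * q_i**(alpha - 1)

    and it reduces to Kerridge's measure in the limit alpha -> 1.
    """
    s = sum(pi * qi ** (alpha - 1) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s, base) / (1 - alpha)


def campbell_mean_length(p, lengths, t, D=2):
    """Campbell (1965) exponentiated mean codeword length:

        L(t) = (1 / t) * log_D sum_i p_i * D**(t * l_i),   t > 0,

    which tends to the ordinary average length sum_i p_i * l_i as t -> 0.
    """
    return math.log(sum(pi * D ** (t * li) for pi, li in zip(p, lengths)), D) / t
```

For a uniform source with matching prediction, both inaccuracy measures return 1 bit (`kerridge_inaccuracy([0.5, 0.5], [0.5, 0.5])`), and the source-coding theorems of the type proved in the paper bound such mean lengths below by the corresponding inaccuracy.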

Keywords

Generalized inaccuracy measures / Mean codeword length / Hölder’s inequality

Cite this article

Satish Kumar, Arun Choudhary, Arvind Kumar. Some Source Coding Theorems and 1:1 Coding Based on Generalized Inaccuracy Measure of Order $\alpha $ and Type $\beta $. Communications in Mathematics and Statistics, 2014, 2(2): 125–138. DOI: 10.1007/s40304-014-0032-z


