
Parity codes

Published online by Cambridge University Press:  15 March 2005

Paulo E. D. Pinto
Affiliation:
Universidade Estadual do Rio de Janeiro, Instituto de Matemática e Estatística, RJ, Brasil; [email protected]
Fábio Protti
Affiliation:
Universidade Federal do Rio de Janeiro, Instituto de Matemática and NCE, Caixa Postal 2324, 20001-970, Rio de Janeiro, RJ, Brasil; [email protected]
Jayme L. Szwarcfiter
Affiliation:
Universidade Federal do Rio de Janeiro, Instituto de Matemática, NCE and COPPE, Caixa Postal 2324, 20001-970, Rio de Janeiro, RJ, Brasil; [email protected]

Abstract

Motivated by a problem posed by Hamming in 1980, we define even codes. They are Huffman-type prefix codes with the additional property of being able to detect the occurrence of an odd number of 1-bit errors in the message. We characterize optimal even codes and describe a simple method for constructing them. Further, we compare optimal even codes with Huffman codes for equal frequencies. We show that the maximum encoding length in an optimal even code is at most two bits larger than the maximum encoding length in a Huffman tree. Moreover, it is always possible to choose an optimal even code such that this difference drops to one bit. We also compare average sizes and show that the average size of an encoding in an optimal even tree is at least 1/3 and at most 1/2 of a bit larger than that of a Huffman tree. These values represent the overhead in encoding size paid for the ability to detect an odd number of errors in the message. Finally, we discuss the case of arbitrary frequencies and describe some results for this situation.
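
For illustration only (this is not the construction from the paper), the short Python sketch below shows the parity argument behind the error-detection property: if every codeword contains an even number of 1s, then any valid encoded message has an even number of 1s in total, so an odd number of flipped bits is always detectable. The four-symbol code used here is a hypothetical example chosen only to be prefix-free with even-parity codewords.

    # Hypothetical four-symbol even prefix code: prefix-free, and every
    # codeword contains an even number of 1s.
    EVEN_CODE = {"a": "0", "b": "11", "c": "101", "d": "1001"}

    def encode(text):
        return "".join(EVEN_CODE[ch] for ch in text)

    def parity_ok(bits):
        # Each codeword has even parity, so every valid message has an even
        # number of 1s; an odd number of bit flips makes the total count odd.
        return bits.count("1") % 2 == 0

    message = encode("abcd")                     # '0111011001', six 1s
    assert parity_ok(message)

    # Flip a single bit: the total parity becomes odd and the error is detected.
    corrupted = message[:3] + ("0" if message[3] == "1" else "1") + message[4:]
    assert not parity_ok(corrupted)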

Type
Research Article
Copyright
© EDP Sciences, 2005

References

N. Faller, An Adaptive Method for Data Compression, in Record of the 7th Asilomar Conference on Circuits, Systems and Computers, Naval Postgraduate School, Monterey, CA (1973) 593–597.
R.G. Gallager, Variations on a Theme by Huffman. IEEE Trans. Inform. Theory 24 (1978) 668–674.
R.W. Hamming, Coding and Information Theory. Prentice Hall (1980).
D.A. Huffman, A Method for the Construction of Minimum-Redundancy Codes. Proc. IRE 40 (1952) 1098–1101.
D.E. Knuth, The Art of Computer Programming. Addison Wesley (1973).
D.E. Knuth, Dynamic Huffman Coding. J. Algorithms 6 (1985) 163–180.
E.S. Laber, Um algoritmo eficiente para construção de códigos de prefixo com restrição de comprimento [An efficient algorithm for constructing length-restricted prefix codes]. Master's Thesis, PUC-RJ, Rio de Janeiro (1997).
L.L. Larmore and D.S. Hirschberg, A fast algorithm for optimal length-limited Huffman codes. J. ACM 37 (1990) 464–473.
R.L. Milidiu, E.S. Laber and A.A. Pessoa, Improved Analysis of the FGK Algorithm. J. Algorithms 28 (1999) 195–211.
R.L. Milidiu and E.S. Laber, The Warm-up Algorithm: A Lagrangean Construction of Length Restricted Huffman Codes. SIAM J. Comput. 30 (2000) 1405–1426.
R.L. Milidiu and E.S. Laber, Improved Bounds on the Inefficiency of Length Restricted Codes. Algorithmica 31 (2001) 513–529.
A. Turpin and A. Moffat, Practical length-limited coding for large alphabets. Comput. J. 38 (1995) 339–347.
P.E.D. Pinto, F. Protti and J.L. Szwarcfiter, A Huffman-Based Error Detection Code, in Proc. of the Third International Workshop on Experimental and Efficient Algorithms (WEA 2004), Angra dos Reis, Brazil, 2004. Lect. Notes Comput. Sci. 3059 (2004) 446–457.
E.S. Schwartz, An Optimum Encoding with Minimal Longest Code and Total Number of Digits. Inform. Control 7 (1964) 37–44.