
Channels without synchronization

Published online by Cambridge University Press:  01 July 2016

R. Ahlswede
Affiliation:
Ohio State University and University of Illinois
J. Wolfowitz
Affiliation:
University of Illinois

Abstract

Let X = {1, …, a} and Y = {1, …, a} be the input and output alphabets, respectively. In the (unsynchronized) channels studied in this paper, when an element of X is sent over the channel, the receiver receives either nothing or a sequence of k letters, each a member of Y, where k, determined by chance, can be 1, or 2, or … or L, a given integer. The channel is called unsynchronized because the sequence received (for each letter sent) is not separated from the previous sequence or the following sequence, so that the receiver does not know which letters received correspond to which letter transmitted.
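The channel model above can be illustrated with a minimal simulation sketch. The function and the toy per-letter channel below are assumptions for illustration, not part of the paper: `channel_fn(x)` stands in for a channel probability function that, for each input letter, emits either nothing or a run of at most L output letters, and the runs are concatenated with no separators, so run boundaries are lost to the receiver.

```python
import random

def transmit_unsynchronized(word, channel_fn, L):
    """Send each letter of `word` through `channel_fn` and concatenate
    the received runs without separators (synchronization is lost)."""
    received = []
    for x in word:
        out = channel_fn(x)          # 0 to L output letters for this input
        assert 0 <= len(out) <= L
        received.extend(out)         # no boundary markers between runs
    return received

# Toy channel (an assumption, not from the paper): input letter x is
# echoed k times, with k drawn uniformly from {0, 1, ..., L}.
L = 3
rng = random.Random(0)
def toy_channel(x):
    return [x] * rng.randint(0, L)

print(transmit_unsynchronized([1, 2, 1], toy_channel, L))
```

Because the output carries no run boundaries, a received sequence such as `[1, 1, 2]` could have arisen from several different transmitted words, which is exactly the decoding difficulty the paper addresses.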

In Sections 1 and 2 we give the necessary definitions and auxiliary results. In Section 3 we extend the results of Dobrushin [2] by proving a strong converse to the coding theorem and making it possible to compute the capacity to within any desired accuracy.

In Section 4 we study the same channel with feedback, prove a coding theorem and strong converse, and give an algorithm for computing the capacity.

In Section 5 we study the unsynchronized channel where the transmission of each word is governed by an arbitrary element of a set of channel probability functions. Again we obtain the capacity of the channel, prove a coding theorem and strong converse, and give an algorithm for computing the capacity.

In Section 6 we apply results of Shannon [4] and supplement Dobrushin's results on continuous transmission with a fidelity criterion.

Type
Research Article
Copyright
Copyright © Applied Probability Trust 1971 


References

[1] Ahlswede, R. (1968) Beiträge zur Shannonschen Informationstheorie im Falle nichtstationärer Kanäle. Z. Wahrscheinlichkeitsth. 10, 1–42.
[2] Dobrushin, R. L. (1967) Shannon's theorems for channels with synchronization errors. Problemy Peredači Informacii 3, 18–36. (Problems of Information Transmission, 11–26.)
[3] Golomb, S. W., Davey, J. R., Reed, J. S., Trees, H. L. and Stiffler, J. J. (1963) Synchronization. IEEE Trans. Comm. Syst. 11, 481–491.
[4] Shannon, C. E. (1960) Coding theorems for a discrete source with a fidelity criterion. In Information and Decision Processes, ed. by Machol, Robert E. McGraw-Hill, New York.
[5] Wolfowitz, J. (1957) The coding of messages subject to chance errors. Illinois J. Math. 1, 591–606.
[6] Wolfowitz, J. (1959) Simultaneous channels. Arch. Rational Mech. Anal. 4, 371–386.
[7] Wolfowitz, J. (1965) Approximation with a fidelity criterion. Proc. Fifth Berkeley Symp. on Math. Statist. and Prob. 1, 565–573.
[8] Wolfowitz, J. (1964) Coding Theorems of Information Theory. Springer, Berlin–Heidelberg–New York.
[9] Muroga, S. (1953) On the capacity of a discrete channel. Research and Development Data No. 5, Nippon Telegraph and Telephone Public Corporation, Tokyo.