
Recent results in information theory

Published online by Cambridge University Press: 14 July 2016

Samuel Kotz*
Affiliation:
University of Toronto

Extract

Information theory, in the strict sense, is a rapidly developing branch of probability theory originating from a paper by Claude E. Shannon in the Bell System Technical Journal in 1948, in which a new mathematical model of communication systems was proposed and investigated.

One of the central innovations of this model was in regarding the prime components of a communication system (the source of messages and the communication channel) as probabilistic entities. Shannon also proposed a quantitative measure of the amount of information based on his notion of entropy and proved the basic theorem of this theory concerning the possibility of reliable transmission of information over a particular class of noisy channels.
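To fix ideas, in the standard discrete formulation (stated here for orientation, not quoted from the paper under review): for a source emitting symbols with probabilities $p_1, \dots, p_n$, and a discrete memoryless channel with transition probabilities $p(y \mid x)$, Shannon's entropy and the channel capacity are

\[
H = -\sum_{i=1}^{n} p_i \log p_i, \qquad
C = \max_{p(x)} I(X;Y) = \max_{p(x)} \sum_{x,y} p(x)\,p(y \mid x)\,\log \frac{p(y \mid x)}{\sum_{x'} p(x')\,p(y \mid x')},
\]

and the coding theorem asserts that transmission with arbitrarily small probability of error is possible at every rate below $C$ and impossible at rates above $C$.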

Type: Review Paper
Copyright © Applied Probability Trust, Sheffield


References

[1] Abramson, N. (1960) A partial ordering for binary channels. IRE Trans. Information Theory 6, 529–539.
[2] Aczél, J. (1964) Zur gemeinsamen Charakterisierung der Entropien α-ter Ordnung und der Shannonischen Entropie bei nicht unbedingt vollständigen Verteilungen. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 3, 177–183.
[3] Aczél, J. and Daróczy, Z. (1963) Charakterisierung der Entropien positiver Ordnung und der Shannonischen Entropie. Acta Math. Acad. Sci. Hungar. 14, 95–121.
[4] Adler, R. L. (1961) Ergodic and mixing properties of infinite memory channels. Proc. Amer. Math. Soc. 12, 924–930.
[5] Ash, R. B. (1963) Capacity and error for a time-continuous Gaussian channel. Information and Control 6, 14–27.
[6] Ash, R. B. (1964) Further discussion of a time-continuous Gaussian channel. Information and Control 7, 78–83.
[7] Ash, R. B. (1965) A simple example of a channel for which the strong converse fails. IEEE Trans. Information Theory 11, 456–457.
[8] Bellman, R. and Kalaba, R. (1957) On the role of dynamic programming in statistical communication theory. IRE Trans. Information Theory 3, 197–203.
[8a] Billingsley, P. (1961) On the coding theorem for the noiseless channel. Ann. Math. Statist. 32, 576–601.
[9] Birch, J. J. (1962) Approximations for the entropy for functions of Markov chains. Ann. Math. Statist. 33, 930–938.
[10] Birch, J. J. (1963) On information rates for finite-state channels. Information and Control 6, 372–380.
[11] Blachman, N. M. (1962) On the capacity of a band-limited channel perturbed by statistically dependent interference. IRE Trans. Information Theory 8, 1, 48–55.
[12] Blachman, N. M. (1962) The effect of statistically dependent interference upon channel capacity. Ibid. 8, 5, 53–57.
[13] Blachman, N. M. (1965) The convolution inequality for entropy powers. Ibid. 11, 2, 267–271.
[14] Blackwell, D. (1957) The entropy of functions of finite-state Markov chains. Trans. First Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, Prague, 13–20.
[15] Blackwell, D. (1960) Infinite codes for memoryless channels. Ann. Math. Statist. 30, 1242–1244.
[16] Blackwell, D. (1961) Exponential error bounds for finite-state channels. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 57–63. University of California Press, Berkeley.
[17] Blackwell, D., Breiman, L. and Thomasian, A. J. (1958) Proof of Shannon's transmission theorem for finite-state indecomposable channels. Ann. Math. Statist. 29, 1209–1220.
[18] Blackwell, D., Breiman, L. and Thomasian, A. J. (1959) The capacity of a class of channels. Ann. Math. Statist. 30, 1229–1241.
[19] Blackwell, D., Breiman, L. and Thomasian, A. J. (1960) The capacities of certain channel classes under random coding. Ann. Math. Statist. 31, 558–567.
[20] Bongard, M. (1963) On the notion of useful information. Problemy Kibernet. 9, 71–102 (in Russian).
[21] Breiman, L. (1957) The individual ergodic theorem of information theory. Ann. Math. Statist. 28, 809–811.
[21a] Breiman, L. (1960) A correction to the above paper. Ann. Math. Statist. 31, 809–810.
[22] Breiman, L. (1960) On achieving channel capacity in finite-memory channels. Illinois J. Math. 4, 246–252.
[23] Breiman, L. (1960) Finite-state channels. Trans. 2nd Prague Conf. Information Theory, Statist. Decision Functions, Random Processes, 49–60. Academic Press, New York.
[24] Brillouin, L. (1964) Scientific Uncertainty and Information. Academic Press, New York.
[25] Brown, T. A. (1963) Entropy and conjugacy. Ann. Math. Statist. 34, 226–232.
[26] Campbell, L. L. (1965) Entropy as a measure. IEEE Trans. Information Theory 11, 112–114.
[27] Campbell, L. L. (1965) A coding theorem and Rényi entropy. Information and Control 8, 423–429.
[27a] Campbell, L. L. (1965) Definition of entropy by means of a coding problem. (Manuscript).
[28] Carleson, L. (1958) Two remarks on the basic theorems of information theory. Math. Scand. 6, 175–180.
[29] Carlyle, J. W. (1964) On the external probability structure of finite-state channels. Information and Control 7, 385–397.
[30] Chang, T. T. and Lawton, J. G. (1962) Partial ordering of discrete channels. IRE Internat. Convention Record 10, 190–199.
[31] Chang, T. T. and Lawton, J. G. (1964) On the comparison of communication channels. IRE Trans. Information Theory 10, 97–98.
[32] Chung, K. L. (1961) A note on the ergodic theorem of information theory. Ann. Math. Statist. 32, 612–614.
[33] Csiszár, I. (1961) Some remarks on the dimension and entropy of random variables. Acta Math. Acad. Sci. Hungar. 12, 399–408.
[34] Csiszár, I. (1962) Informationstheoretische Konvergenzbegriffe im Raum der Wahrscheinlichkeitsverteilungen. Magyar Tud. Akad. Mat. Kutato Int. Közl. 7, 137–158.
[35] Csiszár, I. (1963) Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markovschen Ketten. Magyar Tud. Akad. Mat. Kutato Int. Közl. 8, 85–108.
[36] Csiszár, I. (1964) Über topologische und metrische Eigenschaften der relativen Information der Ordnung α. Trans. Third Prague Conf. Information Theory, Statistical Decision Functions, Random Processes, 63–73. Publ. House Czech. Acad. Sci., Prague.
[37] Csiszár, I. and Fischer, J. (1962) Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen. Magyar Tud. Akad. Mat. Kutato Int. Közl. 7, 159–180.
[38] Caregradski, I. P. (1958) On the capacity of a stationary channel with finite memory. Teor. Verojatnost. i Primenen. 3, 84–96 (in Russian).
[39] Cybakov, B. S. (1961) Shannon scheme for Gaussian message with a uniform spectrum and channel with fluctuating noise. Radiotekhnika i Electronika VI, 649–651 (in Russian).
[40] Daróczy, Z. (1963) On the common characterization of Shannon's and Rényi's measures of entropy in incomplete distributions. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 1, 381–388.
[41] Daróczy, Z. (1964) Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen. Acta Math. Acad. Sci. Hungar. 15, 203–210.
[42] Dobrushin, R. L. (1958) The transmission of information along a channel with feedback. Teor. Verojatnost. i Primenen. 3, 395–412 (in Russian).
[43] Dobrushin, R. L. (1959) A general formulation of the fundamental Shannon theorem in information theory. Uspehi Mat. Nauk 14, 3–104 (in Russian).
[44] Dobrushin, R. L. (1959) General formulation of Shannon's basic theorems of the theory of information. Dokl. Akad. Nauk SSSR 126, 474 (in Russian).
[45] Dobrushin, R. L. (1960) Approach to the limit under the sign of information and entropy. Teor. Verojatnost. i Primenen. 5, 29–39 (in Russian).
[46] Dobrushin, R. L. (1961) Mathematical problems in Shannon theory of optimal coding of information. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability, 211–252. University of California Press, Berkeley.
[47] Dobrushin, R. L. (1962) Optimal binary codes for small rates of transmission of information. Teor. Verojatnost. i Primenen. 7, 208–213 (in Russian).
[48] Dobrushin, R. L. (1962) Asymptotic bounds for the probability of error of information transmission over a discrete memoryless channel with a symmetric transition matrix. Teor. Verojatnost. i Primenen. 7, 283–311 (in Russian).
[49] Dobrushin, R. L. (1962) An asymptotic bound for the probability of error of information transmission through a channel without memory using feedback. Problemy Kibernet. 8, 161–168 (in Russian).
[50] Dobrushin, R. L. (1963) Asymptotic optimality for grouped and systematic codes for certain channels. Teor. Verojatnost. i Primenen. 8, 52–66 (in Russian).
[51] Dobrushin, R. L. (1963) Unified methods of information for discrete channels without memory and for communications with independent components. Dokl. Akad. Nauk SSSR 148, 1245–1248 (in Russian).
[52] Dobrushin, R. L. (1964) On the sequential method of Wozencraft-Reiffen. Problemy Kibernet. 12, 113–123 (in Russian).
[53] Dobrushin, R. L. and Cybakov, B. S. (1962) Information transmission with additional noise. IRE Trans. Information Theory 5, 293–304.
[54] Echigo, M. and Nakamura, M. (1962) A remark on the concept of channels. Proc. Japan Acad. 38, 307–309.
[55] Eisenberg, E. (1963) On channel capacity. Internal Technical Memorandum M-35, Electronic Research Labs., University of California.
[56] Elias, P. (1955) Coding for two noisy channels. Proc. Third London Symp. on Information Theory, 61–76. Butterworth Scientific Publications, London.
[57] Elias, P. (1963) Information theory and decoding computations. Proc. Symp. Appl. Math. XV, 51–58. Amer. Math. Soc., Providence, R. I.
[58] Epstein, M. A. (1958) Algebraic decoding for a binary erasure channel. IRE Convention Record 4, 56–59.
[59] Epstein, M. A. (1963) Improved computational bound for the binary erasure channel. IEEE Trans. Information Theory 9, 51.
[60] Erdös, P. (1946) On the distribution function of additive functions. Ann. of Math. 47, 1–20.
[61] Erdös, P. and Rényi, A. (1963) On two problems of information theory. Publ. Math. Inst. Hungarian Academy of Sciences, Vol. VIII, Series A, Fasc. 1–2, 229–243.
[62] Faddeev, D. K. (1958) On the concept of the entropy for a finite probability model. Uspehi Mat. Nauk 11, 227–231 (in Russian).
[63] Fano, R. M. (1961) Transmission of Information. M.I.T. Press, Cambridge, Mass., and John Wiley, New York.
[64] Fano, R. M. (1963) A heuristic discussion of probabilistic decoding. IEEE Trans. Information Theory 9, 64–74.
[65] Feinstein, A. (1958) Foundations of Information Theory. McGraw-Hill, New York.
[66] Feinstein, A. (1959) On the coding theorem and its converse for finite-memory channels. Information and Control 2, 25–44.
[67] Feinstein, A. (1954) A new basic theorem of information theory. IRE Trans. PGIT 4, 2–22.
[68] Feldman, J. (1958) Equivalence and perpendicularity of Gaussian processes. Pacific J. Math. 8, 699–708.
[69] Ferguson, J. D. (1964) Entropy for noninvertible transformations. Proc. Amer. Math. Soc. 15, 895–898.
[70] Fisher, R. A. (1925) Theory of statistical estimation. Proc. Cambridge Philos. Soc. 22, 700–725.
[71] Fortet, R. (1961) Hypothesis testing and estimation for Laplacian functions. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 289–305.
[72] Fraser, D. A. S. (1965) On information in statistics. Ann. Math. Statist. 36, 890–897.
[73] Gallager, R. G. (1962) Low-density parity-check codes. IEEE Trans. Information Theory 8, 21–28.
[73a] Gallager, R. G. (1963) A monograph under the same title. M.I.T. Press, Cambridge, Mass.
[74] Gallager, R. G. (1965) A simple derivation of the coding theorem and some applications. IEEE Trans. Information Theory 11, 3–17.
[75] Gelfand, I. M. (1955) Generalized random processes. Dokl. Akad. Nauk SSSR 100, 853–856 (in Russian).
[76] Gelfand, I. M., Kolmogorov, A. N. and Yaglom, A. M. (1956) Towards a general definition of the quantity of information. Dokl. Akad. Nauk SSSR 111, 745–748 (in Russian).
[77] Gelfand, I. M., Kolmogorov, A. N. and Yaglom, A. M. (1958) The amount of information and entropy for continuous distributions. Proc. Third Math. Congress, Moscow, Izd. Akad. Nauk SSSR 3, 300–320 (in Russian).
[78] Gelfand, I. M. and Yaglom, A. M. (1957) On the calculation of the quantity of information about a random function contained in another such function. Uspehi Mat. Nauk 12, 3–52 (in Russian).
[79] Goblick, T. J. (1962) Coding for a discrete information source with a distortion measure. Ph.D. thesis, M.I.T.
[80] Good, I. J. (1963) Maximum entropy for hypothesis formulation, especially for multidimensional contingency tables. Ann. Math. Statist. 34, 911–934.
[81] Guseva, O. V. (1964) Invariants of entropy type for automorphisms of a Lebesgue space. Vestnik Leningrad. Univ. Ser. Mat. Meh. Astronom. 19, 36–41 (in Russian).
[82] Halmos, P. R. (1961) Recent progress in ergodic theory. Bull. Amer. Math. Soc. 67, 70–80.
[83] Hajek, J. (1958) On a property of normal distributions of arbitrary stochastic processes. Czechoslovak Math. J. 8, 610–618.
[84] Hu, Go-Din (1961) Three kinds of converses to Shannon's theorem in information theory. Acta Math. Sinica 11 (in Chinese); translated as Chinese Math. 2 (1963), 293–332.
[85] Hu, Go-Din (1962) Information stability of sequences of channels. Teor. Verojatnost. i Primenen. 7, 271–282 (in Russian).
[86] Hu, Go-Din (1962) On information quantity. Teor. Verojatnost. i Primenen. 7, 447–455 (in Russian).
[87] Hu, Go-Din (1964) On Shannon theorem and its converse for sequences of communication schemes in the case of abstract random variables. Trans. Third Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 285–332. Publ. House Czechoslovak Acad. Sci., Prague.
[88] Huang, R. Y. and Johnson, R. A. (1962) Information capacity of time-continuous channels. IRE Trans. Information Theory 8, 191–198.
[89] Huang, R. Y. and Johnson, R. A. (1963) Information transmission with time-continuous random processes. IEEE Trans. Information Theory 9, 84–89.
[90] Ibragimov, I. A. (1962) Some limiting theorems for stationary processes. Teor. Verojatnost. i Primenen. 7, 361–392 (in Russian).
[91] Ingarden, R. S. (1963) Information theory and variational principles in statistical theories. Bull. Acad. Polon. Sci. Ser. Sci. Math. Astronom. Phys. 11, 541–547.
[92] Ingarden, R. S. and Urbanik, K. (1962) Information without probability. Colloq. Math. 9, 131–150.
[93] Ito, K. (1954) Stationary random distributions. Mem. College Sci. Univ. Kyoto Ser. A, 28, 209–223.
[94] Jacobs, K. (1959) Die Übertragung diskreter Informationen durch periodische und fastperiodische Kanäle. Math. Ann. 137, 125–135.
[95] Jacobs, K. (1960) Über die Durchlasskapazität periodischer und fastperiodischer Kanäle. Trans. 2nd Prague Conf. Information Theory, 231–249. Publ. House Czechoslovak Akad. Sci., Prague.
[96] Jacobs, K. (1962) Über die Struktur der mittleren Entropie. Math. Z. 78, 33–43.
[97] Jacobs, K. (1962) Über Kanäle vom Dichtetypus. Math. Z. 78, 151–170.
[98] Jacobs, K. (1962) Almost periodic channels. Colloquium on Combinatorial Methods in Probability Theory, Matematisk Institut, Aarhus University, 118–136.
[99] Jacobs, K. (1962) Review of J. Wolfowitz's "Coding Theorems of Information Theory". Jahresbericht der Deutschen Mathematiker-Vereinigung 65, 1–6.
[100] Jelinek, F. (1963) Loss in information transmission through two-way channel. Information and Control 6, 337–371.
[101] Jelinek, F. (1964) Coding for and decomposition of two-way channels. IEEE Trans. Information Theory 10, 5–17.
[102] Jelinek, F. (1965) Indecomposable channels with side information at the transmitter. Information and Control 8, 36–55.
[103] Karmazin, M. A. (1964) Solution of a problem of Shannon. Problemy Kibernet. 11, 263–266 (in Russian).
[104] Kemperman, J. H. B. (1962) Studies in coding theory I. Manuscript, to be published in Illinois J. Math.
[105] Kendall, D. G. (1964) Functional equations in information theory. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 2, 225–229.
[106] Kendall, D. G. (1964) Information theory and the limit-theorem for Markov chains and processes with a countable infinity of states. Ann. Inst. Statist. Math. XV, 137–143.
[107] Kesten, H. (1961) Some remarks on the capacity of compound channels in the semicontinuous case. Information and Control 4, 169–184.
[108] Khinchin, A. I. (1957) Mathematical Foundations of Information Theory. Dover Publications, New York (English translation).
[109] Kiefer, J. and Wolfowitz, J. (1962) Channels with arbitrarily varying channel probability functions. Information and Control 5, 44–54.
[110] Kolmogorov, A. N. (1957) The theory of the transmission of information. 1956 Plenary session of the Academy of Sciences of the USSR on the automatization of production, Moscow, Izd. Akad. Nauk SSSR, 66–99 (in Russian).
[111] Kolmogorov, A. N. (1959) Entropy per unit time as a metric invariant of automorphisms. Dokl. Akad. Nauk SSSR 124, 754–755 (in Russian).
[112] Koshelev, V. N. (1963) Quantization with minimal entropy. Problemy Peredachi Informacii 14, 151–156 (in Russian).
[113] Kotz, S. (1961) Exponential bounds on the probability of error for a discrete memoryless channel. Ann. Math. Statist. 32, 577–582.
[114] Kotz, S. (1965) Some inequalities for convex functions useful in information theory. SIAM Review 7, 395–402.
[115] Kullback, S. (1959) Information Theory and Statistics. John Wiley, New York.
[116] Lee, P. M. (1964) On the axioms of information theory. Ann. Math. Statist. 35, 415–417.
[117] Linkov, U. N. (1965) Computation of ε-entropy of random variables for small ε. Problemy Peredachi Informacii, 18–26 (in Russian).
[118] Ma, Hsi-Wen (1964) Feinstein's lemma for a finite system of channels. Chinese Math. 5, 316–329.
[119] Mao, Shi-Sun (1965) Asymptotics of the optimal probability of error for the transmission of information in a memoryless channel which is symmetric with respect to pairs of input symbols, for small rates of transmission. Teor. Verojatnost. i Primenen. 10, 167–175 (in Russian).
[120] McMillan, B. (1953) The basic theorems of information theory. Ann. Math. Statist. 24, 196–219.
[121] Moy, Shu-Teh C. (1961) Generalizations of Shannon-McMillan theorem. Pacific J. Math. 11, 705–714.
[122] Moy, Shu-Teh C. (1961) A note on generalizations of Shannon-McMillan theorem. Pacific J. Math. 11, 1459–1465.
[123] Muroga, S. (1953) On the capacity of a discrete channel I. J. Phys. Soc. Japan 8, 484–494.
[124] Muroga, S. (1956) On the capacity of a discrete channel II. J. Phys. Soc. Japan 11, 1109–1120.
[125] Nedoma, J. (1957) The capacity of a discrete channel. Trans. First Prague Conf. Information Theory, Statistical Decision Functions and Random Processes, 143–182. Prague.
[126] Nedoma, J. (1960) On non-ergodic channels. Trans. 2nd Prague Conf. Information Theory, 363–395. Prague.
[127] Nedoma, J. (1964) The synchronization for ergodic channels. Trans. 3rd Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 529–539. Prague.
[128] Nedoma, J. (1963) Die Kapazität der periodischen Kanäle. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 2, 98–110.
[129] Ovseevich, I. A. (1963) Capacities of a multipath system. Problemy Peredachi Informacii, 43–58 (in Russian).
[130] Ovseevich, I. A. and Pinsker, M. S. (1961) On the capacity of a multipath system. Izv. Akad. Nauk Energet. i Avtomat. 4, 208–210 (in Russian).
[131] Parry, W. (1963) An ergodic theorem of information theory without invariant measure. Proc. London Math. Soc. 13, 605–612.
[132] Parthasarathy, K. R. (1961) On the integral representation of the rate of transmission of a stationary channel. Illinois J. Math. 5, 299–305.
[133] Parthasarathy, K. R. (1963) Effective entropy rate and transmission of information through channels with additive random noise. Sankhya, Indian J. Statist. A25, 75–84.
[134] Parthasarathy, K. R. (1964) A note on McMillan's theorem for countable alphabets. Trans. 3rd Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 541–543. Prague.
[135] Perez, A. (1957) Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie de martingales. Trans. First Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 183–208. Prague.
[136] Perez, A. (1957) Sur la théorie de l'information dans le cas d'un alphabet abstrait. Trans. First Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 209–244. Prague.
[137] Perez, A. (1957) Sur la convergence des incertitudes, entropies et information échantillon vers leurs valeurs vraies. Trans. First Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 245–252. Prague.
[138] Perez, A. (1959) Information theory with an abstract alphabet. Generalized aspects of McMillan's theorem for the case of discrete and continuous time. Teor. Verojatnost. i Primenen. 4, 105–109 (in Russian).
[139] Perez, A. (1964) Extensions of Shannon-McMillan's limit theorem to more general stochastic processes. Trans. Third Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 545–574. Prague.
[140] Peterson, W. W. (1961) Error Correcting Codes. M.I.T. Press and John Wiley, New York.
[141] Peterson, W. W. and Massey, J. (1963) Coding theory. IEEE Trans. Information Theory 9, 223–229.
[142] Pinsker, M. S. (1954) The quantity of information about a Gaussian random process contained in a second process which is stationary with respect to it. Dokl. Akad. Nauk SSSR 99, 213–216 (in Russian).
[143] Pinsker, M. S. (1956) Amount of information about a stationary random process contained in another stationary process. Proc. Third All-Union Math. Conf. 3, Izd. Akad. Nauk SSSR (in Russian).
[144] Pinsker, M. S. (1956) The evaluation of the rate of creation of messages by a stationary random process and the capacity of a stationary channel. Dokl. Akad. Nauk SSSR 111, 753–756 (in Russian).
[145] Pinsker, M. S. (1958) Extrapolation of vector random processes and the amount of information contained in one stationary vector random process relative to another which is stationarily correlated with it. Dokl. Akad. Nauk SSSR 121, 49–51 (in Russian).
[146] Pinsker, M. S. (1960) Information stability of Gaussian random variables and processes. Dokl. Akad. Nauk SSSR 133, 28–30 (in Russian).
[147] Pinsker, M. S. (1960) The entropy, the rate of establishment of entropy and entropic stability of Gaussian random variables and processes. Dokl. Akad. Nauk SSSR 133, 531–534 (in Russian).
[148] Pinsker, M. S. (1960) Information and Stability of Random Variables and Processes. Moscow, Izd. Akad. Nauk SSSR (in Russian). (English translation, 1964).
[149] Pinsker, M. S. (1963) Sources of messages. Problemy Peredachi Informacii 14, 5–20 (in Russian).
[150] Pinsker, M. S. (1963) Gaussian sources. Problemy Peredachi Informacii 14, 59–100 (in Russian).
[151] Pinsker, M. S. (1965) On decoding complexity. Problemy Peredachi Informacii 1, 113–116 (in Russian).
[152] Rajski, C. (1961) A metric space of discrete probability distributions. Information and Control 4, 371–377.
[153] Rajski, C. (1963) On the normed information rate of discrete random variables. Zastosowania Matematyki 6, 459–462.
[154] Rajski, C. (1964) On the normed information rate of discrete random variables. Trans. Third Prague Conf. Information Theory, Statist. Decision Functions, Random Processes, 583–585. Prague.
[154a] Rao Uppuluri, V. R. (1963) A strong converse to the coding theorem for continuous memoryless channels. Ph.D. Thesis (Part II), Indiana University.
[155] Ratner, M. E. (1965) Asymptotics of the optimal probability of error in the transmission of information over a continuous memoryless erasure symmetric channel. Problemy Kibernet. 13, 115–130 (in Russian).
[156] Reiffen, B. (1962) Sequential decoding for discrete input memoryless channels. IRE Trans. Information Theory 8, 203–220.
[157] Rényi, A. (1959) On the dimension and entropy of probability distributions. Acta Math. Acad. Sci. Hungar. 10, 193–215.
[158] Rényi, A. (1959) Dimension, entropy and information. Trans. 2nd Prague Conf., 1960, 545–556. Prague.
[159] Rényi, A. (1961) On measures of entropy and information. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 541–561. University of California Press, Berkeley.
[160] Rényi, A. (1965) On the foundations of information theory (with discussion). Rev. Int. Statist. Inst. 33, 1–14.
[161] Rényi, A. (1965) On some basic problems of statistics from the point of view of information theory. Proc. Fifth Berkeley Symposium on Mathematical Statistics and Probability (in print).
[162] Rényi, A. and Balatoni, J. (1957) Über den Begriff der Entropie. Math. Forsch., 117–134.
[163] Rosenblatt-Roth, M. (1957) Entropy of stochastic processes. Dokl. Akad. Nauk SSSR 112, 16–19.
[164] Rosenblatt-Roth, M. (1957) The theory of the transmission of information through statistical communication channels. Dokl. Akad. Nauk SSSR 112, 202–205 (in Russian).
[165] Rosenblatt-Roth, M. (1960) The normalized ε-entropy of sets and the transmission of information from continuous sources through continuous communication channels. Dokl. Akad. Nauk SSSR 130, 265–268 (in Russian).
[166] Rosenblatt-Roth, M. (1960) Normed ε-entropy of sets and theory of information transfer. Trans. 2nd Prague Conf. Information Theory, 569–577. Prague.
[167] Rosenblatt-Roth, M. (1964) The notion of entropy in the theory of probabilities and its application in the theory of information transmission through noisy channels. Teor. Verojatnost. i Primenen. 9, 246–261 (in Russian).
[168] Rosenblatt-Roth, M. (1965) Approximations in information theory. Proc. Fifth Berkeley Symposium on Mathematical Statistics and Probability (in print).
[169] Rosenblatt-Roth, M. (1965) An axiomatic approach to the differential entropy. Manuscript.
[170] Rozanov, Yu. A. (1962) On the density of one Gaussian measure with respect to another. Teor. Verojatnost. i Primenen. 82 (in Russian).
[171] Savage, J. E. (1963) Sequential decoding for an erasure channel with memory. Quarterly Progress Report No. 69, Res. Lab. of Electr., M.I.T.
[172] Sakaguchi, M. (1961) Some remarks on the capacity of a communication channel. J. Operations Res. Soc. Japan 3, 124–132.
[173] Shannon, C. (1948) A mathematical theory of communication. Bell System Tech. J. 27, 379–423, 623–656.
[174] Shannon, C. (1956) The zero error capacity of a noisy channel. IRE Trans. Information Theory 2, 8–19.
[175] Shannon, C. (1957) Certain results in coding theory for noisy channels. Information and Control 1, 6–25.
[176] Shannon, C. (1958) Channels with side information at the transmitter. IBM J. Res. Devel., 289–293.
[177] Shannon, C. (1958) A note on a partial ordering for communication channels. Information and Control 1, 390–398.
[178] Shannon, C. (1959) Probability of error for optimal codes in a Gaussian channel. Bell System Tech. J. 38, 611–655.
[179] Shannon, C. (1960) Coding theorems for a discrete source with a fidelity criterion. Information and Decision Processes, 93–126. Machol, R. E., Ed., McGraw-Hill, New York.
[180] Shannon, C. (1961) Two-way communication channels. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 611–644. University of California Press, Berkeley.
[181] Shen, Shih-Yi (1963) A necessary and sufficient condition for satisfaction of the information criterion in the Shannon theorem. Chinese Math. 3, 419–438 (English translation).
[182] Shen, Shih-Yi (1964) The fundamental problem of stationary channels. Trans. Third Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 637–639. Academic Press, New York.
[183] Sinai, Ya. (1959) On the concept of entropy for a dynamic system. Dokl. Akad. Nauk SSSR 124, 768–771 (in Russian).
[184] Stiglitz, I. G. (1963) Sequential decoding with feedback. Ph.D. Thesis, M.I.T.
[185] Strassen, V. (1964) Asymptotische Abschätzungen in Shannons Informationstheorie. Trans. 3rd Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 689–723. Academic Press, New York.
[186] Takano, K. (1957) On the basic theorems of information theory. Ann. Inst. Statist. Math. 9, 53–77.
[187] Thomasian, A. J. (1960) An elementary proof of the AEP of information theory. Ann. Math. Statist. 31, 452–456.
[188] Thomasian, A. J. (1960) Error bounds for continuous channels. Proc. 4th London Symp. Information Theory, 46–60. Cherry, C., Ed., Butterworths, Washington, D.C.
[189] Tulcea, A. I. (1960) Contributions to information theory for abstract alphabets. Arkiv för Matematik 4, 235–247.
[190] Tveberg, H. (1958) A new derivation of the information function. Math. Scand. 6, 297–298.
[191] Umegaki, H. (1962) Entropy functionals in stationary channels. Proc. Japan Acad. 38, 668–672.
[192] Umegaki, H. (1964) General treatment of alphabet-message space and integral representation of entropy. Kodai Math. Sem. Rep. 16, 18–26.
[193] Umegaki, H. (1964) A functional method for stationary channels. Kodai Math. Sem. Rep. 16, 27–39; 189–190.
[194] Weiss, L. (1960) On the strong converse of the coding theorem for symmetric channels without memory. Quart. Appl. Math. 18, 209–214.
[195] Winkelbauer, K. (1960) Communication channels with finite past history. Trans. 2nd Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 685–831. Prague.
[196] Winkelbauer, K. (1964) On discrete information sources. Trans. 3rd Prague Conf. Information Theory, Statist. Decision Functions and Random Processes, 765–830. Prague.
[197] Winograd, S. and Cowan, J. D. (1963) Reliable Computation in the Presence of Noise. M.I.T. Press.
[198] Wolfowitz, J. (1957) The coding of messages subject to chance errors. Illinois J. Math. 1, 591–606.
[199] Wolfowitz, J. (1958) The maximum achievable length of an error correcting code. Illinois J. Math. 2, 454–458.
[200] Wolfowitz, J. (1958) An upper bound on the rate of transmission of messages. Illinois J. Math. 2, 137–141.
[201] Wolfowitz, J. (1958) Information theory for mathematicians. Ann. Math. Statist. 29, 351–356.
[202] Wolfowitz, J. (1959) Strong converse of the coding theorem for semi-continuous channels. Illinois J. Math. 3, 477–489.
[203] Wolfowitz, J. (1960) Contribution to information theory. Proc. Nat. Acad. Sci. U.S.A. 46, 557–561.
[204] Wolfowitz, J. (1960) Simultaneous channels. Arch. Rational Mech. Anal. 4, 371–386.
[205] Wolfowitz, J. (1960) Strong converse of the coding theorem for the general discrete finite-memory channel. Information and Control 3, 89–93.
[206] Wolfowitz, J. (1960) On coding theorems for general simultaneous channels. Trans. IRE PGCT 4, 513–516.
[207] Wolfowitz, J. (1961) Coding Theorems of Information Theory. Ergebnisse der Mathematik und ihrer Grenzgebiete, N.F., 31. Springer-Verlag, Berlin; Prentice-Hall, Englewood Cliffs, N. J.
[208] Wolfowitz, J. (1961) A channel with infinite memory. Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability 1, 763–767. University of California Press, Berkeley.
[209] Wolfowitz, J. (1963) On channels without capacity. Information and Control 6, 49–54.
[210] Wolfowitz, J. (1963) The capacity of an indecomposable channel. Sankhya, Indian J. Statist. A25, 101–108.
[211] Wolfowitz, J. (1964) Coding Theorems of Information Theory. Springer-Verlag, Berlin.
[212] Wolfowitz, J. (1956) The method of random codes for two-way channels without feedback (manuscript).
[213] Wolfowitz, J. (1965) Approximation with a fidelity criterion (manuscript).
[214] Wozencraft, J. M. (1957) Sequential decoding for reliable communication. IRE Convention Record 5, 11–25.
[215] Wozencraft, J. M. and Reiffen, B. (1961) Sequential Decoding. M.I.T. Press and John Wiley and Sons.
[216] Yaglom, A. M. and Yaglom, I. M. (1960) Probability and Information. Fizmatgiz, Moscow.
[217] Yahya, Q. (1963) Information theory and multiple particle production. Nuovo Cimento 1, 143–150.
[218] Yoshihara, Ken-Ichi (1964) Simple proofs for the strong converse theorems in some channels. Kodai Math. Sem. Rep. 16, 213–222.
[219] Ziv, J. (1963) Successive decoding scheme for memoryless channels. IEEE Trans. Information Theory 9, 97–104.
[220] Ziv, J. (1963) Coding and decoding for time-discrete amplitude-continuous memoryless channels. Technical Report 339, Research Laboratory of Electronics, M.I.T., Cambridge, Mass.
[221] Ziv, J. (1965) Probability of decoding error for random phase and Rayleigh fading channels. IEEE Trans. Information Theory 11, 53–61.