Bivariate distributions as saddle points of mutual information
Published online by Cambridge University Press: 14 July 2016
Abstract
Fix a bivariate distribution F on X × Y, considered as a pair (α, {Fx}), where α is a marginal distribution on X and {Fx} is a collection of conditional distributions on Y. For essentially every (β, {Gx}) satisfying a certain pair of moment conditions determined by (α, {Fx}), J(β, {Fx}) ≦ J(α, {Fx}) ≦ J(α, {Gx}), where J is mutual information. This relates to two sorts of extremizations of mutual information of relevance to communication theory and statistics.
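The saddle-point behaviour stated above is made plausible by two standard facts (not proved in this abstract): mutual information is concave in the marginal when the conditionals are held fixed, and convex in the conditionals when the marginal is held fixed. A minimal numerical sketch of those two facts, in the discrete case; the function names and the random test distributions are illustrative, not from the paper.

```python
import numpy as np

def mutual_information(p, W):
    """Mutual information J in nats for a marginal p (length m) and a
    conditional-distribution matrix W (m x n), W[x, y] = Pr(Y = y | X = x)."""
    q = p @ W                       # marginal on Y
    joint = p[:, None] * W          # bivariate distribution on X x Y
    mask = joint > 0
    ratio = joint[mask] / (p[:, None] * q[None, :])[mask]
    return float(np.sum(joint[mask] * np.log(ratio)))

rng = np.random.default_rng(0)

def random_simplex(n):
    v = rng.random(n)
    return v / v.sum()

def random_channel(m, n):
    W = rng.random((m, n))
    return W / W.sum(axis=1, keepdims=True)

m, n, t = 3, 4, 0.3
W = random_channel(m, n)
p1, p2 = random_simplex(m), random_simplex(m)
# Concavity in the marginal: J(t p1 + (1-t) p2, W) >= t J(p1, W) + (1-t) J(p2, W)
assert mutual_information(t * p1 + (1 - t) * p2, W) >= \
    t * mutual_information(p1, W) + (1 - t) * mutual_information(p2, W) - 1e-12

p = random_simplex(m)
W1, W2 = random_channel(m, n), random_channel(m, n)
# Convexity in the conditionals: J(p, t W1 + (1-t) W2) <= t J(p, W1) + (1-t) J(p, W2)
assert mutual_information(p, t * W1 + (1 - t) * W2) <= \
    t * mutual_information(p, W1) + (1 - t) * mutual_information(p, W2) + 1e-12
```

Concavity in one argument and convexity in the other is exactly the shape that admits a saddle point; the paper's contribution is identifying the moment conditions under which the fixed pair (α, {Fx}) is such a point.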
Keywords: Information
Type: Research Papers
Copyright © Applied Probability Trust 1978