
INEQUALITIES FOR THE DEPENDENT GAUSSIAN NOISE CHANNELS BASED ON FISHER INFORMATION AND COPULAS

Published online by Cambridge University Press: 07 February 2019

Fatemeh Asgari
Affiliation: Department of Statistics, University of Isfahan, Isfahan, Iran
E-mail: [email protected] and [email protected]

Mohammad Hossein Alamatsaz
Affiliation: Department of Statistics, University of Isfahan, Isfahan, Iran
E-mail: [email protected] and [email protected]

Nayereh Bagheri Khoolenjani
Affiliation: Department of Statistics, University of Isfahan, Isfahan, Iran
E-mail: [email protected] and [email protected]

Abstract

Considering the Gaussian noise channel, Costa [4] investigated the concavity of the entropy power when the input signal and the noise are independent; his argument relied on the first-order derivative of the Fisher information. In practice, however, the noise can be highly dependent on the input signal. In this paper, we therefore allow the input signal and the noise to be dependent, and we use several well-known copula functions to model their dependence structure. We derive the first- and second-order derivatives of the Fisher information of the model and use them to generalize two inequalities, one based on the Fisher information itself and one based on a functional closely associated with it, to the dependent case. We also show that the known results for the independent case are recovered as special cases of our results. Several applications illustrate the usefulness of our results. Finally, we study the channel capacity of the Gaussian noise channel with dependent signal and noise.
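As a numerical illustration of the setting described above (this is a sketch, not code from the paper), the following Python snippet estimates the Fisher information of the channel output Y_t = X + sqrt(t)·Z when a standard-normal input X and standard-normal noise Z are coupled through a Gaussian copula with parameter rho; rho = 0 recovers the independent case of Costa [4]. The Gaussian-copula choice, the standard-normal marginals, the value t = 1, the rho values, and the grid sizes are all illustrative assumptions.

```python
# Minimal sketch (not from the paper): estimate the Fisher information
# J(Y_t) of the output Y_t = X + sqrt(t) * Z when the standard-normal
# input X and noise Z are coupled by a Gaussian copula with parameter rho.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

def output_density(y_grid, t, rho, n=801, lim=8.0):
    """Density of Y_t = X + sqrt(t) Z. With standard-normal marginals the
    Gaussian copula makes (X, Z) bivariate normal with correlation rho, so
    f_{X,Z}(x, z) = c(F(x), F(z); rho) phi(x) phi(z) is just the bivariate
    normal density; we integrate it along the line x + sqrt(t) z = y."""
    joint = stats.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]])
    x = np.linspace(-lim, lim, n)
    f = np.empty_like(y_grid)
    for i, y in enumerate(y_grid):
        z = (y - x) / np.sqrt(t)                   # solve x + sqrt(t) z = y
        vals = joint.pdf(np.stack([x, z], axis=-1)) / np.sqrt(t)
        f[i] = trapezoid(vals, x)
    return f

def fisher_information(y_grid, f):
    """J(Y) = integral of f'(y)^2 / f(y) dy, by finite differences."""
    fp = np.gradient(f, y_grid)
    keep = f > 1e-12                               # guard the far tails
    return trapezoid(fp[keep] ** 2 / f[keep], y_grid[keep])

y = np.linspace(-10.0, 10.0, 1201)
for rho in (0.0, 0.5):                             # independent vs dependent
    J = fisher_information(y, output_density(y, t=1.0, rho=rho))
    print(f"rho = {rho:3.1f}:  J(Y_1) ~= {J:.4f}")
# Sanity check: here Y_1 is N(0, 2 + 2*rho), so J(Y_1) = 1/(2 + 2*rho),
# i.e., 0.5 for rho = 0 and 1/3 for rho = 0.5.
```

Under these assumptions the dependence simply inflates the output variance and hence lowers the Fisher information; for other copulas or marginals only the joint-density line in output_density would change.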

Type
Research Article
Copyright
Copyright © Cambridge University Press 2019 


References

1. Arias-Nicolás, J.P., Fernández-Ponce, J.M., Luque-Calvo, P. & Suárez-Llorens, A. (2005). Multivariate dispersion order and the notion of copula applied to the multivariate t-distribution. Probability in the Engineering and Informational Sciences 19(3): 363–375.
2. Blachman, N. (1965). The convolution inequality for entropy powers. IEEE Transactions on Information Theory 11(2): 267–271.
3. Cheng, F. & Geng, Y. (2015). Higher order derivatives in Costa's entropy power inequality. IEEE Transactions on Information Theory 61(11): 5892–5905.
4. Costa, M. (1985). A new entropy power inequality. IEEE Transactions on Information Theory 31(6): 751–760.
5. Cover, T.M. & Thomas, J.A. (2006). Elements of information theory. 2nd ed. New York: Wiley.
6. Feller, W. (1968). An introduction to probability theory and its applications. Vol. 1, 3rd ed. New York: Wiley.
7. Fink, A.M. (1982). Kolmogorov–Landau inequalities for monotone functions. Journal of Mathematical Analysis and Applications 90(1): 251–258.
8. Frieden, B.R. (2004). Science from Fisher information: a unification. Cambridge: Cambridge University Press.
9. Joe, H. (1997). Multivariate models and dependence concepts. In Cox, D.R. (ed.), Monographs on statistics and applied probability. Vol. 73. London: Chapman & Hall.
10. Johnson, O. (2004). A conditional entropy power inequality for dependent variables. IEEE Transactions on Information Theory 50(8): 1581–1583.
11. Kay, S. (2009). Waveform design for multistatic radar detection. IEEE Transactions on Aerospace and Electronic Systems 45(3): 1153–1166.
12. Khoolenjani, N.B. & Alamatsaz, M.H. (2016). A de Bruijn's identity for dependent random variables based on copula theory. Probability in the Engineering and Informational Sciences 30(1): 125–140.
13. Lehmann, E.L. & Casella, G. (1998). Theory of point estimation. 2nd ed. New York: Springer.
14. Rao, C.R. (1946). Information and the accuracy attainable in the estimation of statistical parameters. In Kotz, S. & Johnson, N.L. (eds.), Breakthroughs in statistics. New York: Springer, pp. 235–247.
15. Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal 27: 623–656.
16. Sklar, A. (1959). Fonctions de répartition à n dimensions et leurs marges. Publications de l'Institut de Statistique de l'Université de Paris 8: 229–231.
17. Stam, A.J. (1959). Some inequalities satisfied by the quantities of information of Fisher and Shannon. Information and Control 2(2): 101–112.
18. Takano, S. (1995). The inequalities of Fisher information and entropy power for dependent variables. In Watanabe, S., Fukushima, M., Prohorov, Yu.V. & Shiryaev, A.N. (eds.), Proceedings of the 7th Japan–Russia Symposium on Probability Theory and Mathematical Statistics, pp. 460–470.
19. Toscani, G. (2015). A strengthened entropy power inequality for log-concave densities. IEEE Transactions on Information Theory 61(12): 6550–6559.
20. Verdú, S. & Guo, D. (2006). A simple proof of the entropy-power inequality. IEEE Transactions on Information Theory 52(5): 2165–2166.
21. Villani, C. (2000). A short proof of the concavity of entropy power. IEEE Transactions on Information Theory 46(4): 1695–1696.
22. Weingarten, H., Steinberg, Y. & Shamai, S. (2006). The capacity region of the Gaussian multiple-input multiple-output broadcast channel. IEEE Transactions on Information Theory 52(9): 3936–3964.
23. Zamir, R. (1998). A proof of the Fisher information inequality via a data processing argument. IEEE Transactions on Information Theory 44(3): 1246–1250.
24. Zhang, X., Anantharam, V. & Geng, Y. (2018). Gaussian optimality for derivatives of differential entropy using linear matrix inequalities. Entropy 20(3): 182.