
Minimum dynamic discrimination information models

Published online by Cambridge University Press: 14 July 2016

Majid Asadi*
Affiliation: University of Isfahan
Nader Ebrahimi**
Affiliation: Northern Illinois University
G. G. Hamedani***
Affiliation: Marquette University
Ehsan S. Soofi****
Affiliation: University of Wisconsin-Milwaukee

*Postal address: Department of Statistics, University of Isfahan, Isfahan, 81744, Iran. Email address: [email protected]
**Postal address: Division of Statistics, Northern Illinois University, DeKalb, IL 60155, USA. Email address: [email protected]
***Postal address: Department of Mathematics, Statistics and Computer Science, Marquette University, PO Box 1881, Milwaukee, WI 53201-1881, USA. Email address: [email protected]
****Postal address: School of Business Administration, University of Wisconsin-Milwaukee, PO Box 741, Milwaukee, WI 53201, USA. Email address: [email protected]

Abstract


In this paper, we introduce the minimum dynamic discrimination information (MDDI) approach to probability modeling. The MDDI model relative to a given distribution G is the distribution with the least Kullback-Leibler information discrepancy relative to G among all distributions satisfying information constraints given in terms of residual moment inequalities, residual moment growth inequalities, or hazard rate growth inequalities. Our results lead to MDDI characterizations of many well-known lifetime models and to the development of some new models. The dynamic information constraints that characterize these models are tabulated. A result for characterizing distributions based on dynamic Rényi information divergence is also given.
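For orientation, the following is a minimal sketch of the quantity being minimized, using the standard definition of the dynamic (residual) Kullback-Leibler divergence; the symbols here (f and g for the densities, \bar{F} and \bar{G} for the survival functions, \mathcal{F} for the constrained family) are conventional choices on our part rather than the paper's exact notation. The discrimination information between the residual lifetime distributions of F and G at age t is

\[
  K(F, G; t) = \int_{t}^{\infty} \frac{f(x)}{\bar{F}(t)}
    \log \frac{f(x)/\bar{F}(t)}{g(x)/\bar{G}(t)} \, \mathrm{d}x ,
\]

and, loosely, the MDDI model relative to G is a minimizer

\[
  F^{*} = \arg\min_{F \in \mathcal{F}} K(F, G; t),
\]

where \mathcal{F} denotes the set of distributions satisfying the stated residual moment or hazard rate growth constraints.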

Type: Research Papers
Copyright: © Applied Probability Trust 2005
