IJSREG


Vol. 6, No. 1 (Open Access)

On Characterization of Joint and Conditional Exponential Survival Entropies
D. S. Hooda, D. K. Sharma
Abstract

In the present communication, the multivariate survival function of a random variable is used to define four new classes of exponential survival entropies and their particular cases. Joint and conditional exponential survival entropies for two non-negative random variables are defined and characterized. The generalized exponential survival entropies for some families of continuous distributions are also derived and studied briefly.
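
For readers new to survival-based information measures, the sketch below records the standard cumulative residual (survival) entropy and its joint analogue, together with a worked exponential case. The notation $\mathcal{E}$ and $\bar F$ is assumed here for illustration; the exponential-type functionals actually introduced in the paper are not reproduced in the abstract, so this is background only, not the authors' definition.

% Background sketch (assumed notation): the classical survival entropy,
% not the exponential variant characterized in the paper.
\[
  \mathcal{E}(X) = -\int_{0}^{\infty} \bar F(x)\,\log \bar F(x)\,dx,
  \qquad \bar F(x) = P(X > x),
\]
\[
  \mathcal{E}(X,Y) = -\int_{0}^{\infty}\!\int_{0}^{\infty}
      \bar F(x,y)\,\log \bar F(x,y)\,dx\,dy,
  \qquad \bar F(x,y) = P(X > x,\; Y > y).
\]
% Worked example: for X ~ Exp(lambda), \bar F(x) = e^{-\lambda x}, hence
\[
  \mathcal{E}(X) = \lambda \int_{0}^{\infty} x\, e^{-\lambda x}\,dx = \frac{1}{\lambda}.
\]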

ISSN(P) 2350-0174

ISSN(O) 2456-2378
