
Introduction


Kullback and Leibler (1951) [65] introduced a measure of information associated with two probability distributions of a discrete random variable. At the same time, they also developed the idea of Jeffreys' (1946) [49] invariant. Sibson (1969) [95] studied the information radius, generally referred to as the Jensen difference divergence measure. Taneja (1995) [108] presented a new divergence measure, the arithmetic and geometric mean divergence measure. On the other side, Kerridge (1961) [63] studied an expression similar to Shannon's entropy associated with two probability distributions; this measure is generally referred to as inaccuracy. In recent years, researchers have been interested in one- and two-scalar parametric generalizations of the above classical measures of information. Four of these measures have found deep applications in statistics, while the fifth one is new. Some continuous extensions are also studied, and divergence measures such as the Bhattacharyya distance and the variational distance are also considered.

Unless otherwise specified, it is understood that $ 0\log 0=0\log{0\over 0}=0$, and that $ P=(p_1,p_2,...,p_n)\in\Delta_n$ and $ Q=(q_1,q_2,...,q_n)\in\Delta_n$. If $ q_i=0$ for some $ i$, the corresponding $ p_i$ is also zero. All logarithms are taken to base 2.
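For concreteness, the following minimal sketch (not part of the original text) illustrates how these conventions are applied when evaluating the Kullback-Leibler measure numerically: logarithms are taken to base 2, $ 0\log 0=0\log{0\over 0}=0$, and $ q_i=0$ is assumed to force $ p_i=0$. The function name kl_divergence and the choice of Python are illustrative assumptions only.

from math import log2

def kl_divergence(P, Q):
    """Kullback-Leibler measure  sum_i p_i log2(p_i / q_i)  for P, Q in Delta_n."""
    total = 0.0
    for p, q in zip(P, Q):
        if p == 0:
            continue              # convention: 0 log 0 = 0 log(0/0) = 0
        # here q > 0 is assumed, since q_i = 0 would force p_i = 0
        total += p * log2(p / q)
    return total

# Example on a three-point space: contributes 0.5*log2(2) + 0.5*log2(1) + 0
P = (0.5, 0.5, 0.0)
Q = (0.25, 0.5, 0.25)
print(kl_divergence(P, Q))        # 0.5 (bits, since logarithms are to base 2)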
 

