
Continuous Relative Information


Let $ X$ be a continuous random variable with probability density function $ p(x)$ and an estimated probability density function $ q(x)$. The measure of relative information of $ X$ is given by

$\displaystyle D(p\vert\vert q)=\int_{I\!\!R}{p(x)\,\ln {\frac{p(x)}{q(x)}}\ dx},$     (2.11)
whenever the integral exists.
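
For instance, taking $ p$ and $ q$ to be the normal densities $ N(\mu_1,\sigma_1^2)$ and $ N(\mu_2,\sigma_2^2)$, a standard calculation of the integral in (2.11) yields

$\displaystyle D(p\vert\vert q)=\ln \frac{\sigma_2}{\sigma_1}+\frac{\sigma_1^2+(\mu_1-\mu_2)^2}{2\sigma_2^2}-\frac{1}{2},$

which is finite whenever $ \sigma_2>0$.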

We have seen in Chapter 1 that Shannon's entropy in the continuous case is not invariant under a change of variables, whereas the measure $ D(p\vert\vert q)$ is invariant.
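
Briefly, if $ y=u(x)$ is a smooth one-to-one transformation and $ p_Y$, $ q_Y$ denote the transformed densities, then $ p_Y(y)=p(x)\left\vert\frac{dx}{dy}\right\vert$ and $ q_Y(y)=q(x)\left\vert\frac{dx}{dy}\right\vert$, so the Jacobian factor cancels in the ratio $ \frac{p_Y(y)}{q_Y(y)}=\frac{p(x)}{q(x)}$, and

$\displaystyle D(p_Y\vert\vert q_Y)=\int_{I\!\!R}{p_Y(y)\,\ln {\frac{p_Y(y)}{q_Y(y)}}\ dy} =\int_{I\!\!R}{p(x)\,\ln {\frac{p(x)}{q(x)}}\ dx}=D(p\vert\vert q).$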


