
Entropy-Power Inequality


Given a continuous random variable $ X$ with differential entropy $ H(X)$, the entropy power of $ X$ is given by 

$\displaystyle N_e={1\over 2\pi e}e^{2H(X)}.$
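For instance, taking $ X$ Gaussian with variance $ \sigma^2$ (the standard illustrative case), one has $ H(X)={1\over 2}\ln (2\pi e\sigma^2)$, and hence

$\displaystyle N_e={1\over 2\pi e}e^{\ln (2\pi e\sigma^2)}=\sigma^2.$

Thus the entropy power of a Gaussian random variable is its variance; in general, $ N_e$ is the power (variance) of the Gaussian having the same differential entropy as $ X$, which explains the name.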

We have the following two properties.

Property 1.77. Let $ X$ and $ Y$ be independent continuous random variables with finite variance. Then the differential entropies satisfy

$\displaystyle e^{2H(X+Y)}\geq e^{2H(X)}+e^{2H(Y)},\qquad(1.14)$

with equality iff $ X$ and $ Y$ are Gaussian.
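The equality case is easy to verify directly: if $ X$ and $ Y$ are independent Gaussians with variances $ \sigma_1^2$ and $ \sigma_2^2$, then $ X+Y$ is Gaussian with variance $ \sigma_1^2+\sigma_2^2$, and since $ e^{2H(Z)}=2\pi e\sigma^2$ for a Gaussian $ Z$ of variance $ \sigma^2$,

$\displaystyle e^{2H(X+Y)}=2\pi e(\sigma_1^2+\sigma_2^2)=e^{2H(X)}+e^{2H(Y)}.$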

Property 1.78. Let $ \underline{{\bf X}}=(X_1,\ldots,X_N)$ and $ \underline{{\bf Y}}=(Y_1,\ldots,Y_N)$ be two independent continuous vector-valued random variables with finite variance. Then

$\displaystyle e^{2H(\underline{{\bf X}}+\underline{{\bf Y}})/N}\geq e^{2H(\underline{{\bf X}})/N}+e^{2H(\underline{{\bf Y}})/N}.$
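As an illustration of the vector case, take $ \underline{{\bf X}}$ and $ \underline{{\bf Y}}$ Gaussian with covariance matrices $ K_X$ and $ K_Y$ (illustrative symbols). Then $ H(\underline{{\bf X}})={1\over 2}\ln\left((2\pi e)^N\det K_X\right)$, so $ e^{2H(\underline{{\bf X}})/N}=2\pi e\left(\det K_X\right)^{1/N}$, and the above inequality reduces to Minkowski's determinant inequality

$\displaystyle \left(\det (K_X+K_Y)\right)^{1/N}\geq \left(\det K_X\right)^{1/N}+\left(\det K_Y\right)^{1/N}.$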
Note 1.8. The inequality (1.14) is known in the literature as the "entropy-power inequality". Studies of it can be found in Stam (1959) [96], Blachman (1965) [12], and Costa (1985) [28].
