
Entropy Series


This subsection deals with a measure of entropy commonly referred to as the entropy series, in which the probability distribution has infinitely many terms ($ n \to \infty$). This quantity is given by

$\displaystyle H(P_{\infty})=-\sum_{n=1}^{\infty}{p_n\log p_n},$
where $ P_{\infty}=(p_1,p_2,...,p_n,...)$ with $ p_n>0$ and $ \sum_{n=1}^{\infty}{p_n}=1$.

Let $ a_n={1\over n\log n}$ and $ b_n={1\over n(\log n)^2}, \ n \geq 2$, be two sequences. It can easily be checked that the series $ \sum_{n=2}^{\infty}{a_n}$ diverges, while the series $ \sum_{n=2}^{\infty}{b_n}$ converges. Let $ \theta=\sum_{n=2}^{\infty}{b_n}$ be the sum of the latter series and consider $ p_n=b_n/\theta$. Then

$\displaystyle -p_n \log p_n ={1\over \theta}\left({1\over n\log n}+{2\log \log n\over n(\log n)^2}+{\log \theta\over n(\log n)^2}\right), \quad n\geq 2.$
Since the series $ \sum_{n=2}^{\infty}{a_n}$ diverges, the first term forces $ H(P_{\infty})$ to be infinite. In order that the entropy series converge, some restriction is needed. If there is a convergent series of positive terms $ \sum_{n=1}^{\infty}{q_n}$ such that $ \sum_{n=1}^{\infty}{p_n\log {1\over q_n}}$ also converges, then by use of the inequality (1.9) we get the following bound:
$\displaystyle H(P_{\infty}) \leq \log \Big(\sum_{n=1}^{\infty}{q_n}\Big)+\sum_{n=1}^{\infty}{p_n \log {1\over q_n}}.$
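As a numerical sanity check of this example, the following sketch (a truncation at $ N$ terms; the cut points, the truncation level, and the use of natural logarithms are all illustrative assumptions) shows the partial sums of $ -p_n\log p_n$ creeping upward, roughly like $ \log\log N$:

import math

# p_n proportional to 1/(n (log n)^2), n >= 2: the normalizing sum
# theta is finite, but the entropy series itself diverges.
N = 10**6
weights = [1.0 / (n * math.log(n) ** 2) for n in range(2, N + 1)]
theta = math.fsum(weights)
p = [w / theta for w in weights]

for cut in (10**2, 10**3, 10**4, 10**5, 10**6):
    H = -math.fsum(q * math.log(q) for q in p[: cut - 1])
    print(f"N = {cut:>7}   partial entropy = {H:.4f} nats")
# The partial sums never settle: H(P_inf) is infinite here.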
The following properties give sharper conditions under which the entropy series $ H(P_{\infty})$ can be bounded.

Property 1.64. If for some $ \varepsilon > 0$, $ \sum_{n=1}^{\infty}{p_n\, n^{\varepsilon}}<\infty$, then $ H(P_{\infty})<\infty.$

Property 1.65. For each nonincreasing probability sequence $ \{p_n\}$, i.e., $ p_n\geq p_{n+1}, \ \forall \ n$, the entropy series $ H(P_{\infty})$ converges iff the series

$\displaystyle E_p(\log)=\sum_{n=1}^{\infty}{p_n \log n}$
converges. (The example above, with $ p_n=b_n/\theta$, is precisely a case where $ E_p(\log)$ diverges, since $ p_n\log n=a_n/\theta$.)
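By contrast, a tail that decays just a little faster makes $ E_p(\log)$, and hence the entropy series, finite. A minimal numerical sketch of Property 1.65 (truncation level, base-2 logarithms, and the choice of tail are assumptions made here):

import math

# p_n proportional to 1/(n (log n)^4), n >= 2: now E_p(log) behaves
# like sum 1/(n (log n)^3), which converges, so H(P_inf) is finite.
N = 10**5
weights = [1.0 / (n * math.log(n) ** 4) for n in range(2, N + 1)]
Z = math.fsum(weights)
p = [w / Z for w in weights]
H = -math.fsum(q * math.log2(q) for q in p)
E = math.fsum(q * math.log2(n) for n, q in zip(range(2, N + 1), p))
print(f"E_p(log) ~ {E:.4f} bits,  H(P_inf) ~ {H:.4f} bits")
# Increasing N barely changes either value, in line with Property 1.65.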

Property 1.66. For each nonincreasing probability sequence $ \{p_n\}$, the following bound on the entropy series holds:

$\displaystyle H(P_{\infty}) \leq \beta\, \lambda_0+\log\Big(\sum_{n=1}^{\infty}{ 2^{-\lambda_0 a_n}}\Big), $
where $ \{a_n\}_{n\geq 1}$ is a nonnegative, nondecreasing sequence of numbers with the property that for some $ \lambda>0,\ \sum_{n=1}^{\infty}{2^{-\lambda a_n} } < \infty$,
$\displaystyle \lambda_0 =\inf\left\{ \lambda \,\left\vert\, \sum_{n=1}^{\infty}{2^{-\lambda a_n} }< \infty \right.\right\},$
and 
$\displaystyle \max\left\{\sum_{n=1}^{\infty}{a_n p_n},\ \lim_{\lambda\to \lambda_0}{\sum_{n=1}^{\infty}{a_n 2^{-\lambda a_n}} \over \sum_{n=1}^{\infty}{2^{-\lambda a_n}} }\right\} \leq \beta <\infty.$



Property 1.67. For each nonincreasing probability sequence $ \{p_n\}_{n\geq1}$, the following bound on the entropy series holds:

$\displaystyle H(P_{\infty})\leq E_p(\log) +\sqrt{E_p(\log)}+ \log\Big(1+\sqrt{E_p(\log)}\Big),$
where $ E_p(\log)$ is as given in Property 1.65.
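The bound can be checked numerically; the sketch below uses the distribution $ p_n = 2^{-n},\ n\geq 1$ (an assumption made here for concreteness, together with base-2 logarithms), for which $ H(P_{\infty})=2$ bits exactly:

import math

# Check of Property 1.67 on the nonincreasing sequence p_n = 2^(-n).
N = 200                                   # truncation; tail is negligible
p = [2.0**-n for n in range(1, N + 1)]
H = -sum(q * math.log2(q) for q in p)     # equals 2 bits
E = sum(q * math.log2(n) for n, q in zip(range(1, N + 1), p))
bound = E + math.sqrt(E) + math.log2(1 + math.sqrt(E))
print(f"H = {H:.4f}  <=  bound = {bound:.4f}")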

Property 1.68. (Maximum-Entropy Principle). The entropy series $ H(P_{\infty})$ under the constraints

(a) $ p_n\geq 0,\ \ \sum_{n=1}^{\infty}{p_n}=1$;
(b) $ \sum_{n=1}^{\infty}{p_n g_k(x_n)}=\overline{g}_k,\ k=1,2,...,\vartheta$,
is maximized when
$\displaystyle p_n=\exp\Big(-\lambda_0-\sum_{k=1}^{\vartheta}{\lambda_k g_k(x_n)}\Big),\ n=1,2,...$
and the maximum value is given by
$\displaystyle H(P_{\infty})\vert _{max}=\lambda_0+\sum_{k=1}^{\vartheta}{\lambda_k\overline{g}_k},$
where $ \lambda_0,\lambda_1,...,\lambda_{\vartheta}$ are constants determined by the constraints (a) and (b).

Note 1.6. Using the condition $ \sum_{n=1}^{\infty}{p_n}=1$, we get

$\displaystyle p_n ={\exp\Big(-\sum_{k=1}^{\vartheta}{\lambda_k g_k(x_n)}\Big)\over \sum_{n=1}^{\infty}{\exp\Big(-\sum_{k=1}^{\vartheta}{\lambda_k g_k(x_n)}\Big)}},\,\,\,n=1,2,\cdots.$
    (1.12)

By applying the conditions $ \sum_{n=1}^{\infty}{p_n g_k(x_n)}=\overline{g}_k,\ k=1,2,...,\vartheta$, we can calculate the constants $ \lambda_1,\lambda_2,..., \lambda_{\vartheta}$. The probability distribution (1.12) is known in the literature as the ``Gibbs distribution''.

Particular cases: Property 1.68 admits the following interesting particular cases.

(i) The probability distribution $ p_i\geq 0$, $ \sum_{i=1}^n{p_i}=1$, that maximizes the corresponding entropy $ H(P)=-\sum_{i=1}^n{p_i\log p_i}$ is the uniform distribution given by $ p_i={1\over n}, \ \forall \ i=1,2,...,n$.
(ii) The probability distribution $ p_n\geq 0$, $ \sum_{n=0}^{\infty}{p_n}=1$, that maximizes the corresponding entropy series $ H(P_{\infty})=-\sum_{n=0}^{\infty}{p_n \log p_n}$ subject to the constraint $ \sum_{n=0}^{\infty}{np_n}=m$ is the geometric distribution given by $ p_n=(1-b)b^n, \ \ n=0,1,2,...$, where $ b$ can be obtained by using the given constraint (see the sketch after this list).
(iii) The probability distribution $ p_n\geq 0$, $ n \in Z$ (the integers), $ \sum_{n \in Z}{p_n}=1$, that maximizes the corresponding entropy $ H(P_{\infty})=-\sum_{n \in Z}{p_n \log p_n}$ subject to the constraints $ \sum_{n \in Z}{np_n}=m$ and $ \sum_{n \in Z}{n^2p_n}=m^2+\sigma^2$ is the discrete normal distribution given by $ p_n= \exp(-\alpha-\beta n -\gamma n^2)$, where $ \alpha,\ \beta$ and $ \gamma$ can be obtained by using the given constraints.
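For case (ii), the constraint can be solved in closed form: the mean of the geometric distribution is $ \sum_{n=0}^{\infty}{n(1-b)b^n}=b/(1-b)$, so $ b=m/(1+m)$. A small sketch (the value $ m=3$, the truncation level, and base-2 logarithms are arbitrary choices made here):

import math

# Max-entropy distribution on n = 0, 1, 2, ... with mean m:
# geometric p_n = (1-b) b^n with b = m/(1+m).
m = 3.0
b = m / (1 + m)
N = 2000                                  # truncation for the checks
p = [(1 - b) * b**n for n in range(N)]
mean = sum(n * q for n, q in enumerate(p))
H = -sum(q * math.log2(q) for q in p)
# Closed form of the maximum: (m+1) log(m+1) - m log(m).
H_max = (m + 1) * math.log2(m + 1) - m * math.log2(m)
print(f"b = {b},  mean = {mean:.6f} (target {m})")
print(f"H = {H:.6f} bits,  closed form = {H_max:.6f} bits")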
Property 1.69. Let $ \{a_1,a_2,...\}$ be a sequence of real numbers with the property that for some $ \mu$, $ \sum_{n=1}^{\infty}{2^{-\mu a_n}}<\infty$. Then the following bound on the entropy series holds:
$\displaystyle H(P_{\infty}) \leq \mu\Big(\sum_{n=1}^{\infty}{p_na_n}\Big)+ \log\Big(\sum_{n=1}^{\infty}{2^{-\mu a_n}}\Big).$
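For instance, with the (illustrative) choice $ a_n=\log n$ and $ \mu=2$ we have $ \sum_{n=1}^{\infty}{2^{-\mu a_n}}=\sum_{n=1}^{\infty}{n^{-2}}=\zeta(2)$ for base-2 logarithms, so the bound reads $ H(P_{\infty})\leq 2E_p(\log)+\log \zeta(2)$, the same shape as Property 1.71 below. A quick check on the assumed distribution $ p_n=2^{-n}$:

import math

# Property 1.69 with a_n = log2 n, mu = 2, on p_n = 2^(-n), n >= 1.
N = 200
p = [2.0**-n for n in range(1, N + 1)]
H = -sum(q * math.log2(q) for q in p)     # 2 bits
E = sum(q * math.log2(n) for n, q in zip(range(1, N + 1), p))
zeta2 = math.pi**2 / 6                    # sum of n^(-2)
print(f"H = {H:.4f}  <=  {2 * E + math.log2(zeta2):.4f}")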



Property 1.70. The following bound on the entropy series holds: 

$\displaystyle H(P_{\infty})\leq E_p\big(\lfloor\log\rfloor\big)+\Big(E_p\big(\lfloor\log\rfloor\big)+1\Big)\,\psi \Big({1\over E_p\big(\lfloor\log\rfloor\big)+1} \Big),$
where $ \psi(p)$ is as given in (1.8) and $ E_p\big(\lfloor\log\rfloor\big)=\sum_{n=1}^{\infty}{p_n\lfloor\log n\rfloor}$.

The following properties are also worth emphasizing; details can be found in Capocelli et al. (1988a,b) [25], [26].

Property 1.71. The following bound on the entropy series holds:

$\displaystyle H(P_{\infty})\leq \mu E_p(\log) +\log\zeta(\mu),$
where $ \zeta(\mu)$ is the Riemann zeta function and $ \mu$ is the unique solution of the equation
$\displaystyle E_p(\log)=\sum_{k=1}^{\infty}{{\Delta(k)\over k^{\mu}}},$
where $ \Delta(k)$ (the von Mangoldt function) is defined as $ \ln p$ when $ k$ is a power of a prime $ p$, and $ 0$ otherwise.

Property 1.72. The following bound on the entropy series holds:

$\displaystyle H(P_{\infty})\leq E_p(\log)\big[1+\psi\big({1\over E_p(\log)}\big)\big]+\beta,$
where $ \beta$ is a bounded function of $ E_p(\log)$: $ -1\leq\beta \leq \Delta_p-1$, with
$\displaystyle \Delta_{p} = \left\{ \begin{array}{ll} 3, & 1\leq E_p(\log) < \ldots \\ \ldots+\psi\big({1\over E_p(\log)} \big), & \mbox{otherwise.}\end{array}\right. $



Property 1.73. For each nonincreasing probability distribution $ P_{\infty}$ and for each $ k \in I\!\!N^+$, the following bound on the entropy series holds:

$\displaystyle H(P_{\infty})\leq\sum_{i=0}^k{\log^i E_p(\log)}+\log^k E_p(\log)+c_k,$
where
$\displaystyle c_k=\log \Big[ \sum_{n=1}^{\infty}{\exp_2\Big(-\big(\log^{k+1} n+\sum_{i=1}^{k+1}{\log^i n}\big)\Big)}\Big]+0.766k+0.531$
is a constant independent of the probability distribution $ P_{\infty}$, and for $ x\geq 0$ and $ i \in I\!\!N^+$, we have
$\displaystyle \log^0 x=x$
and
$\displaystyle \log^i x=\left\{ \begin{array}{ll}\log(\log^{i-1} x),& \log^{i-1} x \geq 1 \\  0, & \mbox {otherwise}\end{array}\right.$
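A direct transcription of this iterated logarithm (base-2 logarithms are assumed here, a choice not fixed by the text):

import math

def log_iter(i, x):
    # log^0 x = x;  log^i x = log(log^(i-1) x) if log^(i-1) x >= 1, else 0
    if i == 0:
        return x
    prev = log_iter(i - 1, x)
    return math.log2(prev) if prev >= 1.0 else 0.0

# log^i 65536 for i = 0,...,4: 65536, 16, 4, 2, 1
print([log_iter(i, 2.0**16) for i in range(5)])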



Property 1.74. For each nonincreasing probability distribution $ P_{\infty}$, the following bound on the entropy series holds: 

$\displaystyle H(P_{\infty}) <E_p(\log)+\log^*\big(E_p(\log)\big)+2+\log 2.87,$
where
$\displaystyle \log^* x=\left\{ \begin{array}{ll}0, & 0\leq x\leq 1 \\  \log x+\log^*(\log x), & x> 1\end{array}\right. $
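And a transcription of $ \log^* x$ under the same base-2 assumption; the recursion terminates because repeated logarithms eventually fall below 1:

import math

def log_star(x):
    # log* x = 0 for 0 <= x <= 1;  log x + log*(log x) for x > 1
    if x <= 1.0:
        return 0.0
    lx = math.log2(x)
    return lx + log_star(lx)

print(log_star(2.0**16))   # 16 + 4 + 2 + 1 = 23.0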



Property 1.75. For each nonincreasing probability distribution $ P_{\infty}$ and for each $ k \in I\!\!N^+$, the following limits hold:

(i) $ \lim_{E_p(\log)\to \infty}(A-B)=\infty,$
(ii) $ \lim_{E_p(\log)\to \infty}(B-C_1)=\infty,$
(iii) $ \lim_{E_p(\log)\to \infty}(C_k-C_{k+1})=\infty,$
(iv) $ \lim_{E_p(\log)\to \infty}(C_k-D)=\infty,$
where 
$\displaystyle A=E_p(\log)+2\big(1+\sqrt{E_p(\log)}\big),$
$\displaystyle B=E_p(\log)+\sqrt{E_p(\log)}+\log\big(\sqrt{E_p(\log)}+1\big),$
$\displaystyle C_k=\sum_{i=0}^k{\log^i E_p(\log)}+\log^k E_p(\log)+c_k,\ k \in I\!\!N^+,$
and 
$\displaystyle D=E_p(\log)+\log^*E_p(\log)+2+\log 2.87.$
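Limit (i), for example, is elementary: $ A-B=2+\sqrt{E_p(\log)}-\log\big(\sqrt{E_p(\log)}+1\big)$, which grows without bound. A numerical sketch (the sample values of $ E_p(\log)$ and base-2 logarithms are assumptions):

import math

# Property 1.75(i): the gap A - B diverges as E_p(log) grows.
for E in (10.0, 1e2, 1e4, 1e8):
    A = E + 2 * (1 + math.sqrt(E))
    B = E + math.sqrt(E) + math.log2(math.sqrt(E) + 1)
    print(f"E_p(log) = {E:>12g}   A - B = {A - B:.3f}")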
Note 1.7. Good references for Sections 1.4, 1.5, 1.6 and 1.7 are the books by Csiszár and Körner (1981) [30], Guiasu (1977) [41], and McEliece (1977) [72].