
Entropy of Degree s and Degree (r,s)

For operational purposes, it seems more natural to consider the scalar expression $ \sum_{i=1}^n{p^r_i}$ as an information measure instead of Rényi's entropy of order $ r$. Accordingly, Havrda and Charvát (1967) [46] proposed the following entropy of degree $ s$:

$\displaystyle H^s(P)=(2^{1-s}-1)^{-1}\bigg[\sum_{i=1}^n{p^s_i}-1\bigg],\ s\neq1,\ s>0,$
    (3.3)

for all $ P=(p_1,p_2,...,p_n)\ \in\ \Delta_n$. In this case too, we can easily verify that $ \lim_{s\to 1}{H^s(P)}=H(P)$.
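This convergence is easy to check numerically. The following sketch (not part of the original text; function names are illustrative) evaluates (3.3) for values of $ s$ approaching 1 and compares the result with Shannon's entropy:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) in bits: -sum p_i log2 p_i."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def havrda_charvat(p, s):
    """Entropy of degree s, eq. (3.3): (2^{1-s}-1)^{-1} [sum p_i^s - 1]."""
    assert s > 0 and s != 1
    return (sum(pi ** s for pi in p) - 1) / (2 ** (1 - s) - 1)

P = (0.5, 0.3, 0.2)
# As s -> 1, H^s(P) approaches H(P).
for s in (1.1, 1.01, 1.001):
    print(s, havrda_charvat(P, s))
print("Shannon:", shannon_entropy(P))
```

The direct evaluation at $ s=1$ is excluded (the coefficient $ (2^{1-s}-1)^{-1}$ blows up), which is why the limit statement is needed.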

This quantity admits a simpler characterization (Havrda and Charvát, 1967) [46]. Daróczy (1970) [33] gave an alternative approach to characterizing it (see Section 3.5.1).

Sharma and Taneja (1975; 1977) [92], [93] studied a generalization of $ H^s(P)$ involving two scalar parameters, known as the entropy of degree $ (r,s)$, given by

$\displaystyle H^{r,s}(P)={(2^{1-r}-2^{1-s})}^{-1}\sum_{i=1}^n{(p^r_i-p^s_i)},\ r\neq s,\ r>0,\ s>0$
    (3.4)

for all $ P=(p_1,p_2,...,p_n)\ \in\ \Delta_n$, where $ r$ and $ s$ are real parameters.
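The reduction of (3.4) to (3.3) at $ r=1$ can also be verified directly: with $ r=1$ the sum becomes $ \sum(p_i-p^s_i)=1-\sum p^s_i$ and the coefficient becomes $ (1-2^{1-s})^{-1}$, so the two sign changes cancel. A short numerical check (illustrative names, not from the original text):

```python
def entropy_rs(p, r, s):
    """Entropy of degree (r, s), eq. (3.4): (2^{1-r}-2^{1-s})^{-1} sum(p_i^r - p_i^s)."""
    assert r > 0 and s > 0 and r != s
    return sum(pi ** r - pi ** s for pi in p) / (2 ** (1 - r) - 2 ** (1 - s))

def havrda_charvat(p, s):
    """Entropy of degree s, eq. (3.3)."""
    return (sum(pi ** s for pi in p) - 1) / (2 ** (1 - s) - 1)

P = (0.5, 0.3, 0.2)
# Setting r = 1 collapses (3.4) to (3.3); the two agree to machine precision.
print(entropy_rs(P, 1, 2), havrda_charvat(P, 2))
```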

In particular, when $ r=1$ or $ s=1$, the measure (3.4) reduces to (3.3). In the limiting case, we have 

$\displaystyle \lim_{s\to r}{H^{r,s}(P)}=-2^{r-1}\ \sum_{i=1}^n{p^r_i\ \log{p_i}}, \ r>0,$
It reduces to Shannon's entropy for $ r=1$.
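The limiting case follows from L'Hôpital's rule in $ s$ and can be checked numerically by taking $ s$ close to $ r$. The sketch below (illustrative names, not from the original text) compares (3.4) with the closed-form limit, with logarithms to base 2:

```python
import math

def entropy_rs(p, r, s):
    """Entropy of degree (r, s), eq. (3.4)."""
    return sum(pi ** r - pi ** s for pi in p) / (2 ** (1 - r) - 2 ** (1 - s))

def limit_rs(p, r):
    """Limit of (3.4) as s -> r: -2^{r-1} sum p_i^r log2 p_i."""
    return -(2 ** (r - 1)) * sum(pi ** r * math.log2(pi) for pi in p if pi > 0)

P = (0.5, 0.3, 0.2)
r = 2.0
for eps in (1e-3, 1e-5):
    print(entropy_rs(P, r, r + eps))  # approaches limit_rs(P, r)
print(limit_rs(P, r))
# At r = 1 the limit is -sum p_i log2 p_i, i.e. Shannon's entropy.
```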
 

21-06-2001
Inder Jeet Taneja
Departamento de Matemática - UFSC
88.040-900 Florianópolis, SC - Brazil