
Entropies of Order r and Order (r,s)

A systematic attempt to develop generalizations of Shannon's entropy was carried out by Rényi (1961) [82], who characterized an entropy of order $ r$ given by
$\displaystyle H_r(P)={(1-r)}^{-1}\; \log \biggl(\sum_{i=1}^n{p^r_i}\biggr),\ r\neq 1,\ r> 0,$    (3.1)
for all $ P=(p_1,p_2,...,p_n)\ \in\ \Delta_n$, where $ r$ is a real parameter. We can easily verify that 
$\displaystyle \lim_{r \to 1}{H_r(P)}=H(P),$

where $ H(P)$ is Shannon's entropy given by (1.7).
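As a quick numerical illustration (not part of the original text), the sketch below evaluates $ H_r(P)$ of (3.1) and checks that it approaches Shannon's entropy as $ r\to 1$. The natural logarithm, the function names, and the example distribution are assumptions made only for this sketch.

import math

def renyi_entropy(p, r):
    # H_r(P) = (1-r)^{-1} log( sum_i p_i^r ),  r > 0, r != 1  -- equation (3.1)
    if r <= 0 or r == 1:
        raise ValueError("require r > 0 and r != 1")
    return math.log(sum(pi ** r for pi in p)) / (1.0 - r)

def shannon_entropy(p):
    # H(P) = -sum_i p_i log p_i
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

P = [0.5, 0.3, 0.2]           # illustrative distribution in Delta_3
for r in (0.5, 0.9, 0.999, 1.001, 2.0):
    print(r, renyi_entropy(P, r))
print("Shannon:", shannon_entropy(P))   # values above approach this as r -> 1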

Campbell (1965) [20] was the first to show that the variable length version of the elementary coding theorem carries over to the entropy of order $ r$, if one considers exponential lengths and their increasing functions, which include, in particular, a generalized length in terms of the entropy of order $ r$. Blumer and McEliece (1988) [14] considered the problem of minimizing redundancy of order $ r$, defined in terms of the entropy of order $ r$, and obtained bounds sharper than those of Gallager (1978) [39]. Taneja (1984a) [101] extended the concept of exponentiated average codeword length of order $ r$ to the best 1:1 codes. For other applications of the entropy of order $ r$, refer to Jelinek (1968a;b) [50], [51], Jelinek and Schneider (1972) [52], Csiszár (1974) [29], Nath (1975) [74], Arimoto (1975; 1976) [4], [5], Ben-Bassat and Raviv (1978) [11], Kieffer (1979) [64], Campbell (1985) [22], Kapur (1983; 1986) [55], [56], etc.

Based on the same motivations as Rényi, later researchers (Aczél and Daróczy, 1963 [1]; Varma, 1966 [119]; Kapur, 1967 [54]; Rathie, 1970 [78], etc.) generalized the entropy of order $ r$ by changing some of its postulates. The generalization studied by Aczél and Daróczy (1963) [1], known as the entropy of order $ (r,s)$, is given by

$\displaystyle H_{r,s}(P)={(s-r)}^{-1}\; \log \Biggl({\sum_{i=1}^n{p^r_i}\over\sum_{i=1}^n{p^s_i}}\Biggr),\ r\neq s,\ r> 0,\ s> 0,$    (3.2)

for all $ P=(p_1,p_2,...,p_n)\ \in\ \Delta_n$, where $ r$ and $ s$ are real parameters. In particular, when $ r=1$ or $ s=1$, the measure (3.2) reduces to (3.1). We can easily verify that 

$\displaystyle \lim_{r \to s}{H_{r,s}(P)}=-\sum_{i=1}^n{p^s_i \log\ {p_i}}\Big/\sum_{i=1}^n{p^s_i},\ s>0,$

which reduces to Shannon's entropy for $ s=1$.
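A similar sketch (again assuming the natural logarithm and illustrative names, not taken from the original text) evaluates $ H_{r,s}(P)$ of (3.2), its reduction to (3.1) when $ s=1$, and the limiting value as $ r\to s$ displayed above.

import math

def entropy_rs(p, r, s):
    # H_{r,s}(P) = (s-r)^{-1} log( sum_i p_i^r / sum_i p_i^s ),  r != s, r, s > 0  -- equation (3.2)
    if r <= 0 or s <= 0 or r == s:
        raise ValueError("require r, s > 0 and r != s")
    return math.log(sum(pi ** r for pi in p) / sum(pi ** s for pi in p)) / (s - r)

def limit_r_to_s(p, s):
    # limiting value: -sum_i p_i^s log p_i / sum_i p_i^s  (Shannon's entropy when s = 1)
    den = sum(pi ** s for pi in p)
    return -sum((pi ** s) * math.log(pi) for pi in p if pi > 0) / den

P = [0.5, 0.3, 0.2]
print(entropy_rs(P, 2.0, 1.0))    # coincides with H_2(P) from (3.1), since s = 1
print(entropy_rs(P, 1.501, 1.5))  # close to the r -> s limit below
print(limit_r_to_s(P, 1.5))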
 

