The unified conditional
entropies for a fixed value of a random variable are given by
$$\mathcal{H}_r^s(X \mid Y=y) = \mathcal{H}_r^s\bigl(p(x_1\mid y),\, p(x_2\mid y),\, \ldots,\, p(x_n\mid y)\bigr)$$
for each $y \in \mathcal{Y}$, and
$$\mathcal{H}_r^s(Y \mid X=x) = \mathcal{H}_r^s\bigl(p(y_1\mid x),\, p(y_2\mid x),\, \ldots,\, p(y_m\mid x)\bigr)$$
for each $x \in \mathcal{X}$.
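The definition above simply applies the unified $(r,s)$-entropy to the conditional distribution $p(\cdot\mid y)$. A minimal computational sketch follows; the function names `unified_entropy` and `conditional_distribution` are my own, and the four branches (Sharma–Mittal-type, Rényi, entropy of degree $s$, Shannon) are assumed to be the constituents of the unified measure, since the source's own formulas are not reproduced here.

```python
import math

def unified_entropy(p, r, s):
    """Unified (r,s)-entropy of a distribution p, base-2 logs (hypothetical sketch).

    Assumed branches:
      r != 1, s != 1 : (2^(1-s)-1)^(-1) * [ (sum p_i^r)^((1-s)/(1-r)) - 1 ]
      r != 1, s == 1 : Renyi, (1-r)^(-1) * log2(sum p_i^r)
      r == 1, s != 1 : degree-s, (2^(1-s)-1)^(-1) * [ sum p_i^s - 1 ]
      r == 1, s == 1 : Shannon, -sum p_i * log2(p_i)
    """
    p = [pi for pi in p if pi > 0]  # ignore zero-probability outcomes
    if r != 1 and s != 1:
        return ((sum(pi**r for pi in p)) ** ((1 - s) / (1 - r)) - 1) / (2 ** (1 - s) - 1)
    if r != 1:
        return math.log2(sum(pi**r for pi in p)) / (1 - r)
    if s != 1:
        return (sum(pi**s for pi in p) - 1) / (2 ** (1 - s) - 1)
    return -sum(pi * math.log2(pi) for pi in p)

def conditional_distribution(joint, y):
    """p(x | Y=y) from a joint table {(x, y): p(x, y)}."""
    p_y = sum(pxy for (x, yy), pxy in joint.items() if yy == y)
    return [pxy / p_y for (x, yy), pxy in joint.items() if yy == y]

# Toy joint distribution of (X, Y)
joint = {("a", 0): 0.3, ("b", 0): 0.2, ("a", 1): 0.1, ("b", 1): 0.4}

# Unified conditional entropy for the fixed value Y = 0, i.e. of p(x | Y=0) = (0.6, 0.4)
print(unified_entropy(conditional_distribution(joint, 0), r=2, s=0.5))  # ≈ 0.9337
```

At $r=s=1$ the same call returns Shannon's entropy of the conditional distribution, which is the reduction the unified measure is built around.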
We shall now give four different ways to define unified conditional entropies. The first is based on the natural way of averaging, as in Shannon's case. The second and third forms are obtained from intermediate expressions arising in the first form. The fourth is based on the well-known additivity property of Shannon's entropy, $H(X,Y) = H(Y) + H(X\mid Y)$. These are given by
$${}^a\mathcal{H}_r^s(X\mid Y) =
\begin{cases}
{}^aH_r^s(X\mid Y), & r\neq 1,\ s\neq 1,\\
{}^aH_r(X\mid Y), & r\neq 1,\ s=1,\\
{}^aH^s(X\mid Y), & r=1,\ s\neq 1,\\
H(X\mid Y), & r=1,\ s=1,
\end{cases} \tag{6.1}$$
for all $r,s$, $a=1,2,3$ and $4$, where for $r\neq 1$, $s\neq 1$, we have
$${}^1H_r^s(X\mid Y) = \sum_{y\in\mathcal{Y}} p(y)\, H_r^s(X\mid Y=y), \tag{6.2}$$
$${}^2H_r^s(X\mid Y) = (2^{1-s}-1)^{-1}\left\{\left[\frac{\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} p(x,y)^r}{\sum_{y\in\mathcal{Y}} p(y)^r}\right]^{\frac{1-s}{1-r}} - 1\right\}, \tag{6.3}$$
$${}^3H_r^s(X\mid Y) = (2^{1-s}-1)^{-1}\left\{\left[\sum_{y\in\mathcal{Y}}\left(\sum_{x\in\mathcal{X}} p(x,y)^r\right)^{1/r}\right]^{\frac{r(1-s)}{1-r}} - 1\right\} \tag{6.4}$$
and
$${}^4H_r^s(X\mid Y) = H_r^s(X,Y) - H_r^s(Y). \tag{6.5}$$
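The contrast between the first and fourth constructions can be illustrated numerically. The sketch below implements only what the surrounding prose states directly: the "natural" averaging over $y$ (first form) and the chain-rule difference $H(X,Y)-H(Y)$ (fourth form). The two-parameter branch used, `sharma_mittal`, is an assumption of this sketch, as is the toy joint distribution; the point shown is that the two constructions disagree in general for $r,s\neq 1$, while in Shannon's case they coincide.

```python
import math

def sharma_mittal(p, r, s):
    """(2^(1-s)-1)^(-1) * [ (sum p_i^r)^((1-s)/(1-r)) - 1 ], for r, s != 1.
    Assumed here as the r != 1, s != 1 branch of the unified measure."""
    z = sum(pi**r for pi in p if pi > 0)
    return (z ** ((1 - s) / (1 - r)) - 1) / (2 ** (1 - s) - 1)

def shannon(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Toy joint distribution p(x, y): rows index x in {0,1}, columns index y in {0,1}
joint = [[0.3, 0.1], [0.2, 0.4]]
p_y = [sum(row[y] for row in joint) for y in range(2)]   # marginal of Y
p_xy = [pxy for row in joint for pxy in row]             # flattened joint

r, s = 2.0, 0.5

# First form (natural averaging, as in (6.2)): E_y[ entropy of p(. | y) ]
h1 = sum(p_y[y] * sharma_mittal([row[y] / p_y[y] for row in joint], r, s)
         for y in range(2))

# Fourth form (chain-rule style, as in (6.5)): H(X,Y) - H(Y)
h4 = sharma_mittal(p_xy, r, s) - sharma_mittal(p_y, r, s)

print(h1, h4)  # the two values differ for r, s != 1

# Shannon's case: both constructions give the same H(X|Y)
h1_shannon = sum(p_y[y] * shannon([row[y] / p_y[y] for row in joint]) for y in range(2))
h4_shannon = shannon(p_xy) - shannon(p_y)
```

This disagreement is exactly why four distinct definitions are worth studying: the additivity that forces them to coincide in Shannon's case fails for the generalized measures.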
In view of (6.2), we have ${}^1\mathcal{H}_r^s(X\mid Y) = \sum_{y\in\mathcal{Y}} p(y)\,\mathcal{H}_r^s(X\mid Y=y)$. Also, we observe that the conditional entropies ${}^a\mathcal{H}_r^s(Y\mid X)$ can be written analogously by interchanging the roles of $X$ and $Y$, and that ${}^4\mathcal{H}_r^s(Y\mid X) = \mathcal{H}_r^s(X,Y) - \mathcal{H}_r^s(X)$. For $r=s=1$, we have ${}^a\mathcal{H}_1^1(X\mid Y) = H(X\mid Y)$ for each $a$, i.e., all four definitions reduce to Shannon's conditional entropy.
In a similar way, we can define the generalized unified conditional and joint entropies for three or more random variables, such as $\mathcal{H}_r^s(X,Y,Z)$, $\mathcal{H}_r^s(X\mid Y,Z)$, ${}^a\mathcal{H}_r^s(X,Y\mid Z)$ ($a=1,2,3$ and $4$), etc.
Note 6.1. ${}^1\mathcal{H}_r^s(X\mid Y)$ is defined in a natural way, as in the case of Shannon's entropy. It was first given by Ben-Bassat and Raviv (1978) [11]. ${}^2\mathcal{H}_r^s(X\mid Y)$ can be seen in Aczél and Daróczy (1963) [1]. ${}^3\mathcal{H}_r^s(X\mid Y)$ was taken by Arimoto (1975) [4]. ${}^4\mathcal{H}_r^s(X\mid Y)$ was first considered by Daróczy (1970) [33]. Sharma and Mittal (1975) [90] discussed particular cases of these. Here we have considered a much more general and unified version of each of these generalized conditional entropies.
In this chapter, we shall study some properties of the generalized conditional
entropies defined above. For proofs, refer to Rathie and Taneja (1991) [81]
and Taneja (1988; 1995) [104], [108].