In this section we extend the notion of generalized entropies to the continuous case, and examine some properties of the resulting unified entropy function.
Let X be an absolutely continuous random variable, that is, a random variable having a probability density function p(x). The unified entropy of X is defined as follows:
$\displaystyle H^s_r(X)=(2^{1-s}-1)^{-1}\Big[\Big(\int_{-\infty}^{\infty}p(x)^r\,dx\Big)^{s-1\over r-1}-1\Big],\ \ r\neq 1,\ s\neq 1,$   (3.14)

provided the integrals exist, where the limiting cases are

$\displaystyle H_r(X)=(1-r)^{-1}\log_2\int_{-\infty}^{\infty}p(x)^r\,dx,\ \ r\neq 1,\ s=1,$

$\displaystyle H(X)=-\int_{-\infty}^{\infty}p(x)\log_2 p(x)\,dx,\ \ r=1,\ s=1.$
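As a concrete illustration, the defining integral can be approximated numerically. The sketch below is ours, not the text's: the function name and the midpoint-rule discretization are assumptions, and it covers only the case r ≠ 1, s ≠ 1 for a density supported on a bounded interval.

```python
from typing import Callable

def unified_entropy(p: Callable[[float], float], a: float, b: float,
                    r: float, s: float, n: int = 100_000) -> float:
    """Unified (r, s)-entropy of a density p supported on [a, b], r != 1, s != 1.

    The integral of p(x)**r is approximated by the midpoint rule with n cells.
    """
    h = (b - a) / n
    i_r = sum(p(a + (k + 0.5) * h) ** r for k in range(n)) * h
    return (i_r ** ((s - 1) / (r - 1)) - 1) / (2 ** (1 - s) - 1)

# Uniform density on [0, 1]: the integral of p**r is 1, so the entropy is 0.
print(unified_entropy(lambda x: 1.0, 0.0, 1.0, r=2.0, s=0.5))  # ≈ 0.0
```

For the uniform density the bracketed term vanishes identically, so the value is zero for every admissible choice of r and s.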
The contrast between the continuous and discrete cases is worth emphasising. For example, if $X$ is exponentially distributed with density $p(x)=\lambda e^{-\lambda x}$, $x\geq 0$, then $\int_0^\infty p(x)^r\,dx=\lambda^{r-1}/r$, so that

$\displaystyle H^s_r(X)=(2^{1-s}-1)^{-1}\Big[\Big({\lambda^{r-1}\over r}\Big)^{s-1\over r-1}-1\Big],\ \ r\neq 1,\ s\neq 1,$

which can be negative for suitable values of $\lambda$ and $r$; unlike its discrete counterpart, the continuous unified entropy need not be nonnegative.
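The exponential closed form is easy to check numerically. In the sketch below (the helper name and the particular parameter values are ours), the exact value $\int_0^\infty(\lambda e^{-\lambda x})^r\,dx=\lambda^{r-1}/r$ is compared against a midpoint-rule integral, and a choice of parameters is exhibited for which the entropy is negative:

```python
import math

def exp_unified_entropy(lam: float, r: float, s: float) -> float:
    """Unified (r, s)-entropy of the exponential density p(x) = lam*exp(-lam*x),
    x >= 0, using the closed form: integral of p**r equals lam**(r-1)/r."""
    i_r = lam ** (r - 1) / r
    return (i_r ** ((s - 1) / (r - 1)) - 1) / (2 ** (1 - s) - 1)

# Midpoint-rule check of the integral of p**r for lam = 8, r = 2 on [0, 40/lam]
# (the tail beyond that point is negligible).
lam, r = 8.0, 2.0
n = 200_000
h = (40.0 / lam) / n
i_num = sum((lam * math.exp(-lam * (k + 0.5) * h)) ** r for k in range(n)) * h
assert abs(i_num - lam ** (r - 1) / r) < 1e-6

# For lam = 8, r = 2, s = 0.5 the entropy is negative -- impossible for a
# discrete distribution.
print(exp_unified_entropy(8.0, 2.0, 0.5))  # a negative value
```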
Example 3.2. Let $X$ be a discrete random variable taking the values $x_1,x_2,\ldots,x_n$ with equal probabilities $1/n$. Then $\sum_{i=1}^{n}p_i^r=n\cdot n^{-r}=n^{1-r}$, and

$\displaystyle H^s_r(X)=(2^{1-s}-1)^{-1}\big[n^{1-s}-1\big],\ \ r\neq 1,\ s\neq 1.$
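The computation in Example 3.2 can be reproduced directly; the helper name below is ours. It also illustrates, under the assumption that the unified entropy tends to the Shannon entropy as $r,s\to 1$, that near $r=s=1$ the value for the uniform distribution approaches $\log_2 n$:

```python
def unified_entropy_discrete(probs, r, s):
    """Unified (r, s)-entropy of a discrete distribution, r != 1, s != 1."""
    total = sum(p ** r for p in probs)
    return (total ** ((s - 1) / (r - 1)) - 1) / (2 ** (1 - s) - 1)

n, r, s = 4, 2.0, 0.5
uniform = [1.0 / n] * n
# Agrees with the closed form (2**(1-s) - 1)**-1 * (n**(1-s) - 1).
closed_form = (n ** (1 - s) - 1) / (2 ** (1 - s) - 1)
assert abs(unified_entropy_discrete(uniform, r, s) - closed_form) < 1e-12

# Near r = s = 1 the value is close to the Shannon entropy log2(n) = 2 bits.
print(unified_entropy_discrete(uniform, 1.0 + 1e-6, 1.0 + 1e-6))  # ≈ 2.0
```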
Example 3.3. We consider the random variable $Y=g(X)$, where $g$ is a strictly increasing differentiable function of $X$. Since the mapping from $X$ to $Y$ is one to one, the density of $Y$ is

$\displaystyle p_Y(y)=p\big(g^{-1}(y)\big)\,{d\over dy}\,g^{-1}(y),$

and hence, substituting $x=g^{-1}(y)$,

$\displaystyle \int_{-\infty}^{\infty}p_Y(y)^r\,dy=\int_{-\infty}^{\infty}p(x)^r\,[g'(x)]^{1-r}\,dx.$

In general this differs from $\int p(x)^r\,dx$, so the continuous unified entropy is not invariant under one-to-one transformations, whereas in the discrete case a one-to-one transformation merely relabels the values and leaves the entropy unchanged. In the Shannon case ($r=s=1$),

$\displaystyle H(Y)=H(X)+\int_{-\infty}^{\infty}p(x)\log_2 g'(x)\,dx.$
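The non-invariance can be seen with the simplest one-to-one map, a change of scale; the example and helper name below are ours. If $X$ is uniform on $[0,1]$ and $Y=g(X)=2X$, then $Y$ is uniform on $[0,2]$, and the two unified entropies differ, whereas relabelling the values of a discrete variable leaves its probabilities, and hence its entropy, unchanged.

```python
def unified_entropy_uniform(length: float, r: float, s: float) -> float:
    """Unified (r, s)-entropy of the uniform density on an interval of the
    given length: the integral of (1/length)**r dx equals length**(1 - r)."""
    i_r = length ** (1 - r)
    return (i_r ** ((s - 1) / (r - 1)) - 1) / (2 ** (1 - s) - 1)

r, s = 2.0, 0.5
h_x = unified_entropy_uniform(1.0, r, s)   # X uniform on [0, 1]
h_y = unified_entropy_uniform(2.0, r, s)   # Y = 2X, uniform on [0, 2]
print(h_x, h_y)  # 0.0 and approximately 1.0: the entropies differ
```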
These important differences between the discrete and continuous cases are a warning that results for discrete distributions cannot be carried over to the continuous case without independent verification. Fortunately, some of the significant concepts rely only on differences between entropies, and for these the difficulties disappear.