Let $X$ be a discrete random variable taking a finite number of possible values $x_1, x_2, \ldots, x_n$ with probabilities $p_1, p_2, \ldots, p_n$ respectively such that $p_i \geq 0$, $\sum_{i=1}^n p_i = 1$. We attempt to arrive at a number that will measure the amount of uncertainty. Let $h$ be a function defined on the interval $[0,1]$ and let $h(p)$ be interpreted as the uncertainty associated with the event $X = x_i$, $P(X = x_i) = p$, or the information conveyed by revealing that $X$ has taken on the value $x_i$ in a given performance of the experiment. For each $n$, we shall define a function $H_n$ of the $n$ variables $p_1, p_2, \ldots, p_n$. The function $H_n(p_1, p_2, \ldots, p_n)$ is to be interpreted as the average uncertainty associated with the events $\{X = x_i\}$, given by
$$
H_n(p_1, p_2, \ldots, p_n) = \sum_{i=1}^n p_i\, h(p_i). \tag{1.1}
$$
Thus $H_n(p_1, p_2, \ldots, p_n)$ is the average uncertainty removed by revealing the value of $X$. For simplicity we shall denote
$$
H_n(P) = H_n(p_1, p_2, \ldots, p_n)
$$
for all $P = (p_1, p_2, \ldots, p_n) \in \Delta_n$ and $n \geq 2$, where
$$
\Delta_n = \Big\{ (p_1, \ldots, p_n) \ \Big|\ p_i \geq 0,\ \sum_{i=1}^n p_i = 1 \Big\}
$$
is the set of all complete finite discrete probability distributions. For two independent experiments with distributions $P \in \Delta_n$ and $Q = (q_1, \ldots, q_m) \in \Delta_m$, additivity of the average uncertainty requires
$$
\sum_{i=1}^n \sum_{j=1}^m p_i q_j\, h(p_i q_j) = \sum_{i=1}^n p_i\, h(p_i) + \sum_{j=1}^m q_j\, h(q_j). \tag{1.2}
$$
Replacing $p\, h(p)$ by $f(p)$ we get
$$
\sum_{i=1}^n \sum_{j=1}^m f(p_i q_j) = \sum_{i=1}^n f(p_i) + \sum_{j=1}^m f(q_j). \tag{1.3}
$$
Based on (1.2) and (1.3) we present the following theorem.
Theorem 1.1. Let $H_n$, $n \geq 2$, be a function satisfying (1.2) and (1.3), where $f$ is a real-valued continuous function defined over $[0,1]$. Then $H_n$ is given by
$$
H_n(P) = -c \sum_{i=1}^n p_i \log p_i, \tag{1.4}
$$
where $c > 0$ is an arbitrary constant, with the convention $0 \log 0 = 0$.
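As a quick numerical illustration (a Python sketch; the distributions and the function name are choices made for this example only), the entropy in (1.4) with $c = 1$ and base-2 logarithms can be computed, and its additivity over independent experiments verified:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (1.4) with c = 1: -sum p_i log2 p_i, with 0 log 0 = 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

P = [0.5, 0.25, 0.25]
Q = [0.7, 0.3]

# Joint distribution of two independent experiments: all products p_i * q_j.
PQ = [x * y for x in P for y in Q]

# Additivity: H(P * Q) = H(P) + H(Q).
assert abs(shannon_entropy(PQ) - (shannon_entropy(P) + shannon_entropy(Q))) < 1e-12

print(shannon_entropy(P))  # 1.5
```

The assertion confirms that the uncertainty of the product distribution is exactly the sum of the component uncertainties.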
The proof is based on the following lemma (ref. Chaundy and McLeod, 1961).
Lemma 1.1. Let $f$ be a continuous function defined over $[0,1]$ satisfying
$$
\sum_{i=1}^n \sum_{j=1}^m f(p_i q_j) = \sum_{i=1}^n f(p_i) + \sum_{j=1}^m f(q_j)
$$
for all $P = (p_1, \ldots, p_n) \in \Delta_n$ and $Q = (q_1, \ldots, q_m) \in \Delta_m$. Then
$$
f(p) = c\, p \log p
$$
for all $p \in [0,1]$, with $c$ an arbitrary constant.
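The functional equation in the lemma can be spot-checked numerically for $f(p) = c\, p \log p$ (a sketch; the constant and the distributions below are arbitrary choices):

```python
import math

def f(p, c=2.5):
    """f(p) = c * p * log p (natural log), extended by continuity with f(0) = 0."""
    return c * p * math.log(p) if p > 0 else 0.0

P = [0.2, 0.3, 0.5]
Q = [0.6, 0.1, 0.3]

lhs = sum(f(x * y) for x in P for y in Q)           # sum_i sum_j f(p_i q_j)
rhs = sum(f(x) for x in P) + sum(f(y) for y in Q)   # sum_i f(p_i) + sum_j f(q_j)
assert abs(lhs - rhs) < 1e-12
```

The identity holds exactly because $pq \log(pq) = pq \log p + pq \log q$ and each marginal sums to one.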
Alternatively, the measure (1.4) can be characterized as follows (ref. Shannon, 1948; Feinstein, 1958).
Theorem 1.2. Let $H_n \colon \Delta_n \to \mathbb{R}$, $n \geq 2$, be a function satisfying the following axioms:
(i) $H_n(p_1, \ldots, p_n)$ is a symmetric function of its arguments;
(ii) $H_2(p, 1-p)$ is a continuous function of $p$, $0 \leq p \leq 1$;
(iii) $H_2(1/2, 1/2) = 1$;
(iv) for $p_n = q_1 + q_2 > 0$,
$$
H_{n+1}(p_1, \ldots, p_{n-1}, q_1, q_2) = H_n(p_1, \ldots, p_n) + p_n H_2\!\left(\frac{q_1}{p_n}, \frac{q_2}{p_n}\right). \tag{1.5}
$$
Then $H_n$ is given by (1.4) with $c = 1$ (logarithms to the base 2).
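The branching (grouping) property that drives such characterizations can be verified numerically for Shannon's entropy; the sketch below assumes the standard grouping identity and base-2 logarithms:

```python
import math

def H(p):
    """Shannon entropy, base-2 logarithms, with 0 log 0 = 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Grouping: merging p1 and p2 into one outcome and then splitting it again
# must account for exactly the (weighted) uncertainty of the split.
p1, p2, p3 = 0.1, 0.3, 0.6
s = p1 + p2
lhs = H([p1, p2, p3])
rhs = H([s, p3]) + s * H([p1 / s, p2 / s])
assert abs(lhs - rhs) < 1e-12
```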
A third way to characterize the measure (1.4) is as follows (ref. Aczél and Daróczy, 1975).
Theorem 1.3. Let $H_n$, $n \geq 2$, be a function satisfying the following axioms:
(i) $H_3(p_1, p_2, p_3)$ is a symmetric function of its arguments;
(ii) $H_n$ is recursive, i.e., for $p_1 + p_2 > 0$,
$$
H_n(p_1, p_2, \ldots, p_n) = H_{n-1}(p_1 + p_2, p_3, \ldots, p_n) + (p_1 + p_2)\, H_2\!\left(\frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2}\right);
$$
(iii) $H_2(p, 1-p)$ is a Lebesgue measurable function of $p$ on $[0,1]$;
(iv) $H_2(1/2, 1/2) = 1$.
Then $H_n$ is given by (1.4) with $c = 1$ (logarithms to the base 2).
The following is a different way to characterize the measure (1.4). It is based on a functional equation known as the fundamental equation of information.
Theorem 1.4. Let $f \colon [0,1] \to \mathbb{R}$ be a Lebesgue measurable function satisfying $f(0) = f(1) = 0$, $f(1/2) = 1$, and
$$
f(x) + (1 - x)\, f\!\left(\frac{y}{1 - x}\right) = f(y) + (1 - y)\, f\!\left(\frac{x}{1 - y}\right), \tag{1.6}
$$
for all $x, y \in [0, 1)$ with $x + y \leq 1$. Then
$$
f(p) = -p \log p - (1 - p) \log (1 - p), \quad 0 \leq p \leq 1,
$$
i.e., $f(p) = H_2(p, 1 - p)$, the measure (1.4) for $n = 2$ with $c = 1$.
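That the binary entropy satisfies the fundamental equation of information can be checked numerically (a sketch; the test points are arbitrary admissible pairs):

```python
import math

def h2(p):
    """Binary entropy: h2(p) = -p log2 p - (1-p) log2 (1-p), with h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Fundamental equation of information:
# f(x) + (1-x) f(y/(1-x)) = f(y) + (1-y) f(x/(1-y)), x, y in [0,1), x + y <= 1.
for x, y in [(0.2, 0.3), (0.1, 0.6), (0.4, 0.4)]:
    lhs = h2(x) + (1 - x) * h2(y / (1 - x))
    rhs = h2(y) + (1 - y) * h2(x / (1 - y))
    assert abs(lhs - rhs) < 1e-9
```

Both sides equal the entropy of the three-point distribution $(x, y, 1 - x - y)$, which is why the equation is symmetric in $x$ and $y$.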
For simplicity, let us take $c = 1$ in (1.4), with logarithms to the base 2. If we put the restriction $n = 2$ in the above theorems with $p_1 = p$ and $p_2 = 1 - p$, $0 \leq p \leq 1$, we get $H_2(p, 1 - p)$. This yields
$$
H_2(p, 1 - p) = -p \log p - (1 - p) \log (1 - p). \tag{1.7}
$$
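A few basic properties of the binary entropy (1.7) — vanishing at the endpoints, symmetry about $p = 1/2$, and a maximum of one bit at $p = 1/2$ — can be confirmed directly:

```python
import math

def h(p):
    """Binary entropy (1.7): h(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

assert h(0.0) == 0.0 and h(1.0) == 0.0   # certain outcomes carry no uncertainty
assert h(0.5) == 1.0                     # a fair coin carries exactly one bit
assert abs(h(0.3) - h(0.7)) < 1e-12      # symmetry: h(p) = h(1 - p)
assert h(0.3) < h(0.5)                   # the maximum is at p = 1/2
```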
For more characterizations of the measure (1.4) or (1.7), refer to Aczél and Daróczy (1975) and Mathai and Rathie (1975).