
Introduction


The concept of Shannon's entropy (Shannon, 1948 [86]) plays a central role in information theory and is sometimes referred to as a measure of uncertainty. The entropy of a random variable is defined in terms of its probability distribution and can be shown to be a good measure of randomness or uncertainty. This chapter deals mainly with its characterizations and properties. Properties are first studied for discrete finite random variables. The study is then extended to random vectors taking finitely and countably infinitely many values, and the idea of entropy series is explained. Finally, the continuous case, generally referred to as differential entropy, is studied for various probability distributions, together with the entropy power inequality.
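For reference, the quantity studied throughout this chapter is the following; the notation here anticipates the formal treatment given later, and the chapter's own notation may differ in detail. For a discrete random variable with probability distribution $P = (p_1, \dots, p_n)$, $p_i \ge 0$, $\sum_{i=1}^{n} p_i = 1$, Shannon's entropy is

\[
H(P) = -\sum_{i=1}^{n} p_i \log p_i,
\]

with the usual convention $0 \log 0 = 0$. The base of the logarithm fixes the unit of measurement: base 2 gives bits, base $e$ gives nats.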

