
Generalized Information Measures


In this chapter we give two parametric generalizations of the Kullback-Leibler relative information and of Kerridge's inaccuracy. These generalizations are analogous to those of the entropy-type measures studied in Chapter 3. Generalizations of the Shannon-Gibbs inequality are also studied.
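
For reference, the two base measures being generalized are recalled below in their standard forms; the notation D(P||Q) and H(P|Q) is assumed here. For probability distributions $P=(p_1,\ldots,p_n)$ and $Q=(q_1,\ldots,q_n)$,
\[
D(P\,\|\,Q)=\sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
\qquad\text{and}\qquad
H(P\,|\,Q)=-\sum_{i=1}^{n} p_i \log q_i ,
\]
and the Shannon-Gibbs inequality states that $H(P)\le H(P\,|\,Q)$, or equivalently $D(P\,\|\,Q)\ge 0$, with equality if and only if $P=Q$.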


Subsections
