
Unified (r,s)-Divergence Measures

In Section 2.4 we studied two divergence measures: the Jensen difference divergence measure (or information radius), i.e., the $ I$-divergence, and the Jeffreys-Kullback-Leibler $ J$-divergence. Here our aim is to present different generalizations of these two measures involving two scalar parameters. Their extension to the $ M$-class case is given in Chapter 5.

We see that the $ I$- and $ J$-divergence measures, given respectively in (2.9) and (2.7), depend on the relative information $ D(P\vert\vert Q)$. Based on the unified expression $ {\ensuremath{\boldsymbol{\mathscr{D}}}}^s_r(P\vert\vert Q)$ together with the expressions (2.9) and (2.7), we can generalize both measures; this is done in the first generalization. An alternative approach to generalizing the $ I$- and $ J$-divergence measures is also given; it is based on expressions that appear as particular cases of the first generalization.
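As a reminder of how both measures depend on the relative information, the standard forms may be sketched as follows (the precise expressions are those numbered (2.7) and (2.9) in Chapter 2; the versions below are the widely used ones and are stated here only for orientation):

```latex
% Relative information (Kullback-Leibler divergence):
D(P\vert\vert Q) = \sum_{x} p(x)\,\log\frac{p(x)}{q(x)}.
% J-divergence: the symmetrization of D,
J(P\vert\vert Q) = D(P\vert\vert Q) + D(Q\vert\vert P).
% I-divergence (information radius): the Jensen difference,
% i.e., the mean relative information to the midpoint distribution,
I(P\vert\vert Q) = \frac{1}{2}\left[
   D\!\left(P \,\middle\vert\middle\vert\, \frac{P+Q}{2}\right)
 + D\!\left(Q \,\middle\vert\middle\vert\, \frac{P+Q}{2}\right)\right].
```

Both $ J$ and $ I$ are thus built from $ D(P\vert\vert Q)$, which is why a unified generalization $ {\ensuremath{\boldsymbol{\mathscr{D}}}}^s_r(P\vert\vert Q)$ of the relative information immediately induces generalizations of both.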



21-06-2001
Inder Jeet Taneja
Departamento de Matemática - UFSC
88.040-900 Florianópolis, SC - Brazil