Journal articles

Normalized information-based divergences

Abstract: This paper is devoted to the mathematical study of some divergences based on mutual information that are well suited to categorical random vectors. These divergences generalize the "entropy distance" and the "information distance". Their main characteristic is that they combine a complexity term with the mutual information. We then introduce the notion of (normalized) information-based divergence, propose several examples, and discuss their mathematical properties, in particular within a prediction framework.
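As an illustration of the kind of quantity the abstract refers to, the sketch below computes, from categorical samples, the classical entropy distance H(X|Y) + H(Y|X) = H(X,Y) - I(X;Y) and a standard normalized variant 1 - I(X;Y)/H(X,Y). This is a hedged example of these well-known divergences, not the paper's own definitions; the function names are hypothetical.

```python
from collections import Counter
from math import log2

def entropies(pairs):
    """Empirical entropies H(X), H(Y), H(X,Y) from a list of (x, y) samples."""
    n = len(pairs)
    def H(counts):
        return -sum(c / n * log2(c / n) for c in counts.values())
    hx = H(Counter(x for x, _ in pairs))
    hy = H(Counter(y for _, y in pairs))
    hxy = H(Counter(pairs))
    return hx, hy, hxy

def entropy_distance(pairs):
    """Unnormalized entropy distance: H(X|Y) + H(Y|X) = H(X,Y) - I(X;Y)."""
    hx, hy, hxy = entropies(pairs)
    i = hx + hy - hxy          # mutual information I(X;Y)
    return hxy - i

def information_distance(pairs):
    """Normalized variant 1 - I(X;Y)/H(X,Y), taking values in [0, 1]."""
    hx, hy, hxy = entropies(pairs)
    i = hx + hy - hxy
    return 1 - i / hxy if hxy > 0 else 0.0
```

For two identical variables the mutual information equals the joint entropy, so both distances are 0; for independent variables the mutual information is 0, so the normalized distance reaches its maximum value of 1.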
Contributor: Jean-François Coeurjolly
Submitted on: Monday, November 13, 2006 - 9:24:12 AM
Last modification on: Thursday, November 19, 2020 - 1:00:53 PM
Long-term archiving on: Monday, September 20, 2010 - 4:45:49 PM
Jean-François Coeurjolly, Rémy Drouilhet, Jean-François Robineau. Normalized information-based divergences. Problems of Information Transmission, MAIK Nauka/Interperiodica, 2007, 43 (3), pp.167-189. ⟨10.1134/S0032946007030015⟩. ⟨hal-00022566v2⟩


