Entropies and entropic criteria

Abstract: This chapter focuses on the notions of entropy and of maximum entropy distribution, which will be characterized according to different perspectives. Beyond links with applications in engineering and physics, it will be shown that it is possible to build regularization functionals based on the use of a maximum entropy technique, which can then be employed as \emph{ad hoc} potentials in data inversion problems. The chapter begins with an overview of the key properties of information measures and with the introduction of various concepts and definitions. In particular, the R\'enyi divergence is defined, the concept of escort\index{escort distribution} distribution is presented, and the principle of maximum entropy that will be subsequently used is commented on. A conventional engineering problem is then presented, the problem of source\index{source coding} coding, which shows the benefit of using length measures different from the standard one, and in particular an exponential measure, leading to a source\index{source coding} coding theorem whose minimum bound is a R\'enyi entropy. It is also shown that optimal codes can easily be computed with escort\index{escort distribution} distributions. In Section~\ref{sec:Un-mod=00003D0000E8le-simple}, a simple state transition model is introduced and examined. This model leads to an equilibrium distribution defined as a generalized escort\index{escort distribution} distribution, and as a by-product leads once again to a R\'enyi\index{entropy!R\'enyi@R\'enyi} entropy. The Fisher information flow along the curve defined by the generalized escort\index{escort distribution} distribution is examined, and connections with the Jeffreys divergence are established.
Finally, various arguments are obtained which, in this framework, lead to an inference method based on the minimization of the R\'enyi entropy under a generalized mean constraint, that is to say, a mean taken with respect to the escort\index{escort distribution} distribution. From Subsection~\ref{sub:Fonctionnelles-entropiques-issue} onwards, the main concern is the minimization of the R\'enyi divergence subject to a generalized mean constraint. The optimal density that solves this problem and the value of the corresponding optimal divergence are given and characterized. The main properties of the associated entropies are defined and characterized. Finally, it is shown how to compute these entropies in practice and how they can be used for solving linear problems.
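For the reader's orientation, the two central objects named in the abstract admit standard textbook definitions (stated here as a sketch; these formulas are not reproduced from the chapter itself): for a discrete distribution $p = (p_1, \dots, p_n)$ and a parameter $q > 0$, $q \neq 1$, the R\'enyi entropy and the associated escort distribution are

```latex
H_q(p) = \frac{1}{1-q} \log \sum_{i=1}^{n} p_i^{\,q},
\qquad
P_i = \frac{p_i^{\,q}}{\sum_{j=1}^{n} p_j^{\,q}} .
```

In the limit $q \to 1$, $H_q(p)$ recovers the Shannon entropy and the escort distribution $P$ reduces to $p$ itself.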
Document type:
Book chapter
Jean-François Giovannelli; Jérôme Idier. Inversion methods applied to signal and image processing, Wiley, pp.26, 2015

Cited literature [15 references]

https://hal.archives-ouvertes.fr/hal-01087579
Contributor: Jean-François Bercher
Submitted on: Wednesday, November 26, 2014 - 12:54:45
Last modified on: Sunday, March 13, 2016 - 11:18:05
Document(s) archived on: Friday, April 14, 2017 - 20:46:27

File

chap11.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01087579, version 1

Citation

Jean-François Bercher. Entropies and entropic criteria. Jean-François Giovannelli; Jérôme Idier. Inversion methods applied to signal and image processing, Wiley, pp.26, 2015. 〈hal-01087579〉

Metrics

Record views: 230
Document downloads: 176