Journal article in Signal Processing, 2018

Jeffrey’s divergence between autoregressive processes disturbed by additive white noises

Abstract

Jeffrey’s divergence (JD), which is the symmetric version of the Kullback-Leibler divergence, has been used in a wide range of applications, from change detection to clutter homogeneity analysis in radar processing. It has been calculated between the joint probability density functions of successive values of autoregressive (AR) processes. In this case, the JD is a linear function of the number of variates considered. Knowing the derivative of the JD with respect to the number of variates is hence enough to compare noise-free AR processes. However, the processes can be disturbed by additive uncorrelated white noises. In this paper, we suggest comparing two noisy 1st-order AR processes. For this purpose, the JD is expressed from the JD between noise-free AR processes and the bias the noises induce. After a transient period, the derivative of this bias with respect to the variate number becomes constant, as does the derivative of the JD. The resulting asymptotic JD increment is then used to compare noisy AR processes. Some examples illustrate this theoretical analysis.
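To illustrate the quantities discussed in the abstract, the sketch below computes the JD between the joint distributions of k successive values of two noisy AR(1) processes, modelled as zero-mean Gaussians whose Toeplitz covariances come from the stationary AR(1) autocovariance plus an additive white-noise variance on the diagonal. This is a minimal illustration under those modelling assumptions, not the paper's derivation: the AR parameters, noise variances, and the half-sum convention for the JD are choices made here for the example.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar1_covariance(phi, sigma_u2, k):
    """Toeplitz covariance of k successive values of a stationary AR(1) process
    x_t = phi * x_{t-1} + u_t with driving-noise variance sigma_u2."""
    r0 = sigma_u2 / (1.0 - phi**2)
    return toeplitz(r0 * phi ** np.arange(k))

def jeffreys_gaussian(S1, S2):
    """Jeffreys divergence between zero-mean Gaussians N(0, S1) and N(0, S2),
    taken here as the half-sum of the two KL divergences (log-det terms cancel)."""
    k = S1.shape[0]
    return 0.25 * (np.trace(np.linalg.solve(S2, S1))
                   + np.trace(np.linalg.solve(S1, S2)) - 2 * k)

# Illustrative parameters (not taken from the paper): AR coefficients,
# driving-noise variances, and additive white measurement-noise variances.
phi1, phi2 = 0.5, 0.8
su1, su2 = 1.0, 1.0
sb1, sb2 = 0.3, 0.1

prev = None
for k in (2, 5, 10, 20, 40, 41):
    S1 = ar1_covariance(phi1, su1, k) + sb1 * np.eye(k)
    S2 = ar1_covariance(phi2, su2, k) + sb2 * np.eye(k)
    jd = jeffreys_gaussian(S1, S2)
    print(f"k={k:3d}  JD={jd:.4f}")
    prev = jd
# After a transient, JD(k+1) - JD(k) settles to a constant value: this
# asymptotic JD increment is the quantity used to compare the noisy processes.
```

Running the loop for consecutive values of k shows the increment JD(k+1) - JD(k) converging, which mirrors the asymptotic behaviour described in the abstract.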
No file has been deposited.

Dates and versions

hal-01739434 , version 1 (21-03-2018)

Identifiers

  • HAL Id : hal-01739434 , version 1

Cite

Léo Legrand, Eric Grivel. Jeffrey’s divergence between autoregressive processes disturbed by additive white noises. Signal Processing, In press. ⟨hal-01739434⟩