
Coupled dictionary learning for unsupervised change detection between multimodal remote sensing images

Abstract: Archetypal scenarios for change detection generally consider two images acquired through sensors of the same modality. However, in some specific cases such as emergency situations, the only images available may be those acquired through sensors of different modalities. This paper addresses the problem of detecting, in an unsupervised manner, changes between two observed images acquired by sensors of different modalities with possibly different resolutions. These sensor dissimilarities introduce additional issues in the context of operational change detection that are not addressed by most classical methods. This paper introduces a novel framework that effectively exploits the available information by modeling the two observed images as sparse linear combinations of atoms belonging to a pair of coupled overcomplete dictionaries learnt from each observed image. Since the images cover the same geographical location, their codes are expected to be globally similar, except for possible changes at sparse spatial locations. The change detection task is thus cast as a dual code estimation that enforces spatial sparsity in the difference between the estimated codes associated with each image. This problem is formulated as an inverse problem which is iteratively solved using an efficient proximal alternating minimization algorithm accounting for nonsmooth and nonconvex functions. The proposed method is applied to real images with both simulated (yet realistic) and real changes. A comparison with state-of-the-art change detection methods demonstrates the accuracy of the proposed strategy.
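As a rough illustration of the coupled sparse-coding idea described in the abstract (and not the authors' implementation), the sketch below estimates a shared code X and a spatially sparse code difference Delta such that Y1 ≈ D1 X and Y2 ≈ D2 (X + Delta), where D1 and D2 are coupled dictionaries assumed to be already learnt from each modality and Y1, Y2 are co-registered patch (or pixel) matrices. It uses a simplified convex surrogate, an l1 penalty on the code and a column-wise l2,1 penalty on the code difference, solved by alternating proximal-gradient steps; the paper itself relies on a nonsmooth, nonconvex formulation handled by a proximal alternating minimization algorithm. All function names, penalties and step sizes below are assumptions made for this sketch.

import numpy as np

def soft_threshold(V, t):
    """Entrywise soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(V) * np.maximum(np.abs(V) - t, 0.0)

def column_soft_threshold(V, t):
    """Column-wise shrinkage: proximal operator of t * sum_n ||V[:, n]||_2."""
    norms = np.maximum(np.linalg.norm(V, axis=0, keepdims=True), 1e-12)
    return V * np.maximum(1.0 - t / norms, 0.0)

def coupled_change_detection(Y1, Y2, D1, D2, lam=0.1, gamma=0.5, n_iter=300):
    """Alternating proximal-gradient updates on a shared code X and a spatially
    sparse code difference Delta, so that Y1 ~ D1 @ X and Y2 ~ D2 @ (X + Delta).
    D1 and D2 must share the same number of atoms (coupled dictionaries)."""
    n_pix = Y1.shape[1]
    X = np.zeros((D1.shape[1], n_pix))
    Delta = np.zeros_like(X)
    # Lipschitz constants of the data-fit gradients, used to set the step sizes.
    Lx = np.linalg.norm(D1, 2) ** 2 + np.linalg.norm(D2, 2) ** 2
    Ld = np.linalg.norm(D2, 2) ** 2
    for _ in range(n_iter):
        # Block 1: gradient step on both data-fit terms w.r.t. X, then l1 prox.
        grad_X = D1.T @ (D1 @ X - Y1) + D2.T @ (D2 @ (X + Delta) - Y2)
        X = soft_threshold(X - grad_X / Lx, lam / Lx)
        # Block 2: gradient step w.r.t. Delta, then l2,1 prox that keeps the code
        # difference nonzero only at a few spatial locations (the changes).
        grad_D = D2.T @ (D2 @ (X + Delta) - Y2)
        Delta = column_soft_threshold(Delta - grad_D / Ld, gamma / Ld)
    change_map = np.linalg.norm(Delta, axis=0)  # per-pixel change energy
    return X, Delta, change_map

The per-pixel norm of Delta acts as a change energy map that can be thresholded to obtain a binary change mask; in the paper, the sparsity levels and the dictionary coupling are handled differently, so this sketch only conveys the overall structure of the estimation problem.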
Document type: Journal articles

Cited literature: 54 references

https://hal.archives-ouvertes.fr/hal-02397250
Contributor: Open Archive Toulouse Archive Ouverte (oatao)
Submitted on: Friday, December 6, 2019 - 2:32:53 PM
Last modification on: Thursday, March 18, 2021 - 2:15:43 PM
Long-term archiving on: Saturday, March 7, 2020 - 2:54:44 PM

File: ferraris_25034.pdf (files produced by the author(s))

Identifiers

HAL Id: hal-02397250
DOI: 10.1016/j.cviu.2019.102817

Citation

Vinicius Ferraris, Nicolas Dobigeon, Yanna Cruz Cavalcanti, Thomas Oberlin, Marie Chabert. Coupled dictionary learning for unsupervised change detection between multimodal remote sensing images. Computer Vision and Image Understanding, Elsevier, 2019, 189, pp.1-15. ⟨10.1016/j.cviu.2019.102817⟩. ⟨hal-02397250⟩


Metrics

Record views: 335
File downloads: 891