Multi-Relation Attention Network for Image Patch Matching - Archive ouverte HAL
Journal article in IEEE Transactions on Image Processing, 2021

Multi-Relation Attention Network for Image Patch Matching

Dou Quan
Shuang Wang
Yi Li
Bowu Yang
Ning Huyan
Biao Hou
Licheng Jiao

Abstract

Deep convolutional neural networks have attracted increasing attention in image patch matching. However, most of them rely on a single similarity learning model, such as the feature distance or the correlation of concatenated features. Their performance degrades when the relation between matching patches becomes complex under various imagery changes. To tackle this challenge, we propose a multi-relation attention learning network (MRAN) for image patch matching. Specifically, we propose to fuse multiple feature relations (MR) for matching, which benefits from the complementary advantages of different feature relations and achieves significant improvements on matching tasks. Furthermore, we propose a relation attention learning module to learn the fused relation adaptively. With this module, meaningful feature relations are emphasized and the others are suppressed. Extensive experiments show that MRAN achieves the best matching performance and generalizes well to multi-modal image patch matching, multi-modal remote sensing image patch matching, and image retrieval tasks.
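The abstract describes the approach only at a high level. Below is a minimal sketch, not the authors' released code, of how multi-relation fusion with relation attention could be wired up, assuming two 128-D patch descriptors, an element-wise feature-distance relation, a concatenation-based relation, and a sigmoid attention branch; all layer sizes, module names, and the classifier head are illustrative assumptions.

# Minimal sketch (assumed design, not the paper's implementation): two patch
# descriptors are compared with complementary relations, the relations are
# fused, and an attention branch re-weights them before a matching classifier.
import torch
import torch.nn as nn

class MultiRelationAttentionSketch(nn.Module):
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        # Relation 2 (correlation-style): concatenated features projected back to feat_dim.
        self.concat_proj = nn.Linear(2 * feat_dim, feat_dim)
        # Attention branch: one weight per relation channel, emphasizing
        # informative relations and suppressing the rest.
        self.attention = nn.Sequential(
            nn.Linear(2 * feat_dim, 2 * feat_dim),
            nn.Sigmoid(),
        )
        # Matching head: fused, re-weighted relations -> match logit.
        self.classifier = nn.Linear(2 * feat_dim, 1)

    def forward(self, f1: torch.Tensor, f2: torch.Tensor) -> torch.Tensor:
        distance_rel = torch.abs(f1 - f2)                            # Relation 1: feature distance
        concat_rel = self.concat_proj(torch.cat([f1, f2], dim=-1))   # Relation 2: concatenated features
        fused = torch.cat([distance_rel, concat_rel], dim=-1)        # fuse multiple relations
        weighted = fused * self.attention(fused)                     # adaptive relation attention
        return self.classifier(weighted).squeeze(-1)                 # matching logit

# Toy usage with random 128-D patch descriptors (batch of 4).
if __name__ == "__main__":
    f1, f2 = torch.randn(4, 128), torch.randn(4, 128)
    print(MultiRelationAttentionSketch()(f1, f2).shape)  # torch.Size([4])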
No file deposited

Dates and versions

hal-03429620, version 1 (15-11-2021)

Identifiers

HAL Id: hal-03429620
DOI: 10.1109/TIP.2021.3101414

Cite

Dou Quan, Shuang Wang, Yi Li, Bowu Yang, Ning Huyan, et al. Multi-Relation Attention Network for Image Patch Matching. IEEE Transactions on Image Processing, 2021, 30, pp.7127-7142. ⟨10.1109/TIP.2021.3101414⟩. ⟨hal-03429620⟩
