End-to-End Similarity Learning and Hierarchical Clustering for unfixed size datasets - HAL open archive
Conference paper. Year: 2021

End-to-End Similarity Learning and Hierarchical Clustering for unfixed size datasets

Abstract

Hierarchical clustering (HC) is a powerful tool in data analysis since it allows discovering patterns in the observed data at different scales. Similarity-based HC methods take as input a fixed number of points and the matrix of pairwise similarities, and output a dendrogram representing the nested partition. However, in some cases the entire dataset is not known in advance, and neither are the relations between the points. In this paper, we consider the case in which we have a collection of realizations of a random distribution and want to extract a hierarchical clustering for each sample, where the number of elements varies at each draw. Building on a continuous relaxation of Dasgupta's cost function, we propose to integrate a triplet loss into Chami's formulation in order to learn an optimal similarity function between points, which is then used to compute the optimal hierarchy. Two architectures are tested on four datasets as approximators of the similarity function. The results are promising: in many cases the proposed method shows good robustness to noise and adapts to different datasets better than classical approaches.
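To make the idea of the abstract concrete, the following is a minimal sketch (not the authors' code) of similarity learning followed by hierarchical clustering: point embeddings are trained with a triplet loss so that cosine similarity reflects cluster membership, and a dendrogram is then built from the learned similarities. For brevity, Chami's hyperbolic continuous relaxation of Dasgupta's cost is replaced here by a classical average-linkage step, and the network size, margin, and optimizer settings are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: triplet-loss similarity learning + agglomerative clustering.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn
from scipy.cluster.hierarchy import linkage
from sklearn.datasets import make_blobs

# Toy data; in the paper's setting each draw may contain a different number of points.
X, y = make_blobs(n_samples=200, centers=4, n_features=2, random_state=0)
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(y)

# Small MLP mapping points to an embedding space where similarity is learned.
embed = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 16))
triplet = nn.TripletMarginLoss(margin=1.0)
opt = torch.optim.Adam(embed.parameters(), lr=1e-3)

def sample_triplets(labels, n=128):
    """Randomly pick (anchor, positive, negative) index triplets from labels."""
    a, p, neg = [], [], []
    while len(a) < n:
        i, j, k = torch.randint(len(labels), (3,))
        if labels[i] == labels[j] and labels[i] != labels[k] and i != j:
            a.append(i); p.append(j); neg.append(k)
    return torch.stack(a), torch.stack(p), torch.stack(neg)

for step in range(300):
    ai, pi, ni = sample_triplets(y)
    z = embed(X)
    loss = triplet(z[ai], z[pi], z[ni])
    opt.zero_grad(); loss.backward(); opt.step()

# Similarity from the learned embedding, then a classical dendrogram on top.
with torch.no_grad():
    z = nn.functional.normalize(embed(X), dim=1)
    sim = z @ z.T                      # cosine similarity in [-1, 1]
    dist = (1.0 - sim).clamp(min=0)    # turn similarity into a dissimilarity

# Condensed (upper-triangle, row-major) distances for scipy's linkage.
iu = torch.triu_indices(len(X), len(X), offset=1)
Z = linkage(dist[iu[0], iu[1]].numpy(), method="average")
print(Z[:5])  # first merges of the dendrogram
```

The key design choice mirrored from the abstract is that the similarity function is learned end to end from the data rather than fixed a priori, so the same trained network can be applied to samples of any size before the hierarchy is extracted.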
Main file: GSI2021_Gigli.pdf (1.73 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03228070 , version 1 (17-05-2021)

Identifiers

  • HAL Id : hal-03228070 , version 1

Cite

Leonardo Gigli, Beatriz Marcotegui, Santiago Velasco-Forero. End-to-End Similarity Learning and Hierarchical Clustering for unfixed size datasets. 5th conference on Geometric Science of Information, Jul 2021, Paris, France. ⟨hal-03228070⟩
4696 views
149 downloads
