Abstract: We consider the widely used average-linkage, single-linkage, and Ward's methods for computing hierarchical clusterings of high-dimensional Euclidean inputs. It is easy to show that there is no efficient implementation of these algorithms in high-dimensional Euclidean space, since any implementation implicitly requires solving the closest pair problem, a notoriously difficult problem. However, how fast can these algorithms be implemented if we allow approximation? More precisely: these algorithms successively merge the clusters that are at closest average distance (for average-linkage), at minimum distance (for single-linkage), or that induce the least sum-of-squares error (for Ward's). We ask whether one can obtain a significant running-time improvement if the algorithm is allowed to merge γ-approximate closest clusters (namely, clusters whose distance (average, minimum, or sum-of-squares error) is at most γ times the distance of the closest clusters). We show that one can indeed take advantage of this relaxation and compute the approximate hierarchical clustering tree using Õ(n) γ-approximate nearest neighbor queries. This leads to an algorithm running in time Õ(nd)·n^{1+O(1/γ)} for d-dimensional Euclidean space. We then provide experiments showing that these algorithms perform as well as the non-approximate versions on classic classification tasks while achieving a significant speed-up.
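To make the merge rule concrete, the following is a minimal sketch of agglomerative single-linkage clustering. The pair search below is exact and naive (quadratic per step); the paper's speed-up comes from replacing exactly this search with γ-approximate nearest-neighbor queries, so that any pair within a factor γ of the closest may be merged. The function name and merge-record format are illustrative choices, not from the paper.

```python
import math

def single_linkage(points):
    """Return the merge sequence (a flat dendrogram) for `points`.

    Each merge is recorded as (cluster_a, cluster_b, distance), where
    clusters are frozensets of point indices.
    """
    clusters = [frozenset([i]) for i in range(len(points))]
    merges = []
    while len(clusters) > 1:
        best = None  # (distance, index_a, index_b) of closest cluster pair
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Single linkage: distance between clusters is the
                # minimum pairwise point distance.
                d = min(
                    math.dist(points[i], points[j])
                    for i in clusters[a]
                    for j in clusters[b]
                )
                if best is None or d < best[0]:
                    best = (d, a, b)
        d, a, b = best
        merges.append((clusters[a], clusters[b], d))
        merged = clusters[a] | clusters[b]
        clusters = [c for k, c in enumerate(clusters) if k not in (a, b)]
        clusters.append(merged)
    return merges
```

In the γ-relaxed variant studied in the paper, the inner search may stop at any pair at distance at most γ times the true minimum, which is what allows the closest-pair bottleneck to be bypassed.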
Document type: Conference papers

Cited literature [39 references]

https://hal.archives-ouvertes.fr/hal-02360775
Submitted on : Wednesday, November 13, 2019 - 9:17:03 AM
Last modification on : Wednesday, November 20, 2019 - 9:53:03 AM
Long-term archiving on: Friday, February 14, 2020 - 2:02:53 PM

### File

main.pdf
Files produced by the author(s)

### Identifiers

• HAL Id: hal-02360775, version 1

### Citation

Amir Abboud, Vincent Cohen-Addad, Hussein Houdrougé. Subquadratic High-Dimensional Hierarchical Clustering. NeurIPS'19 - 33rd Conference on Neural Information Processing Systems, Dec 2019, Vancouver, Canada. ⟨hal-02360775⟩
