Efficient top rank optimization with gradient boosting for supervised anomaly detection

Abstract: In this paper we address the anomaly detection problem in a supervised setting where positive examples may be very sparse. We tackle this task with a learning-to-rank strategy by optimizing a differentiable smoothed surrogate of the so-called Average Precision (AP). Despite its non-convexity, we show how to use it efficiently in a stochastic gradient boosting framework. We show that optimizing AP is much better suited to ranking the top alerts than state-of-the-art measures, and we demonstrate on anomaly detection tasks that the benefit of our method is even stronger in highly unbalanced scenarios.
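As an illustration of the idea sketched in the abstract, here is a minimal NumPy sketch of a sigmoid-smoothed AP surrogate: the hard indicator 1[s_j > s_i] in the rank and precision counts is replaced by a sigmoid with a temperature `tau`. The exact smoothing and the `tau` parameter are illustrative assumptions, not the authors' definitions from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def smoothed_ap(scores, labels, tau=0.1):
    """Differentiable surrogate of Average Precision (illustrative).

    Hard indicators 1[s_j > s_i] are replaced by sigmoids with
    temperature tau; as tau -> 0 this recovers exact AP.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    pos = np.flatnonzero(labels == 1)
    # diff[i, j] = s_j - s_i  (pairwise score differences)
    diff = scores[None, :] - scores[:, None]
    s = sigmoid(diff / tau)
    np.fill_diagonal(s, 0.0)  # exclude self-comparisons
    ap = 0.0
    for i in pos:
        rank_i = 1.0 + s[i].sum()          # smoothed rank of example i
        pos_above = 1.0 + s[i, pos].sum()  # smoothed count of positives at or above i
        ap += pos_above / rank_i
    return ap / len(pos)
```

Because the surrogate is smooth in the scores, its gradient with respect to each score can be fed to a (stochastic) gradient boosting learner as the pseudo-residual at each round; with a small `tau` the value closely tracks exact AP (e.g. scores `[3, 2, 1, 0]` with labels `[1, 0, 1, 0]` give roughly 5/6, the exact AP of that ranking).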
Document type: Conference papers

Cited literature: 26 references

https://hal.archives-ouvertes.fr/hal-01613561
Contributor: Marc Sebban
Submitted on: Monday, October 9, 2017 - 5:40:26 PM
Last modification on: Thursday, July 26, 2018 - 1:10:39 AM
Document(s) archived on: Wednesday, January 10, 2018 - 3:14:27 PM

File

ECML2017.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01613561, version 1

Citation

Jordan Frery, Amaury Habrard, Marc Sebban, Olivier Caelen, Liyun He-Guelton. Efficient top rank optimization with gradient boosting for supervised anomaly detection. ECML-PKDD 2017, Sep 2017, Skopje, Macedonia. ⟨hal-01613561⟩

Metrics

Record views: 260
Files downloads: 181