Constrained Convex Neyman-Pearson Classification Using an Outer Approximation Splitting Method
Abstract
This paper presents an algorithm for Neyman-Pearson classification. While empirical risk
minimization approaches focus on minimizing a global risk, the Neyman-Pearson framework
minimizes the type II risk under an upper-bound constraint on the type I risk. Since
the 0/1 loss function is not convex, optimization methods employ convex surrogates that
lead to tractable minimization problems. As shown in recent work, statistical bounds can
be derived to quantify the cost of using such surrogates instead of the exact 0/1 loss.
However, no specific algorithm has yet been proposed to actually solve the resulting minimization
problem numerically. The contribution of this paper is to propose an efficient
splitting algorithm to address this issue. Our method alternates a gradient step on the objective
surrogate risk and an approximate projection step onto the constraint set, which is
implemented by means of an outer approximation subgradient projection algorithm. Experiments
on both synthetic data and biological data show the efficiency of the proposed method.
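The alternating scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a linear classifier, the hinge loss as the convex surrogate for both risks, and a single Polyak-style subgradient (halfspace) projection as the approximate projection onto the type I constraint set; the function and parameter names are hypothetical.

```python
import numpy as np

def np_classifier(X0, X1, alpha, lr=0.1, iters=500):
    """Hedged sketch of a splitting scheme for Neyman-Pearson classification.

    X0: samples from class 0 (type I errors are false alarms on these).
    X1: samples from class 1 (type II errors are misses on these).
    alpha: upper bound on the surrogate type I risk.
    """
    d = X0.shape[1]
    w = np.zeros(d)

    def hinge_risk(w, X, y):
        # Average hinge loss max(0, 1 - y <w, x>), y in {-1, +1}.
        return np.mean(np.maximum(0.0, 1.0 - y * (X @ w)))

    def hinge_subgrad(w, X, y):
        # A subgradient of the average hinge loss at w.
        active = (1.0 - y * (X @ w) > 0).astype(float)
        return -(X * (active * y)[:, None]).mean(axis=0)

    y0 = -np.ones(len(X0))  # class 0 should be labeled -1
    y1 = np.ones(len(X1))   # class 1 should be labeled +1

    for _ in range(iters):
        # (1) Gradient step on the surrogate type II risk.
        w = w - lr * hinge_subgrad(w, X1, y1)
        # (2) Approximate projection onto {w : R1(w) <= alpha}:
        # if the constraint is violated, project onto the halfspace
        # defined by a subgradient of the constraint function
        # (outer approximation of the constraint set).
        viol = hinge_risk(w, X0, y0) - alpha
        if viol > 0:
            g = hinge_subgrad(w, X0, y0)
            w = w - (viol / (g @ g + 1e-12)) * g
    return w
```

The halfspace step replaces the (generally intractable) exact projection onto the sublevel set of the surrogate type I risk with a closed-form projection onto a supporting halfspace, which is the basic idea behind outer approximation subgradient projections.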