Optimization with Sparsity-Inducing Penalties

Francis Bach (1,2), Rodolphe Jenatton (1,2), Julien Mairal (3), Guillaume Obozinski (2,1)
2 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Department of Computer Science, École normale supérieure (ENS Paris); Inria Paris-Rocquencourt; CNRS - Centre National de la Recherche Scientifique, UMR 8548
Abstract: Sparse estimation methods aim at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection, but numerous extensions have since emerged, such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate non-smooth norms. The goal of this paper is to present, from a general perspective, optimization tools and techniques dedicated to such sparsity-inducing penalties. We cover proximal methods, block-coordinate descent, reweighted $\ell_2$-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provide an extensive set of experiments to compare various algorithms from a computational point of view.
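
To make the setting above concrete, here is a minimal sketch (not the authors' code) of one technique the paper surveys: the proximal gradient method (ISTA) applied to the $\ell_1$-penalized least-squares (lasso) problem $\min_w \frac{1}{2}\|Aw-b\|_2^2 + \lambda\|w\|_1$. The step size, iteration count, and problem dimensions are illustrative assumptions; Python/NumPy is used for concreteness.

    import numpy as np

    def soft_threshold(x, t):
        # Proximal operator of t * ||.||_1: shrink each coordinate toward zero.
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def ista(A, b, lam, n_iter=500):
        # Proximal gradient descent on 0.5 * ||A w - b||^2 + lam * ||w||_1.
        # Fixed step 1/L, where L = ||A||_2^2 bounds the Lipschitz constant
        # of the smooth term's gradient.
        L = np.linalg.norm(A, ord=2) ** 2
        w = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ w - b)                   # gradient of the smooth part
            w = soft_threshold(w - grad / L, lam / L)  # proximal (thresholding) step
        return w

    # Illustrative usage on a synthetic sparse-recovery problem.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    w_true = np.zeros(100)
    w_true[:5] = 1.0
    w_hat = ista(A, A @ w_true, lam=0.1)
    print(np.count_nonzero(w_hat))  # far fewer than 100 nonzeros

The soft-thresholding step is the proximal operator of the $\ell_1$ norm and sets coordinates exactly to zero, which is how the penalty induces sparsity; the block-coordinate descent and reweighted $\ell_2$ schemes covered in the paper reach similar sparse solutions by different routes.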

Identifiers

  • HAL Id : hal-00613125, version 2 (https://hal.archives-ouvertes.fr/hal-00613125)
  • arXiv : 1108.0775

Citation

Francis Bach, Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski. Optimization with Sparsity-Inducing Penalties. Foundations and Trends in Machine Learning, Now Publishers, 2011. ⟨hal-00613125v2⟩
