Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression

Abstract: In high dimension, it is customary to consider Lasso-type estimators to enforce sparsity. For standard Lasso theory to hold, the regularization parameter should be proportional to the noise level, which is often unknown in practice. A remedy is to consider estimators such as the Concomitant Lasso, which jointly optimize over the regression coefficients and the noise level. However, when data from different sources are pooled to increase sample size, noise levels differ and new dedicated estimators are needed. We provide new statistical and computational solutions to perform heteroscedastic regression, with an emphasis on brain imaging with magneto- and electroencephalography (M/EEG). When instantiated to de-correlated noise, our framework leads to an efficient algorithm whose computational cost is not higher than for the Lasso, but addresses more complex noise structures. Experiments demonstrate improved prediction and support identification with correct estimation of noise levels.
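The Concomitant Lasso mentioned in the abstract jointly optimizes over the regression coefficients and the noise level. Below is a minimal single-task sketch of the idea via alternating minimization, assuming the smoothed formulation with a lower bound `sigma_min` on the noise level; the function and variable names are illustrative, and this is not the paper's multi-task or generalized-noise algorithm:

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def concomitant_lasso(X, y, lam, sigma_min=1e-3, n_iter=500):
    """Alternating minimization sketch for the (smoothed) Concomitant Lasso:

        min_{beta, sigma >= sigma_min}
            ||y - X beta||^2 / (2 n sigma) + sigma / 2 + lam * ||beta||_1
    """
    n, p = X.shape
    beta = np.zeros(p)
    # Lipschitz constant of the smooth term is ||X||_2^2 / (n * sigma);
    # precompute the sigma-independent part.
    lip = np.linalg.norm(X, ord=2) ** 2 / n
    sigma = max(np.linalg.norm(y) / np.sqrt(n), sigma_min)
    for _ in range(n_iter):
        # beta-step: one ISTA pass on the Lasso part, at the current sigma.
        step = sigma / lip
        grad = X.T @ (X @ beta - y) / (n * sigma)
        beta = soft_threshold(beta - step * grad, step * lam)
        # sigma-step: closed form (residual norm), clipped at sigma_min.
        sigma = max(np.linalg.norm(y - X @ beta) / np.sqrt(n), sigma_min)
    return beta, sigma
```

The sigma-step has a closed-form solution, and the clipping at `sigma_min` is what makes the smoothed problem well posed when residuals shrink toward zero; the regularization seen by the beta-step is effectively `lam * sigma`, which is why the estimator adapts to an unknown noise level.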
Document type: Conference papers

Cited literature: 35 references

https://hal.archives-ouvertes.fr/hal-01812011
Contributor: Mathurin Massias
Submitted on: Monday, June 11, 2018 - 10:27:14 AM
Last modification on: Monday, February 10, 2020 - 6:14:08 PM
Long-term archiving on: Wednesday, September 12, 2018 - 9:37:03 PM

File: massias18a.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-01812011, version 1

Citation

Mathurin Massias, Olivier Fercoq, Alexandre Gramfort, Joseph Salmon. Generalized Concomitant Multi-Task Lasso for Sparse Multimodal Regression. 21st International Conference on Artificial Intelligence and Statistics (AISTATS 2018), Apr 2018, Lanzarote, Spain. ⟨hal-01812011⟩
