NeuMiss networks: differentiable programming for supervised learning with missing values - HAL open archive
Conference paper, 2020

NeuMiss networks: differentiable programming for supervised learning with missing values

Abstract

The presence of missing values makes supervised learning much more challenging. Indeed, previous work has shown that even when the response is a linear function of the complete data, the optimal predictor is a complex function of the observed entries and the missingness indicator. As a result, the computational or sample complexities of consistent approaches depend on the number of missing patterns, which can be exponential in the number of dimensions. In this work, we derive the analytical form of the optimal predictor under a linearity assumption and various missing data mechanisms including Missing at Random (MAR) and self-masking (Missing Not At Random). Based on a Neumann-series approximation of the optimal predictor, we propose a new principled architecture, named NeuMiss networks. Their originality and strength come from the use of a new type of non-linearity: the multiplication by the missingness indicator. We provide an upper bound on the Bayes risk of NeuMiss networks, and show that they have good predictive accuracy with both a number of parameters and a computational complexity independent of the number of missing data patterns. As a result they scale well to problems with many features, and remain statistically efficient for medium-sized samples. Moreover, we show that, contrary to procedures using EM or imputation, they are robust to the missing data mechanism, including difficult MNAR settings such as self-masking.
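To make the core idea concrete, here is a minimal NumPy sketch of the mechanism the abstract describes: a truncated Neumann series that approximates the inverse of the covariance restricted to the observed coordinates, where the only nonlinearity is multiplication by the missingness indicator. The function name `neumann_predictor`, the toy covariance, and the depth are illustrative assumptions, not the paper's reference implementation (which uses learnable weights in place of the fixed covariance).

```python
import numpy as np

rng = np.random.default_rng(0)

def neumann_predictor(sigma, x, mask, depth):
    """Truncated Neumann-series solve of Sigma_obs^{-1} x_obs.

    Relies on the identity A^{-1} = sum_{k>=0} (I - A)^k, valid when the
    spectral radius of (I - A) is below 1. Multiplying by the missingness
    mask at every iteration restricts the recursion to the observed
    coordinates -- the nonlinearity highlighted in the abstract.
    """
    h = x * mask                    # zero-impute the missing entries
    acc = h.copy()
    for _ in range(depth):
        h = (h - sigma @ h) * mask  # apply (I - Sigma), project on observed
        acc += h
    return acc

# Toy, well-conditioned covariance so the series converges.
d = 5
B = rng.normal(size=(d, d))
sigma = np.eye(d) + 0.03 * (B + B.T)
mask = np.array([1.0, 1.0, 0.0, 1.0, 0.0])   # 1 = observed, 0 = missing
x = rng.normal(size=d)

approx = neumann_predictor(sigma, x, mask, depth=50)

# Reference: exact solve on the observed submatrix.
obs = mask.astype(bool)
exact = np.zeros(d)
exact[obs] = np.linalg.solve(sigma[np.ix_(obs, obs)], x[obs])
print(np.allclose(approx, exact, atol=1e-6))  # True
```

In a NeuMiss network, each iteration of this loop becomes a layer with a learnable weight matrix instead of the fixed `sigma`, which is why the parameter count and the per-example cost do not depend on the number of missingness patterns.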
Main file: main.pdf (4.06 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02888867 , version 1 (03-07-2020)
hal-02888867 , version 2 (12-10-2020)
hal-02888867 , version 3 (13-10-2020)

Identifiers

Cite

Marine Le Morvan, Julie Josse, Thomas Moreau, Erwan Scornet, Gaël Varoquaux. NeuMiss networks: differentiable programming for supervised learning with missing values. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. ⟨hal-02888867v3⟩
1,023 views
350 downloads

