Conference papers

Fair Regression via Plug-in Estimator and Recalibration With Statistical Guarantees

Abstract: We study the problem of learning an optimal regression function subject to a fairness constraint: conditionally on the sensitive feature, the distribution of the function's output must remain the same. This constraint naturally extends the notion of demographic parity, often used in classification, to the regression setting. We tackle the problem by leveraging a discretized proxy version, for which we derive an explicit expression for the optimal fair predictor. This result naturally suggests a two-stage approach: first estimate the (unconstrained) regression function from a set of labeled data, then recalibrate it with a separate set of unlabeled data. The recalibration step can be performed efficiently via smooth optimization. We derive rates of convergence of the proposed estimator to the optimal fair predictor, both in terms of risk and of the fairness constraint. Finally, we present numerical experiments illustrating that the proposed method is often superior to, or competitive with, state-of-the-art methods.
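To make the two-stage idea concrete, the sketch below shows one possible recalibration step for demographic parity: predictions from an unconstrained regressor are mapped, via each group's empirical CDF on unlabeled data, to a weighted average of group-wise quantile functions, so that every group's recalibrated output follows (approximately) the same distribution. This quantile-alignment construction and the function name `recalibrate` are illustrative assumptions; the paper itself works with a discretized problem and performs the recalibration by smooth optimization.

```python
import numpy as np

def recalibrate(scores_unlab, groups_unlab):
    """Build a demographic-parity recalibration map from unlabeled data.

    scores_unlab : unconstrained predictions f(X) on unlabeled points
    groups_unlab : sensitive attribute for the same points
    Returns a function g(score, group) whose output distribution is
    approximately the same for every group.
    """
    uniq, counts = np.unique(groups_unlab, return_counts=True)
    weights = counts / counts.sum()
    # Per-group sorted predictions: empirical CDF / quantile function.
    sorted_scores = {s: np.sort(scores_unlab[groups_unlab == s]) for s in uniq}

    def cdf(s, v):
        arr = sorted_scores[s]
        return np.searchsorted(arr, v, side="right") / len(arr)

    def fair_predict(score, group):
        # Rank of the score within its own group ...
        u = np.clip(cdf(group, score), 0.0, 1.0)
        # ... mapped to the weighted average of all group-wise quantiles,
        # so the output law no longer depends on the group.
        return sum(w * np.quantile(sorted_scores[s], u)
                   for s, w in zip(uniq, weights))

    return fair_predict
```

In this sketch the first stage (fitting f on labeled data) is any off-the-shelf regressor; only its predictions on the unlabeled set enter the recalibration, which mirrors the paper's use of unlabeled data in the second stage.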


https://hal.archives-ouvertes.fr/hal-02501190
Contributor: Evgenii Chzhen
Submitted on: Friday, March 6, 2020 - 4:01:14 PM
Last modification on: Friday, April 30, 2021 - 9:52:37 AM
Long-term archiving on: Sunday, June 7, 2020 - 3:01:00 PM

Files

main.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02501190, version 1

Citation

Evgenii Chzhen, Christophe Denis, Mohamed Hebiri, Luca Oneto, Massimiliano Pontil. Fair Regression via Plug-in Estimator and Recalibration With Statistical Guarantees. NeurIPS 2020 - 34th Conference on Neural Information Processing Systems, Dec 2020, Vancouver / Virtual, Canada. ⟨hal-02501190⟩


Metrics

  • Record views: 489
  • File downloads: 578