
Fair Regression via Plug-in Estimator and Recalibration With Statistical Guarantees

Abstract: We study the problem of learning an optimal regression function subject to a fairness constraint. The constraint requires that the distribution of the function's output be the same conditionally on each value of the sensitive feature. It naturally extends the notion of demographic parity, often used in classification, to the regression setting. We tackle this problem by leveraging a discretized proxy version of it, for which we derive an explicit expression of the optimal fair predictor. This result naturally suggests a two-stage approach: we first estimate the (unconstrained) regression function from a set of labeled data, and then recalibrate it with another set of unlabeled data. The recalibration step can be performed efficiently via smooth optimization. We derive rates of convergence of the proposed estimator to the optimal fair predictor, both in terms of risk and of the fairness constraint. Finally, we present numerical experiments illustrating that the proposed method is superior to, or competitive with, state-of-the-art methods.
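The two-stage recipe described above (fit an unconstrained regressor on labeled data, then recalibrate its outputs on unlabeled data so that their distribution no longer depends on the sensitive feature) can be illustrated with a toy sketch. Note that this is a simplified quantile-matching recalibration, not the authors' discretization-based smooth optimization; the function names and the synthetic setup below are purely illustrative assumptions.

```python
import numpy as np

def fit_regression(X, y):
    """Stage 1: unconstrained least-squares fit on labeled data."""
    A = np.c_[X, np.ones(len(X))]          # add an intercept column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Z: np.c_[Z, np.ones(len(Z))] @ w

def recalibrate(scores, s, grid_size=100):
    """Stage 2 (toy version): push each group's scores through its own
    empirical quantile function onto the average of the group-wise
    quantile curves, so the output distribution is approximately the
    same for every value of the sensitive feature s."""
    qs = np.linspace(0.01, 0.99, grid_size)
    groups = np.unique(s)
    group_q = {g: np.quantile(scores[s == g], qs) for g in groups}
    avg_q = np.mean([group_q[g] for g in groups], axis=0)
    out = np.empty_like(scores)
    for g in groups:
        # rank each prediction within its group, then read the rank
        # off the common (averaged) quantile curve
        ranks = np.interp(scores[s == g], group_q[g], qs)
        out[s == g] = np.interp(ranks, qs, avg_q)
    return out
```

On synthetic data where predictions differ across groups, the recalibrated outputs have nearly identical group-wise distributions, at the cost of a controlled change in accuracy, which mirrors the risk/fairness trade-off quantified in the paper.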
Document type: Preprints, Working Papers, ...

Contributor: Evgenii Chzhen
Submitted on: Friday, March 6, 2020 - 4:01:14 PM
Last modification on: Thursday, March 19, 2020 - 12:26:03 PM




  • HAL Id: hal-02501190, version 1



Evgenii Chzhen, Christophe Denis, Mohamed Hebiri, Luca Oneto, Massimiliano Pontil. Fair Regression via Plug-in Estimator and Recalibration With Statistical Guarantees. 2020. ⟨hal-02501190⟩


