Iterative Isotonic Regression - Archive ouverte HAL
Journal Articles ESAIM: Probability and Statistics Year: 2015

Iterative Isotonic Regression

Abstract

This article introduces a new nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition, which states that a function of bounded variation can be written as the sum of a non-decreasing function and a non-increasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main technical result of the paper is the consistency of the proposed estimator when the number of iterations $k_n$ grows appropriately with the sample size $n$. The proof requires two auxiliary results that are of interest in their own right: first, we generalize the well-known consistency property of isotonic regression to the case of a non-monotone regression function, and second, we relate the backfitting algorithm to von Neumann's algorithm in convex analysis.
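
The abstract describes the estimator as a backfitting loop over two isotonic fits, one non-decreasing and one non-increasing. The following Python sketch illustrates that idea, assuming scikit-learn's IsotonicRegression as the isotonic solver; the function name, the fixed iteration count k, and the synthetic example are illustrative choices and not the authors' reference implementation.

import numpy as np
from sklearn.isotonic import IsotonicRegression

def iterative_isotonic_regression(x, y, k=10):
    """Backfitting between a non-decreasing and a non-increasing isotonic fit,
    following the Jordan decomposition idea sketched in the abstract."""
    u = np.zeros_like(y, dtype=float)   # non-decreasing component
    d = np.zeros_like(y, dtype=float)   # non-increasing component
    iso_up = IsotonicRegression(increasing=True)
    iso_down = IsotonicRegression(increasing=False)
    for _ in range(k):
        # Fit the increasing part to the partial residuals y - d ...
        u = iso_up.fit_transform(x, y - d)
        # ... then the decreasing part to the partial residuals y - u.
        d = iso_down.fit_transform(x, y - u)
    return u + d

# Toy usage on a bounded-variation, non-monotone target.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
f = np.sin(2 * np.pi * x)
y = f + 0.3 * rng.standard_normal(x.size)
f_hat = iterative_isotonic_regression(x, y, k=10)

As the abstract indicates, the number of iterations k_n is the tuning parameter of the procedure: consistency is obtained when it grows appropriately with the sample size n.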

Dates and versions

hal-00832863, version 1 (11-06-2013)

Cite

Arnaud Guyader, Nick Hengartner, Nicolas Jégou, Eric Matzner-Løber. Iterative Isotonic Regression. ESAIM: Probability and Statistics, 2015, 19, pp.1-23. ⟨10.1051/ps/2014012⟩. ⟨hal-00832863⟩