Asymptotics for Regression Models Under Loss of Identifiability

Abstract: This paper discusses the asymptotic behavior of regression models under general conditions, in particular when the set of true parameters has positive dimension and the true model is not identifiable. First, we give a general inequality for the difference between the sum of squared errors (SSE) of the estimated regression model and the SSE of the theoretical true regression function in our model. A set of generalized derivative functions is the key tool in deriving this inequality. Under a suitable Donsker condition on this set, we provide the asymptotic distribution of the difference of SSEs. We show how to obtain this Donsker property for parametric models even when the parameters characterizing the best regression function are not unique. This result is applied to neural network regression models with redundant hidden units when loss of identifiability occurs, and it gives some hints on how to penalize such models to avoid over-fitting.
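The central quantity of the abstract, the difference between the SSE of an estimated model and the SSE of the true regression function, can be illustrated numerically. The following is a toy sketch, not from the paper: it uses a simple least-squares line instead of a neural network, and the true regression function is taken to be f(x) = 0, so the difference is always non-positive because least squares minimizes the SSE over a class containing f.

```python
import numpy as np

# Toy illustration (assumed setup, not the paper's model):
# true regression function f(x) = 0, observations y = f(x) + noise.
rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(-1.0, 1.0, n)
y = rng.normal(0.0, 1.0, n)

# Estimated model: least-squares fit of a + b*x (closed form).
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

sse_hat = np.sum((y - X @ coef) ** 2)   # SSE of the estimated model
sse_true = np.sum((y - 0.0) ** 2)       # SSE of the true function f(x) = 0

# The paper studies the asymptotic distribution of this difference;
# here it is a single realized value for one sample.
diff = sse_hat - sse_true
print("SSE(estimated) - SSE(true) =", diff)
```

The fitted class contains the true function (a = b = 0), so `sse_hat <= sse_true` for every sample; the paper's results describe how this difference behaves asymptotically, including the non-identifiable case where many parameter values realize the best regression function.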
Document type:
Journal article
Sankhya A, Springer Verlag, 2016, 78 (2), pp.155-179

https://hal.archives-ouvertes.fr/hal-01520204
Contributor: Joseph Rynkiewicz
Submitted on: Wednesday, May 10, 2017 - 09:50:58
Last modified on: Monday, November 27, 2017 - 14:14:02

Identifiers

  • HAL Id : hal-01520204, version 1


Citation

Joseph Rynkiewicz. Asymptotics for Regression Models Under Loss of Identifiability. Sankhya A, Springer Verlag, 2016, 78 (2), pp.155-179. 〈hal-01520204〉
