Sequential dimension reduction for learning features of expensive black-box functions

Abstract: Learning a feature of an expensive black-box function (optimum, contour line, ...) is a difficult task when the dimension increases. A classical approach is two-stage. First, sensitivity analysis is performed to reduce the dimension of the input variables. Second, the feature is estimated by considering only the selected influential variables. This approach can be computationally expensive and may lack flexibility, since dimension reduction is done once and for all. In this paper, we propose a so-called Split-and-Doubt algorithm that sequentially performs both dimension reduction and feature-oriented sampling. The 'split' step identifies influential variables. This selection relies on new theoretical results on Gaussian process regression: we prove that large correlation lengths of covariance functions correspond to inactive variables. Then, in the 'doubt' step, a doubt function is used to update the subset of influential variables. Numerical tests show the efficiency of the Split-and-Doubt algorithm.
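The screening principle stated in the abstract, that input variables whose fitted correlation lengths become large are inactive, can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes scikit-learn and uses an anisotropic RBF kernel whose maximum-likelihood lengthscales serve as influence indicators:

```python
# Illustrative sketch (not the paper's code): fit a Gaussian process with an
# anisotropic (ARD) RBF kernel and inspect the estimated correlation lengths.
# Variables with large fitted lengthscales barely influence the output, which
# is the property the 'split' step exploits to select influential variables.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
d = 5
X = rng.uniform(size=(80, d))      # 80 samples in [0, 1]^5
y = np.sin(3.0 * X[:, 0])          # toy black box: depends on x0 only

# One lengthscale per input dimension; a wide upper bound lets inactive
# dimensions drift toward large values during likelihood maximization.
kernel = RBF(length_scale=np.ones(d), length_scale_bounds=(1e-2, 1e3))
gpr = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=3,
                               random_state=0).fit(X, y)

lengths = gpr.kernel_.length_scale
print(lengths)  # lengths[0] small (active), the rest large (inactive)
```

Ranking the inverse lengthscales then gives a simple screening of the candidate variables; the paper's contribution is to make this selection sequential and to couple it with the 'doubt' step rather than fixing it once and for all.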
Document type:
Preprint, working paper

Cited literature: [34 references]
Contributor: Malek Ben Salem
Submitted on: Friday, 19 January 2018 - 12:30:58
Last modified on: Tuesday, 30 October 2018 - 11:46:02
Document(s) archived on: Thursday, 24 May 2018 - 08:32:44


split_and_doubt (3).pdf
Files produced by the author(s)


  • HAL Id: hal-01688329, version 1


Malek Ben Salem, François Bachoc, Olivier Roustant, Fabrice Gamboa, Lionel Tomaso. Sequential dimension reduction for learning features of expensive black-box functions. 2018. 〈hal-01688329v1〉


