Sequential dimension reduction for learning features of expensive black-box functions

Abstract: Learning a feature of an expensive black-box function (optimum, contour line, ...) is a difficult task when the dimension increases. A classical approach is two-stage. First, sensitivity analysis is performed to reduce the dimension of the input variables. Second, the feature is estimated by considering only the selected influential variables. This approach can be computationally expensive and may lack flexibility, since dimension reduction is done once and for all. In this paper, we propose a so-called Split-and-Doubt algorithm that sequentially performs both dimension reduction and feature-oriented sampling. The 'split' step identifies influential variables. This selection relies on new theoretical results on Gaussian process regression: we prove that large correlation lengths of covariance functions correspond to inactive variables. Then, in the 'doubt' step, a doubt function is used to update the subset of influential variables. Numerical tests show the efficiency of the Split-and-Doubt algorithm.
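The link between large correlation lengths and inactive variables can be illustrated with a small sketch. This is not the paper's Split-and-Doubt algorithm (whose selection criterion rests on the authors' theoretical results); it is only a minimal example, using scikit-learn's anisotropic RBF kernel, of how a fitted Gaussian process tends to assign a large length scale to a variable the function does not depend on. The test function and all parameter choices below are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy black-box: depends on x1 and x2 only; x3 is inactive.
X = rng.uniform(size=(80, 3))
y = np.sin(4.0 * X[:, 0]) + X[:, 1] ** 2

# Anisotropic RBF kernel: one correlation length per input dimension.
kernel = RBF(length_scale=[1.0, 1.0, 1.0], length_scale_bounds=(1e-2, 1e3))
gp = GaussianProcessRegressor(
    kernel=kernel,
    alpha=1e-6,               # small jitter for numerical stability
    normalize_y=True,
    n_restarts_optimizer=2,   # guard against poor local optima
).fit(X, y)

# After maximum-likelihood fitting, the inactive variable x3 typically
# receives a much larger correlation length than the active ones.
ls = gp.kernel_.length_scale
print(ls)
```

Ranking variables by the inverse of the fitted length scales then gives a simple (heuristic) measure of influence, which is the intuition formalized in the paper.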
Document type :
Preprints, Working Papers, ...

Cited literature [34 references]
Contributor: Malek Ben Salem
Submitted on: Friday, January 19, 2018 - 12:30:58 PM
Last modification on: Friday, April 12, 2019 - 4:22:13 PM
Archived on: Thursday, May 24, 2018 - 8:32:44 AM


split_and_doubt (3).pdf
Files produced by the author(s)


  • HAL Id: hal-01688329, version 1


Malek Ben Salem, François Bachoc, Olivier Roustant, Fabrice Gamboa, Lionel Tomaso. Sequential dimension reduction for learning features of expensive black-box functions. 2018. ⟨hal-01688329v1⟩


