Approximation results regarding the multiple-output Gaussian gated mixture of linear experts model

Abstract: Mixture of experts (MoE) models are a class of artificial neural networks that can be used for functional approximation and probabilistic modeling. An important subclass is the class of mixture of linear experts (MoLE) models, where the expert functions map to real topological output spaces. Gaussian gated MoLE models have recently become popular in applied research. A number of powerful approximation results exist for Gaussian gated MoLE models when the output space is univariate. These results guarantee that Gaussian gated MoLE mean functions can approximate arbitrary continuous functions, and that Gaussian gated MoLE models themselves can approximate arbitrary conditional probability density functions. We utilize and extend the univariate approximation results in order to prove a pair of useful results for situations where the output spaces are multivariate. We do this by proving a pair of lemmas regarding the combination of univariate MoLE models, which are interesting in their own right.
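For orientation, a minimal sketch of the model class under discussion, written in our own illustrative notation (the symbols g, pi_z, mu_z, Sigma_z, a_z, B_z, C_z are assumptions and may differ from the paper's): with a d-variate input x and a q-variate output y, the Gaussian gated MoLE conditional density takes the form

f(\mathbf{y} \mid \mathbf{x}) = \sum_{z=1}^{g} \mathrm{Gate}_z(\mathbf{x}) \, \phi_q\!\left(\mathbf{y}; \mathbf{a}_z + \mathbf{B}_z \mathbf{x}, \mathbf{C}_z\right),
\qquad
\mathrm{Gate}_z(\mathbf{x}) = \frac{\pi_z \, \phi_d(\mathbf{x}; \boldsymbol{\mu}_z, \boldsymbol{\Sigma}_z)}{\sum_{k=1}^{g} \pi_k \, \phi_d(\mathbf{x}; \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)},

where phi_p(.; mu, Sigma) denotes the p-variate Gaussian density, pi_z >= 0 with sum_z pi_z = 1, and each expert is linear with intercept a_z, coefficient matrix B_z, and covariance C_z. The mean function whose universal-approximation property the abstract references is then m(x) = sum_z Gate_z(x) (a_z + B_z x).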
Document type: Journal article

Cited literature: 48 references

https://hal.archives-ouvertes.fr/hal-02265793
Contributor: Florence Forbes
Submitted on: Monday, August 12, 2019 - 1:52:40 PM
Last modification on: Monday, August 19, 2019 - 1:11:05 PM

File: 20190513_Manuscript_R1.pdf (produced by the author(s))

Citation

Hien Nguyen, Faicel Chamroukhi, Florence Forbes. Approximation results regarding the multiple-output Gaussian gated mixture of linear experts model. Neurocomputing, Elsevier, in press. DOI: 10.1016/j.neucom.2019.08.014. HAL: hal-02265793.