ESFS: A new embedded feature selection method based on SFS

Abstract: Feature subset selection is an important step when training classifiers in Machine Learning (ML) problems. Too many input features may lead to the so-called "curse of dimensionality": the complexity of adjusting the classifier parameters during training grows exponentially with the number of features. ML algorithms are thus known to suffer a significant drop in prediction accuracy when faced with many unnecessary features. In this paper, we introduce a novel embedded feature selection method, called ESFS, inspired by the wrapper method SFS, since it relies on the simple principle of incrementally adding the most relevant features. Its originality lies in the use of mass functions from evidence theory, which allow the information carried by the features to be merged elegantly, in an embedded way, leading to a lower computational cost than the original SFS. This approach has been applied successfully to the emerging domain of emotion classification in audio signals.
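The greedy principle that ESFS inherits from SFS can be sketched as follows. This is a minimal illustration of sequential forward selection only, not the authors' ESFS method: the feature names, the `toy_score` function, and its relevance weights and redundancy penalty are all hypothetical stand-ins for a real subset-evaluation criterion.

```python
def sequential_forward_selection(features, score, k):
    """Greedy SFS: start from the empty subset and repeatedly add the
    single remaining feature that yields the best-scoring subset."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        # Evaluate each candidate feature appended to the current subset.
        best_feat, best_score = None, float("-inf")
        for f in remaining:
            s = score(selected + [f])
            if s > best_score:
                best_feat, best_score = f, s
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

# Hypothetical per-feature relevance weights for the toy criterion.
weights = {"f1": 0.9, "f2": 0.7, "f3": 0.1}

def toy_score(subset):
    # Sum of relevances minus a small penalty on subset size.
    return sum(weights[f] for f in subset) - 0.05 * len(subset) ** 2

print(sequential_forward_selection(weights, toy_score, 2))  # → ['f1', 'f2']
```

In a wrapper setting, `score` would retrain and evaluate a classifier on each candidate subset; ESFS instead merges feature evidence via mass functions, which is what reduces its computational cost relative to the original SFS.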
Contributor: Emmanuel Dellandrea
Submitted on: Thursday, January 17, 2019 - 11:20:58 AM
Last modification on: Saturday, June 25, 2022 - 9:52:49 AM

  • HAL Id: hal-01984705, version 1


Zhongzhe Xiao, Emmanuel Dellandrea, Weibei Dou, Liming Chen. ESFS: A new embedded feature selection method based on SFS. [Research Report] Ecole Centrale Lyon; Université de Lyon; LIRIS UMR 5205 CNRS/INSA de Lyon/Université Claude Bernard Lyon 1/Université Lumière Lyon 2/École Centrale de Lyon; Tsinghua University, Beijing, China. 2008. ⟨hal-01984705⟩

