ESFS: A new embedded feature selection method based on SFS

Abstract: Feature subset selection is an important step when training classifiers in Machine Learning (ML) problems. Too many input features may lead to the so-called "curse of dimensionality", whereby the complexity of adjusting the classifier parameters during training grows exponentially with the number of features. ML algorithms are therefore known to suffer a significant drop in prediction accuracy when faced with many unnecessary features. In this paper, we introduce a novel embedded feature selection method, called ESFS, inspired by the wrapper method SFS since it relies on the simple principle of incrementally adding the most relevant features. Its originality lies in the use of mass functions from evidence theory, which allow the information carried by the features to be merged elegantly in an embedded way, leading to a lower computational cost than the original SFS. This approach has been successfully applied to the emerging domain of emotion classification in audio signals.
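To make the greedy principle mentioned above concrete, here is a minimal sketch of plain sequential forward selection (SFS), the wrapper method ESFS builds on. This is illustrative only: the `score` function and toy data are placeholders, not the paper's evidence-theory mass functions, and `sfs` is a hypothetical helper name.

```python
def sfs(features, score, k):
    """Greedily add the feature that most improves `score` until k are chosen
    or no remaining feature improves the current subset (plain SFS sketch)."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        # Pick the candidate whose addition yields the best subset score.
        best = max(remaining, key=lambda f: score(selected + [f]))
        if selected and score(selected + [best]) <= score(selected):
            break  # no remaining feature improves the subset; stop early
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: score a subset by how close its sum is to a target value.
score = lambda subset: -abs(sum(subset) - 10)
print(sfs([1, 3, 5, 7, 9], score, 3))  # → [9, 1]
```

In a real classification setting, `score` would be a cross-validated accuracy estimate; ESFS instead evaluates and merges feature relevance through mass functions, which avoids retraining a full classifier at each step.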

Cited literature: 13 references

https://hal.archives-ouvertes.fr/hal-01984705
Contributor: Emmanuel Dellandrea
Submitted on: Thursday, January 17, 2019 - 11:20:58 AM
Last modification on: Thursday, November 21, 2019 - 2:37:50 AM

File

10.1.1.870.2176.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01984705, version 1

Citation

Zhongzhe Xiao, Emmanuel Dellandrea, Weibei Dou, Liming Chen. ESFS: A new embedded feature selection method based on SFS. [Research Report] Ecole Centrale Lyon; Université de Lyon; LIRIS UMR 5205 CNRS/INSA de Lyon/Université Claude Bernard Lyon 1/Université Lumière Lyon 2/École Centrale de Lyon; Tsinghua University, Beijing, China. 2008. ⟨hal-01984705⟩

Metrics

Record views: 63
Files downloads: 94