Forced-exploration free Strategies for Unimodal Bandits - Archive ouverte HAL
Preprint / Working paper, Year: 2020

Forced-exploration free Strategies for Unimodal Bandits

Hassan Saber
  • Role: Author
Pierre Ménard
  • Role: Author

Abstract

We consider a multi-armed bandit problem specified by a set of Gaussian or Bernoulli distributions endowed with a unimodal structure. Although this problem has been addressed in the literature (Combes and Proutiere, 2014), the state-of-the-art algorithms for this structure rely on a forced-exploration mechanism. We introduce IMED-UB, the first forced-exploration-free strategy that exploits the unimodal structure, adapting to this setting the Indexed Minimum Empirical Divergence (IMED) strategy introduced by Honda and Takemura (2015). This strategy is proven optimal. We then derive KLUCB-UB, a KLUCB version of IMED-UB, which is also proven optimal. Owing to our proof technique, we are further able to provide a concise finite-time analysis of both strategies in a unified way. Numerical experiments show that IMED-UB and KLUCB-UB perform similarly in practice and outperform the state-of-the-art algorithms.
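The key idea summarized above is that a unimodal structure lets the learner restrict attention to the empirical best arm and its immediate neighbors, so no forced-exploration schedule is needed. A minimal Python sketch of an IMED-type index rule in that spirit, assuming Bernoulli arms on a line (path) graph; the function names and this concrete realization are illustrative, not the paper's exact pseudocode:

```python
import math

def kl_bernoulli(p, q, eps=1e-12):
    """Bernoulli KL divergence kl(p, q), clipped away from {0, 1} for safety."""
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def imed_ub_choose(means, counts):
    """Pick the next arm to pull.

    IMED-style index: N_a * kl(mu_a, mu_best) + log(N_a), minimized over
    the empirical best arm and its neighbors on the line graph, which is
    where the unimodal structure concentrates exploration.
    """
    best = max(range(len(means)), key=lambda a: means[a])
    # Neighborhood of the empirical best arm in a path graph.
    candidates = {best, max(best - 1, 0), min(best + 1, len(means) - 1)}

    def index(a):
        if counts[a] == 0:
            return 0.0  # an unexplored candidate gets the minimal index
        return counts[a] * kl_bernoulli(means[a], means[best]) + math.log(counts[a])

    return min(candidates, key=index)
```

For example, with empirical means `[0.2, 0.5, 0.4]` and equal counts, the rule keeps pulling the empirical best arm (index 1), while a still-unexplored neighbor of the best arm is selected first, without any forced-exploration rounds.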
Main file: Forced-exploration free Strategies for Unimodal Bandits.pdf (458.25 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02883907 , version 1 (29-06-2020)

Identifiers

Cite

Hassan Saber, Pierre Ménard, Odalric-Ambrym Maillard. Forced-exploration free Strategies for Unimodal Bandits. 2020. ⟨hal-02883907⟩
97 views
93 downloads
