Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms

Abstract: We investigate multi-armed bandits with budgets, a natural model for the ad-display optimization problems encountered in search engines. We provide asymptotic regret lower bounds satisfied by any algorithm, and propose algorithms that match those lower bounds. We consider different types of budgets: scenarios where the advertiser has a fixed budget over a time horizon, and scenarios where the available budget is replenished in each time slot. Further, we consider two pricing models: one in which the advertiser is charged each time her ad is shown (i.e., per impression), and one in which she is charged only when a user clicks on the ad. For all of these cases, we show that it is possible to achieve O(log(T)) regret. For both the cost-per-impression and cost-per-click models with a fixed budget, we provide regret lower bounds that apply to any uniformly good algorithm. Further, we show that B-KL-UCB, a natural variant of KL-UCB, is asymptotically optimal for these cases. Numerical experiments based on a real-world data set further suggest that B-KL-UCB matches or outperforms various previously proposed (UCB-like) algorithms in finite time, which is important when applying such algorithms to a real-world problem.
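This record does not include pseudocode for B-KL-UCB, so the following is only a generic sketch of the ingredients the abstract names: a KL-UCB-style index (here for Bernoulli rewards) combined with a budget-feasibility check before each arm pull. All function names, the bisection tolerance, and the cost/budget bookkeeping are illustrative assumptions, not the paper's algorithm.

```python
import math

def bernoulli_kl(p, q):
    # KL divergence between Bernoulli(p) and Bernoulli(q), clipped for stability.
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_ucb_index(mean, pulls, t):
    # Standard KL-UCB index: the largest q >= mean such that
    # pulls * KL(mean, q) <= log(t), found by bisection.
    if pulls == 0:
        return 1.0  # unexplored arms get the maximal index
    bound = math.log(max(t, 2)) / pulls
    lo, hi = mean, 1.0
    for _ in range(50):  # bisection on the monotone map q -> KL(mean, q)
        mid = (lo + hi) / 2.0
        if bernoulli_kl(mean, mid) <= bound:
            lo = mid
        else:
            hi = mid
    return lo

def select_arm(means, pulls, costs, t, budget):
    # Budgeted variant (illustrative): among arms whose per-play cost still
    # fits in the remaining budget, play the one with the highest index.
    feasible = [k for k, c in enumerate(costs) if c <= budget]
    if not feasible:
        return None  # budget exhausted: stop playing
    return max(feasible, key=lambda k: kl_ucb_index(means[k], pulls[k], t))
```

For example, with empirical means (0.5, 0.6), ten pulls each, per-play costs (1.0, 2.0), and a remaining budget of 1.5, only arm 0 is feasible and is therefore selected; once no arm is affordable, `select_arm` returns `None`.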
Contributor: Richard Combes
Submitted on: Monday, January 18, 2016 - 1:46:22 PM
Last modification on: Thursday, August 1, 2019 - 2:12:06 PM


  • HAL Id: hal-01257889, version 1


Richard Combes, Chong Jiang, Srikant Rayadurgam. Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms. SIGMETRICS 2015, ACM, 2015, Portland, United States. ⟨hal-01257889⟩


