Conference papers

Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms

Abstract: We investigate multi-armed bandits with budgets, a natural model for ad-display optimization encountered in search engines. We provide asymptotic regret lower bounds satisfied by any algorithm, and propose algorithms that match those lower bounds. We consider different types of budgets: scenarios where the advertiser has a fixed budget over a time horizon, and scenarios where the amount of money available to spend is incremented in each time slot. Further, we consider two different pricing models: one in which the advertiser is charged each time her ad is shown (i.e., per impression), and one in which the advertiser is charged only if a user clicks on the ad. For all of these cases, we show that it is possible to achieve O(log(T)) regret. For both the cost-per-impression and cost-per-click models with a fixed budget, we provide regret lower bounds that apply to any uniformly good algorithm. Further, we show that B-KL-UCB, a natural variant of KL-UCB, is asymptotically optimal for these cases. Numerical experiments (based on a real-world data set) further suggest that B-KL-UCB matches or improves on the finite-time performance of various previously proposed (UCB-like) algorithms, which is important when applying such algorithms to a real-world problem.
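As a rough illustration of the cost-per-impression, fixed-budget setting described above, the sketch below implements a KL-UCB-style index policy for Bernoulli click probabilities that stops once the budget is exhausted. The function names, the bisection used to compute the index, and the simulation parameters are illustrative assumptions; this is not the paper's exact B-KL-UCB algorithm.

```python
import math
import random

def bernoulli_kl(p, q, eps=1e-12):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_ucb_index(mean, pulls, t):
    """Largest q in [mean, 1] with pulls * kl(mean, q) <= log(t), by bisection."""
    target = math.log(max(t, 2)) / pulls
    lo, hi = mean, 1.0
    for _ in range(30):
        mid = (lo + hi) / 2.0
        if bernoulli_kl(mean, mid) <= target:
            lo = mid
        else:
            hi = mid
    return lo

def budgeted_kl_ucb(arms, horizon, budget, cost, rng):
    """Hypothetical sketch: KL-UCB indices with a fixed budget,
    cost-per-impression pricing. `arms` are true click probabilities;
    every play costs `cost`, and play stops when the budget runs out."""
    k = len(arms)
    pulls = [0] * k
    rewards = [0.0] * k
    spent = 0.0
    for t in range(1, horizon + 1):
        if spent + cost > budget:
            break  # remaining budget cannot cover another impression
        if t <= k:
            a = t - 1  # play each arm once to initialize
        else:
            a = max(range(k),
                    key=lambda i: kl_ucb_index(rewards[i] / pulls[i],
                                               pulls[i], t))
        reward = 1.0 if rng.random() < arms[a] else 0.0
        pulls[a] += 1
        rewards[a] += reward
        spent += cost
    return pulls, rewards, spent
```

With two arms of click probability 0.2 and 0.5 and a budget that covers 500 of the 2000 rounds, the better arm accumulates the bulk of the plays before the budget binds.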
Contributor: Richard Combes
Submitted on: Thursday, April 9, 2020 - 6:30:00 PM
Last modification on: Wednesday, April 15, 2020 - 1:48:49 AM





Richard Combes, Chong Jiang, Srikant Rayadurgam. Bandits with Budgets: Regret Lower Bounds and Optimal Algorithms. SIGMETRICS 2015, ACM, 2015, Portland, United States. ⟨10.1145/2745844.2745847⟩. ⟨hal-01257889⟩


