A Confirmation of a Conjecture on Feldman's Two-armed Bandit Problem

06/02/2022
by Zengjing Chen, et al.

The myopic strategy is one of the most important strategies in the study of bandit problems. In this paper, we consider the two-armed bandit problem proposed by Feldman. For general distributions and utility functions, we obtain a necessary and sufficient condition for the optimality of the myopic strategy. As an application, we resolve Nouiehed and Ross's conjecture for Bernoulli two-armed bandit problems, namely that the myopic strategy stochastically maximizes the number of wins.
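To make the object of study concrete, here is a minimal sketch of a myopic (greedy) strategy on a Bernoulli two-armed bandit. This is an illustrative assumption-laden toy, not the paper's construction: it assumes uniform Beta(1,1) priors on each arm's unknown success probability and, at each step, pulls the arm with the higher posterior mean reward.

```python
import random

def myopic_bernoulli_bandit(true_p, horizon, seed=0):
    """Play a two-armed Bernoulli bandit myopically for `horizon` pulls.

    Hypothetical sketch: each arm gets a Beta(1, 1) prior on its
    success probability; the myopic rule pulls the arm with the
    larger posterior mean and updates that arm's posterior.
    Returns the total number of wins (successes).
    """
    rng = random.Random(seed)
    # Beta(alpha, beta) posterior parameters for arms 0 and 1.
    alpha = [1, 1]
    beta = [1, 1]
    wins = 0
    for _ in range(horizon):
        # Myopic rule: maximize the immediate expected reward,
        # i.e. choose the arm with the larger posterior mean.
        means = [alpha[i] / (alpha[i] + beta[i]) for i in range(2)]
        arm = 0 if means[0] >= means[1] else 1
        # Draw a Bernoulli reward from the chosen arm's true probability.
        reward = 1 if rng.random() < true_p[arm] else 0
        wins += reward
        # Conjugate Bayesian update of the pulled arm's posterior.
        alpha[arm] += reward
        beta[arm] += 1 - reward
    return wins

wins = myopic_bernoulli_bandit([0.7, 0.4], horizon=100)
print(wins)
```

The conjecture addressed in the paper concerns exactly this kind of play: whether the greedy rule above stochastically maximizes the win count among all strategies.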
