Tag: math.ST


Design Experiments to Compare Multi-armed Bandit Algorithms

arXiv:2603.05919v1 (Announce Type: new)

Abstract: Online platforms routinely compare multi-armed bandit algorithms, such as UCB and Thompson Sampling, to select the best-performing policy. Unlike standard …

Huiling Meng, Ningyuan Chen, Xuefeng Gao
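The kind of comparison the abstract describes can be illustrated with a minimal Bernoulli-bandit simulation pitting UCB1 against Thompson Sampling. This is a generic sketch, not the paper's experimental design: the arm means, horizon, and number of replications below are hypothetical choices for illustration.

```python
import math
import random


def ucb1(means, horizon, rng):
    """Run UCB1 on Bernoulli arms; return total reward collected."""
    k = len(means)
    counts = [0] * k
    sums = [0.0] * k
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1  # initialize: pull each arm once
        else:
            # pick arm maximizing empirical mean + exploration bonus
            arm = max(range(k), key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        r = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += r
        total += r
    return total


def thompson(means, horizon, rng):
    """Run Thompson Sampling with Beta(1,1) priors; return total reward."""
    k = len(means)
    alpha = [1.0] * k
    beta = [1.0] * k
    total = 0.0
    for _ in range(horizon):
        # sample a mean from each arm's posterior, play the argmax
        arm = max(range(k), key=lambda a: rng.betavariate(alpha[a], beta[a]))
        r = 1.0 if rng.random() < means[arm] else 0.0
        alpha[arm] += r
        beta[arm] += 1.0 - r
        total += r
    return total


if __name__ == "__main__":
    rng = random.Random(0)
    means = [0.3, 0.5, 0.7]  # hypothetical Bernoulli arm means
    horizon = 2000
    reps = 20
    best = max(means) * horizon  # reward of always playing the best arm
    for name, algo in [("UCB1", ucb1), ("Thompson Sampling", thompson)]:
        avg_reward = sum(algo(means, horizon, rng) for _ in range(reps)) / reps
        print(f"{name}: average regret {best - avg_reward:.1f}")
```

Averaging regret over independent replications like this is the naive comparison the paper takes as its starting point; the design question is how to run such experiments so that the resulting policy choice is statistically sound.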