Multi_armed_bandit

Monte Carlo simulations of several multi-armed bandit algorithms, with a comparison against classical statistical A/B testing.
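As a rough illustration of the kind of simulation involved, here is a minimal sketch of a Monte Carlo comparison using an epsilon-greedy bandit on Bernoulli arms. The arm probabilities, the `epsilon` value, and the run counts are illustrative assumptions, not taken from this repository's notebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_epsilon_greedy(true_means, epsilon=0.1, n_steps=1000):
    """One epsilon-greedy episode on Bernoulli arms; returns per-step rewards."""
    n_arms = len(true_means)
    counts = np.zeros(n_arms)
    values = np.zeros(n_arms)  # running mean reward estimate per arm
    rewards = np.zeros(n_steps)
    for t in range(n_steps):
        if rng.random() < epsilon:
            arm = int(rng.integers(n_arms))   # explore: random arm
        else:
            arm = int(np.argmax(values))      # exploit: best estimate so far
        reward = float(rng.random() < true_means[arm])  # Bernoulli draw
        counts[arm] += 1
        # incremental update of the running mean for the pulled arm
        values[arm] += (reward - values[arm]) / counts[arm]
        rewards[t] = reward
    return rewards

# Monte Carlo: average many independent runs to smooth out noise
runs = np.array([simulate_epsilon_greedy([0.1, 0.5, 0.7]) for _ in range(200)])
avg_reward = runs.mean(axis=0)
print(avg_reward[-100:].mean())  # late-run average drifts toward the best arm
```

A fixed-allocation A/B test, by contrast, would pull each arm equally often for the whole horizon, so its average reward stays near the mean of the arms rather than converging toward the best one.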
