Contextual Multi-Armed Bandit Tutorials

These tutorials provide an introductory guide to using AgileRL's contextual multi-armed bandit algorithms, NeuralUCB and NeuralTS. A short, library-free sketch of the contextual bandit setting follows the tutorial list below.


NeuralUCB - Iris


NeuralTS - PenDigits
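
To make the setting concrete before diving into the tutorials, the sketch below recasts a labelled dataset (Iris) as a contextual bandit: each sample's features are the context, each class is an arm, and the agent receives reward 1 for pulling the arm matching the true label. For simplicity it uses LinUCB, a classic linear bandit algorithm, standing in for NeuralUCB/NeuralTS, which replace the linear model with a neural network; this is not the AgileRL API, so see the tutorials above for the actual AgileRL code.

```python
# Library-free illustration of the contextual-bandit setting used in these
# tutorials: Iris features are the context, the three classes are the arms,
# and the reward is 1 when the chosen arm matches the true label.
# LinUCB stands in here for NeuralUCB/NeuralTS; the AgileRL tutorials show
# the neural versions and the real API.
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
n_arms, d, alpha = 3, X.shape[1], 1.0

# Per-arm ridge-regression statistics: A = I + sum(x x^T), b = sum(r * x)
A = np.stack([np.eye(d) for _ in range(n_arms)])
b = np.zeros((n_arms, d))

steps, correct = 2000, 0
for t in range(steps):
    i = rng.integers(len(X))
    x, label = X[i], y[i]
    # UCB score per arm: point estimate plus an exploration bonus
    scores = []
    for a in range(n_arms):
        A_inv = np.linalg.inv(A[a])
        theta = A_inv @ b[a]
        scores.append(theta @ x + alpha * np.sqrt(x @ A_inv @ x))
    arm = int(np.argmax(scores))
    reward = float(arm == label)
    # Update only the chosen arm's statistics with the observed reward
    A[arm] += np.outer(x, x)
    b[arm] += reward * x
    correct += reward

print(f"accuracy over the run: {correct / steps:.2f}")
```

The same interaction loop (observe a context, choose an arm, receive a reward, update the model) underlies both tutorials; NeuralUCB and NeuralTS differ only in how the per-arm value estimates and exploration are computed.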