Multi-armed bandits are a simple but very powerful framework for algorithms that make decisions over time under uncertainty. This book provides a more introductory, textbook-like treatment of the subject, covering IID and adversarial rewards, contextual bandits, and connections to economics.
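As a concrete illustration of the bandit protocol with IID rewards, the minimal sketch below runs the classic epsilon-greedy rule on Bernoulli arms; the arm means, the exploration rate, and the horizon are illustrative assumptions, not values taken from the book.

```python
# Minimal sketch of the bandit protocol with IID (Bernoulli) rewards,
# using the epsilon-greedy rule. Arm means, epsilon, and horizon are
# hypothetical choices for illustration only.
import random


def epsilon_greedy(arm_means, horizon=10_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    k = len(arm_means)
    counts = [0] * k         # number of pulls per arm
    estimates = [0.0] * k    # empirical mean reward per arm
    total_reward = 0.0
    for _ in range(horizon):
        # Explore with probability epsilon, otherwise exploit the current best estimate.
        if rng.random() < epsilon:
            arm = rng.randrange(k)
        else:
            arm = max(range(k), key=lambda a: estimates[a])
        # Reward is drawn IID from the chosen arm's (unknown) distribution.
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward
    return total_reward, estimates


if __name__ == "__main__":
    reward, est = epsilon_greedy([0.3, 0.5, 0.7])
    print(f"total reward: {reward:.0f}, estimated means: {[round(e, 2) for e in est]}")
```

The loop captures the core tension the book studies: each round the algorithm must balance exploring under-sampled arms against exploiting the arm that currently looks best.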