This monograph provides an overview of bandit algorithms inspired by various aspects of Information Retrieval (IR), such as click models, online ranker evaluation, personalization, and the cold-start problem. Written in a survey style, each chapter focuses on a specific IR problem and explains how it has been addressed with various bandit approaches. Within each chapter, the algorithms are presented in chronological order.
The monograph shows how specific IR concepts relate to bandit algorithms. This comprehensive, chronological approach enables the author to explain the impact of IR on the development of new bandit algorithms, as well as the impact of bandit algorithms on the development of new methods in IR.
The survey is primarily intended for two groups of readers: researchers in Information Retrieval or Machine Learning, and practicing data scientists. It is accessible to anyone who has completed introductory- to intermediate-level courses in machine learning and/or statistics.