This project studies, empirically and theoretically, the behavior of AdaBoost, a practical and very popular machine-learning algorithm with strong theoretical guarantees. Leo Breiman once called AdaBoost the best off-the-shelf classifier for a wide variety of datasets [Breiman, 1999]. AdaBoost remains widely used because of its simplicity, speed, and theoretical guarantees of good performance. Despite this popularity, however, its generalization performance remains mysterious [Mease and Wyner, 2008]; Breiman considered the question of why AdaBoost performs so well in general to be "the most important open problem in machine learning."