The Rate of Convergence of AdaBoost

The AdaBoost algorithm was designed to combine many “weak” hypotheses that perform slightly better than random guessing into a “strong” hypothesis that has very low error. We study the rate at which AdaBoost iteratively converges to the minimum of the “exponential loss”. Unlike previous work, our pr...
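To make the abstract concrete, here is a minimal sketch of the setting it describes: AdaBoost combining weak decision stumps, with the empirical exponential loss decreasing toward its minimum each round. The toy data, the stump learner, and all variable names are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Hypothetical toy data (not from the paper): 2-D points with noiseless
# labels given by the sign of a linear function.
rng = np.random.default_rng(0)
m = 200
X = rng.uniform(-1, 1, size=(m, 2))
y = np.where(X[:, 0] + 0.3 * X[:, 1] > 0, 1, -1)

def best_stump(X, y, w):
    """Exhaustively pick the threshold stump minimizing weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] > thr, 1, -1)
                err = w @ (pred != y)
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

F = np.zeros(m)           # combined hypothesis F(x_i), initially zero
w = np.full(m, 1.0 / m)   # AdaBoost example weights
losses = []
for t in range(10):
    err, j, thr, sign = best_stump(X, y, w)
    err = max(err, 1e-12)                      # guard against err = 0
    alpha = 0.5 * np.log((1 - err) / err)      # standard AdaBoost step size
    pred = sign * np.where(X[:, j] > thr, 1, -1)
    F += alpha * pred
    w = np.exp(-y * F)                         # reweight misclassified points
    w /= w.sum()
    # Exponential loss: (1/m) * sum_i exp(-y_i * F(x_i))
    losses.append(np.mean(np.exp(-y * F)))

# Each round multiplies the loss by 2*sqrt(err*(1-err)) <= 1,
# so the sequence is non-increasing.
assert all(b <= a + 1e-12 for a, b in zip(losses, losses[1:]))
```

The paper's contribution concerns how fast this loss sequence approaches its infimum; the sketch only illustrates the monotone decrease itself.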


Bibliographic Details
Main Authors: Mukherjee, Indraneel, Rudin, Cynthia, Schapire, Robert E.
Other Authors: Sloan School of Management
Format: Article
Language: en_US
Published: Association for Computing Machinery (ACM) 2013
Online Access: http://hdl.handle.net/1721.1/83258