Feature Learning Viewpoint of Adaboost and a New Algorithm


Bibliographic Details
Main Authors: Fei Wang, Zhongheng Li, Fang He, Rong Wang, Weizhong Yu, Feiping Nie
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8868178/
Description
Summary: The AdaBoost algorithm is well known for its resistance to overfitting, and understanding this phenomenon is a fascinating fundamental theoretical problem. Many studies have sought to explain it through the statistical view and margin theory. In this paper, the phenomenon is illustrated from a feature learning viewpoint by the proposed AdaBoost+SVM algorithm, which clearly explains AdaBoost's resistance to overfitting. First, we run the AdaBoost algorithm to learn the base classifiers. Then, instead of combining the base classifiers directly, we treat them as features and feed them to an SVM classifier. In this way, new coefficients and a bias are obtained and used to construct the final classifier. We explain the rationality of this approach and present a theorem stating that as the dimension of these features increases, the performance of the SVM does not degrade, which accounts for AdaBoost's resistance to overfitting.
ISSN: 2169-3536
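
The summary describes a two-stage pipeline: AdaBoost learns the base classifiers, their outputs are then treated as features, and an SVM re-learns the combining coefficients and bias. The snippet below is a minimal sketch of that idea, not the authors' implementation; it assumes scikit-learn (AdaBoostClassifier, LinearSVC), a synthetic dataset, and an illustrative choice of 100 decision-stump base learners.

```python
# Sketch of the AdaBoost+SVM idea: base classifiers as features, SVM as combiner.
# Dataset, number of estimators, and SVM settings are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: run AdaBoost (default base learners are decision stumps)
# to obtain the base classifiers.
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Step 2: map each sample to the +/-1 outputs of the learned base
# classifiers, i.e. use the weak hypotheses as a feature representation.
def base_features(model, X):
    return np.column_stack([2 * h.predict(X) - 1 for h in model.estimators_])

F_tr, F_te = base_features(ada, X_tr), base_features(ada, X_te)

# Step 3: instead of AdaBoost's weighted vote, learn new combining
# coefficients and a bias with a linear SVM on these features.
svm = LinearSVC(C=1.0).fit(F_tr, y_tr)

print("AdaBoost accuracy:    ", ada.score(X_te, y_te))
print("AdaBoost+SVM accuracy:", svm.score(F_te, y_te))
```

Under this reading, adding more base classifiers only enlarges the feature space seen by the SVM, which is the setting in which the paper's theorem argues the SVM's performance does not get worse.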