Summary: | Adaptive Boosting (AdaBoost) is a representative boosting algorithm that builds a strong classifier by optimally combining weak classifiers, with each subsequent weak classifier tweaked in favor of the instances misclassified by its predecessors. However, AdaBoost is known to be susceptible to overfitting because of the static nature of its weight-updating process. In this paper, we propose a new boosting algorithm, named FlexBoost (Flexible AdaBoost), that enhances classification performance by employing adaptive loss functions, i.e., by adjusting the sensitivity of the conventional exponential loss function for each weak classifier. Performance benchmarks on 30 binary classification problems taken from the UCI and Kaggle repositories are presented to empirically validate the proposed algorithm.
|
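The following is a minimal, hypothetical sketch of the idea described in the summary: a standard AdaBoost round, except that the sensitivity of the exponential loss is re-chosen for each weak classifier so that the re-weighted ensemble attains the lowest training error. The function names (`flexboost_fit`, `flexboost_predict`), the candidate grid `K_GRID`, and the exact update rule are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical FlexBoost-style sketch: plain AdaBoost, but the exponential-loss
# sensitivity k is re-selected every round (the "adaptive loss" idea).
# K_GRID and the selection rule below are assumptions for illustration only.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

K_GRID = (0.5, 1.0, 2.0)  # k = 1.0 recovers the conventional AdaBoost loss


def flexboost_fit(X, y, n_rounds=50):
    """y must be labeled in {-1, +1}; returns a list of (coefficient, weak_learner) pairs."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # instance weights
    ensemble, margin = [], np.zeros(n)
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Choose the loss sensitivity k whose update gives the ensemble
        # with the smallest training error on the current data.
        best_k = min(
            K_GRID,
            key=lambda k: np.mean(np.sign(margin + k * alpha * pred) != y),
        )
        margin += best_k * alpha * pred
        ensemble.append((best_k * alpha, stump))
        # Sharper (k > 1) or softer (k < 1) exponential re-weighting of instances.
        w = w * np.exp(-best_k * alpha * y * pred)
        w /= w.sum()
    return ensemble


def flexboost_predict(ensemble, X):
    scores = sum(coef * clf.predict(X) for coef, clf in ensemble)
    return np.sign(scores)
```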