An Efficient AdaBoost Algorithm with the Multiple Thresholds Classification
Adaptive boosting (AdaBoost) is a prominent example of an ensemble learning algorithm that combines weak classifiers into a strong classifier through weighted majority voting rules. AdaBoost’s weak classifier, with threshold classification, tries to find the best threshold in one of the data dimensions,...
Main Authors: | Yi Ding, Hongyang Zhu, Ruyun Chen, Ronghui Li |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Applied Sciences |
Subjects: | AdaBoost; Multiple Thresholds Classification; accuracy; generalization |
Online Access: | https://www.mdpi.com/2076-3417/12/12/5872 |
author | Yi Ding; Hongyang Zhu; Ruyun Chen; Ronghui Li
---|---|
collection | DOAJ |
description | Adaptive boosting (AdaBoost) is a prominent example of an ensemble learning algorithm that combines weak classifiers into a strong classifier through weighted majority voting rules. AdaBoost’s weak classifier, with threshold classification, tries to find the best threshold in one of the data dimensions, dividing the data into two categories, −1 and 1. However, in some cases, this weak learning algorithm is not accurate enough, showing poor generalization performance and a tendency to over-fit. To address these challenges, we first propose a new weak learning algorithm that classifies examples based on multiple thresholds, rather than only one, to improve its accuracy. Second, we modify the weight allocation scheme of the weak learning algorithm within the AdaBoost framework so that potential values of other dimensions are used in the classification process, and a theoretical justification is provided to show its generality. Finally, comparative experiments between the two algorithms on 18 datasets from the UCI repository show that our improved AdaBoost algorithm generalizes better on the test set during the training iterations. |
format | Article |
id | doaj.art-a8aed984a23b413b9ffce879322485e5 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
publishDate | 2022-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
affiliations | Yi Ding: Maritime College, Guangdong Ocean University, Zhanjiang 524091, China; Hongyang Zhu: College of Mathematics and Computer, Guangdong Ocean University, Zhanjiang 524091, China; Ruyun Chen: College of Mathematics and Computer, Guangdong Ocean University, Zhanjiang 524091, China; Ronghui Li: Maritime College, Guangdong Ocean University, Zhanjiang 524091, China
doi | 10.3390/app12125872
citation | Applied Sciences, vol. 12, no. 12, article 5872, 2022-06-01
title | An Efficient AdaBoost Algorithm with the Multiple Thresholds Classification |
topic | AdaBoost; Multiple Thresholds Classification; accuracy; generalization
url | https://www.mdpi.com/2076-3417/12/12/5872 |
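For context on the baseline the abstract describes, the sketch below shows a conventional AdaBoost with a single-threshold ("decision stump") weak learner that labels examples −1 or 1 and combines the stumps by weighted majority voting. This is a minimal illustrative sketch, not the authors' implementation: the function names, the exhaustive per-feature threshold search, and the standard exponential weight update are assumptions of the sketch, and the paper's multi-threshold weak learner and modified weight allocation scheme are not reproduced here.

```python
# Minimal illustrative sketch (not the authors' code): conventional AdaBoost
# with a single-threshold "decision stump" weak learner, i.e. the baseline
# weak classifier the abstract describes. Labels are assumed to be -1 or 1.
import numpy as np


def train_stump(X, y, w):
    """Search every feature and candidate threshold for the stump with the
    lowest weighted error. Returns (dim, threshold, polarity, error)."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)
    for dim in range(d):
        for thr in np.unique(X[:, dim]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (dim, thr, pol, err)
    return best


def adaboost(X, y, T=50):
    """Train T stumps and combine them by weighted majority voting."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                 # uniform initial example weights
    ensemble = []                           # (alpha, dim, threshold, polarity)
    for _ in range(T):
        dim, thr, pol, err = train_stump(X, y, w)
        err = np.clip(err, 1e-12, 1.0 - 1e-12)
        alpha = 0.5 * np.log((1.0 - err) / err)   # weight of this weak classifier
        pred = np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)            # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, dim, thr, pol))
    return ensemble


def predict(ensemble, X):
    """Weighted majority vote of all trained stumps."""
    X = np.asarray(X, dtype=float)
    score = np.zeros(len(X))
    for alpha, dim, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, dim] - thr) >= 0, 1, -1)
    return np.sign(score)
```

Calling `adaboost(X_train, y_train, T=50)` and then `predict(ensemble, X_test)` reproduces the standard single-threshold setup; this is the kind of baseline the paper's multi-threshold weak learner and modified weight allocation scheme are compared against on the 18 UCI datasets.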