Adaptive FH-SVM for Imbalanced Classification
Support vector machines (SVMs) are powerful learning methods that have been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge Loss, cannot deal with class imbalance problems because it appli...
Main Authors: | Qi Wang, Yingjie Tian, Dalian Liu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | Focal loss; hinge loss; class imbalance; support vector machines (SVMs) |
Online Access: | https://ieeexplore.ieee.org/document/8835917/ |
_version_ | 1829514435275784192 |
---|---|
author | Qi Wang; Yingjie Tian; Dalian Liu 
author_facet | Qi Wang; Yingjie Tian; Dalian Liu 
author_sort | Qi Wang |
collection | DOAJ |
description | Support vector machines (SVMs) are powerful learning methods that have been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge Loss, cannot deal with class imbalance problems because it applies the same loss weight to each class. Recently, the Focal Loss has been widely used in deep learning to address imbalanced datasets, and its effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by the Focal Loss, we reconstruct the Hinge Loss with the scaling factor of the Focal Loss, called the FH Loss, which not only deals with class imbalance problems but also preserves the distinctive properties of the Hinge Loss. Owing to the difficulty of trading off positive and negative accuracy in imbalanced classification, the FH Loss pays more attention to the minority class and to misclassified instances in order to improve the accuracy of each class and thereby reduce the influence of imbalance. In addition, because an SVM with the FH Loss is difficult to solve directly, we propose an improved model with a modified FH Loss, called the Adaptive FH-SVM. The algorithm solves the optimization problem iteratively and adaptively updates the FH Loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of the proposed method. |
first_indexed | 2024-12-16T13:17:44Z |
format | Article |
id | doaj.art-c5cb11d1873448cbbcb4aa5b5a07a1d4 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-16T13:17:44Z |
publishDate | 2019-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-c5cb11d1873448cbbcb4aa5b5a07a1d4; 2022-12-21T22:30:26Z; eng; IEEE; IEEE Access; 2169-3536; 2019-01-01; vol. 7, pp. 130410-130422; doi:10.1109/ACCESS.2019.2940983; article no. 8835917; Adaptive FH-SVM for Imbalanced Classification; Qi Wang (School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing, China); Yingjie Tian (Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing, China); Dalian Liu (Department of Basic Course Teaching, Beijing Union University, Beijing, China); Support vector machines (SVMs) are powerful learning methods that have been popular among machine learning researchers due to their strong performance on both classification and regression problems. However, the traditional SVM, which uses the Hinge Loss, cannot deal with class imbalance problems because it applies the same loss weight to each class. Recently, the Focal Loss has been widely used in deep learning to address imbalanced datasets, and its effectiveness has attracted attention in many fields, such as object detection and semantic segmentation. Inspired by the Focal Loss, we reconstruct the Hinge Loss with the scaling factor of the Focal Loss, called the FH Loss, which not only deals with class imbalance problems but also preserves the distinctive properties of the Hinge Loss. Owing to the difficulty of trading off positive and negative accuracy in imbalanced classification, the FH Loss pays more attention to the minority class and to misclassified instances in order to improve the accuracy of each class and thereby reduce the influence of imbalance. In addition, because an SVM with the FH Loss is difficult to solve directly, we propose an improved model with a modified FH Loss, called the Adaptive FH-SVM. The algorithm solves the optimization problem iteratively and adaptively updates the FH Loss of each instance. Experimental results on 31 binary imbalanced datasets demonstrate the effectiveness of the proposed method.; https://ieeexplore.ieee.org/document/8835917/; Focal loss; hinge loss; class imbalance; support vector machines (SVMs) |
spellingShingle | Qi Wang; Yingjie Tian; Dalian Liu; Adaptive FH-SVM for Imbalanced Classification; IEEE Access; Focal loss; hinge loss; class imbalance; support vector machines (SVMs) |
title | Adaptive FH-SVM for Imbalanced Classification |
title_full | Adaptive FH-SVM for Imbalanced Classification |
title_fullStr | Adaptive FH-SVM for Imbalanced Classification |
title_full_unstemmed | Adaptive FH-SVM for Imbalanced Classification |
title_short | Adaptive FH-SVM for Imbalanced Classification |
title_sort | adaptive fh svm for imbalanced classification |
topic | Focal loss; hinge loss; class imbalance; support vector machines (SVMs) |
url | https://ieeexplore.ieee.org/document/8835917/ |
work_keys_str_mv | AT qiwang adaptivefhsvmforimbalancedclassification AT yingjietian adaptivefhsvmforimbalancedclassification AT dalianliu adaptivefhsvmforimbalancedclassification |
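The abstract describes the core idea behind the FH Loss: the standard hinge loss is rescaled by a focal-style modulating factor so that easy, well-classified instances contribute little to the objective while the minority class and misclassified instances are emphasized. The paper's exact formulation is not reproduced in this record, so the snippet below is only a minimal NumPy sketch of that general idea; the sigmoid-of-margin pseudo-probability, the focusing parameter gamma, and the class-balance weight alpha are illustrative assumptions, not the authors' definition.

```python
import numpy as np

def fh_style_loss(y, scores, gamma=2.0, alpha=0.75):
    """Hinge loss rescaled by a focal-style factor (illustrative sketch only).

    y      : labels in {-1, +1}, with +1 assumed to be the minority class
    scores : raw decision values f(x) from a linear or kernel classifier
    gamma  : focusing parameter; larger values down-weight easy,
             high-margin instances more aggressively
    alpha  : class-balance weight applied to the positive class
    """
    y = np.asarray(y, dtype=float)
    scores = np.asarray(scores, dtype=float)

    # Standard hinge loss: max(0, 1 - y * f(x)).
    hinge = np.maximum(0.0, 1.0 - y * scores)

    # Map the signed margin to a pseudo-probability of correct classification
    # and build the focal-style factor (1 - p)^gamma, so confidently correct
    # instances are strongly down-weighted.
    p_correct = 1.0 / (1.0 + np.exp(-y * scores))
    focal_factor = (1.0 - p_correct) ** gamma

    # Up-weight the (assumed minority) positive class.
    class_weight = np.where(y > 0, alpha, 1.0 - alpha)

    return class_weight * focal_factor * hinge

# Example: two minority positives (one misclassified) and four negatives.
y = np.array([+1, +1, -1, -1, -1, -1])
scores = np.array([0.2, -0.5, -2.0, -1.5, 0.3, -0.1])
print(fh_style_loss(y, scores))
```

One way to read the abstract's iterative scheme is that per-instance losses like these are recomputed from the current decision values after each solve and reused as weights in the next iteration, which is consistent with the adaptive, per-instance update the abstract describes.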