Groupwise Ranking Loss for Multi-Label Learning
This work studies multi-label learning (MLL), where each instance is associated with a subset of positive labels. For each instance, a good multi-label predictor should encourage the predicted positive labels to be close to the instance's ground-truth positive ones. In this work, we propose a new loss, named Groupwise Ranking LosS (GRLS), for multi-label learning.
Main Authors: | Yanbo Fan, Baoyuan Wu, Ran He, Bao-Gang Hu, Yong Zhang, Siwei Lyu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Multi-label learning; groupwise ranking; optimization |
Online Access: | https://ieeexplore.ieee.org/document/8970478/ |
_version_ | 1819276418670919680 |
---|---|
author | Yanbo Fan Baoyuan Wu Ran He Bao-Gang Hu Yong Zhang Siwei Lyu |
author_facet | Yanbo Fan Baoyuan Wu Ran He Bao-Gang Hu Yong Zhang Siwei Lyu |
author_sort | Yanbo Fan |
collection | DOAJ |
description | This work studies multi-label learning (MLL), where each instance is associated with a subset of positive labels. For each instance, a good multi-label predictor should encourage the predicted positive labels to be close to the instance's ground-truth positive ones. In this work, we propose a new loss, named Groupwise Ranking LosS (GRLS), for multi-label learning. Minimizing GRLS encourages the predicted relevancy scores of the ground-truth positive labels to be higher than those of the negative ones. More importantly, its time complexity is linear in the number of candidate labels, rather than quadratic as in some pairwise-ranking-based methods. We further analyze GRLS from the perspective of the label-wise margin and show that a multi-label predictor is label-wise effective if and only if GRLS is optimal. We also analyze the relations between GRLS and several widely used loss functions for MLL. Finally, we apply GRLS to multi-label learning, and extensive experiments on several benchmark multi-label databases demonstrate the competitive performance of the proposed method against state-of-the-art methods. |
first_indexed | 2024-12-23T23:39:54Z |
format | Article |
id | doaj.art-b5e0b561fd64449dadb511b669cef310 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-23T23:39:54Z |
publishDate | 2020-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-b5e0b561fd64449dadb511b669cef310; 2022-12-21T17:25:44Z; eng; IEEE; IEEE Access; ISSN 2169-3536; 2020-01-01; vol. 8, pp. 21717-21727; doi: 10.1109/ACCESS.2020.2969677; article 8970478. Groupwise Ranking Loss for Multi-Label Learning. Yanbo Fan (https://orcid.org/0000-0002-8530-485X; Tencent AI Lab, Shenzhen, China), Baoyuan Wu (https://orcid.org/0000-0003-2183-5990; Tencent AI Lab, Shenzhen, China), Ran He (https://orcid.org/0000-0001-5925-9648; National Laboratory of Pattern Recognition, Chinese Academy of Sciences (CASIA), Beijing, China), Bao-Gang Hu (https://orcid.org/0000-0002-6916-5394; National Laboratory of Pattern Recognition, Chinese Academy of Sciences (CASIA), Beijing, China), Yong Zhang (https://orcid.org/0000-0003-0066-3448; Tencent AI Lab, Shenzhen, China), Siwei Lyu (https://orcid.org/0000-0002-0992-685X; Department of Computer Science, University at Albany, The State University of New York, New York, NY, USA). This work studies multi-label learning (MLL), where each instance is associated with a subset of positive labels. For each instance, a good multi-label predictor should encourage the predicted positive labels to be close to the instance's ground-truth positive ones. In this work, we propose a new loss, named Groupwise Ranking LosS (GRLS), for multi-label learning. Minimizing GRLS encourages the predicted relevancy scores of the ground-truth positive labels to be higher than those of the negative ones. More importantly, its time complexity is linear in the number of candidate labels, rather than quadratic as in some pairwise-ranking-based methods. We further analyze GRLS from the perspective of the label-wise margin and show that a multi-label predictor is label-wise effective if and only if GRLS is optimal. We also analyze the relations between GRLS and several widely used loss functions for MLL. Finally, we apply GRLS to multi-label learning, and extensive experiments on several benchmark multi-label databases demonstrate the competitive performance of the proposed method against state-of-the-art methods. Online access: https://ieeexplore.ieee.org/document/8970478/. Keywords: Multi-label learning; groupwise ranking; optimization. |
spellingShingle | Yanbo Fan Baoyuan Wu Ran He Bao-Gang Hu Yong Zhang Siwei Lyu Groupwise Ranking Loss for Multi-Label Learning IEEE Access Multi-label learning groupwise ranking optimization |
title | Groupwise Ranking Loss for Multi-Label Learning |
title_full | Groupwise Ranking Loss for Multi-Label Learning |
title_fullStr | Groupwise Ranking Loss for Multi-Label Learning |
title_full_unstemmed | Groupwise Ranking Loss for Multi-Label Learning |
title_short | Groupwise Ranking Loss for Multi-Label Learning |
title_sort | groupwise ranking loss for multi label learning |
topic | Multi-label learning; groupwise ranking; optimization |
url | https://ieeexplore.ieee.org/document/8970478/ |
work_keys_str_mv | AT yanbofan groupwiserankinglossformultilabellearning AT baoyuanwu groupwiserankinglossformultilabellearning AT ranhe groupwiserankinglossformultilabellearning AT baoganghu groupwiserankinglossformultilabellearning AT yongzhang groupwiserankinglossformultilabellearning AT siweilyu groupwiserankinglossformultilabellearning |
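The abstract describes GRLS only at a high level: minimizing it should push the predicted relevancy scores of the ground-truth positive labels above those of the negative labels, at a cost that is linear rather than quadratic in the number of candidate labels. The record does not give the exact GRLS formula, so the sketch below is only a hypothetical illustration of a groupwise ranking loss with those two properties; it aggregates the negative scores once with a log-sum-exp instead of enumerating positive-negative pairs, and the function name and aggregation choice are assumptions made for illustration, not the authors' definition.

```python
import numpy as np

def groupwise_ranking_loss(scores, positives):
    """Illustrative groupwise ranking loss (NOT the paper's exact GRLS).

    Encourages every positive label's score to exceed the scores of the
    negative labels, but aggregates the negatives once with a log-sum-exp
    instead of enumerating all positive-negative pairs, so the cost is
    O(L) in the number of candidate labels L rather than O(L^2).

    scores    : (L,) predicted relevancy scores for one instance
    positives : (L,) boolean mask marking the ground-truth positive labels
    """
    scores = np.asarray(scores, dtype=float)
    positives = np.asarray(positives, dtype=bool)

    pos_scores = scores[positives]
    neg_scores = scores[~positives]
    if pos_scores.size == 0 or neg_scores.size == 0:
        return 0.0  # degenerate case: nothing to rank against

    # Soft maximum over all negative scores, computed once for the group.
    neg_lse = np.logaddexp.reduce(neg_scores)

    # softplus(neg_lse - s_i) is small when positive label i outranks
    # (softly) every negative label.
    margins = neg_lse - pos_scores
    return float(np.mean(np.logaddexp(0.0, margins)))

# Example: labels 0 and 3 are positive; their scores should exceed the rest.
print(groupwise_ranking_loss([2.1, -0.5, 0.3, 1.8, -1.0], [1, 0, 0, 1, 0]))
```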