Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy
Abstract Transfer learning is designed to leverage knowledge in the source domain with labels to help build classification models in the target domain where labels are scarce or even unavailable. Previous studies have shown that high-level concepts extracted from original features are more suitable...
Main Authors: Teng Cui, Jianhan Pan, Mingjing Du, Qingyang Zhang
Format: Article
Language: English
Published: Springer, 2022-09-01
Series: International Journal of Computational Intelligence Systems
Subjects: Cross-domain text classification; Dual-space transfer learning; High-level concepts; Non-negative matrix tri-factorization
Online Access: https://doi.org/10.1007/s44196-022-00132-2
_version_ | 1811208725707358208 |
author | Teng Cui; Jianhan Pan; Mingjing Du; Qingyang Zhang |
author_facet | Teng Cui; Jianhan Pan; Mingjing Du; Qingyang Zhang |
author_sort | Teng Cui |
collection | DOAJ |
description | Abstract Transfer learning is designed to leverage labeled knowledge from the source domain to help build classification models in the target domain, where labels are scarce or even unavailable. Previous studies have shown that high-level concepts extracted from the original features are better suited to cross-domain classification tasks, so many transfer learning methods transfer knowledge by modeling high-level concepts on the original feature space. However, this approach has two limitations. First, learning high-level concepts directly on the original feature space reduces the proportion of shared information carried by common features when the knowledge-transfer bridge is constructed. Second, learning multiple high-level concepts only on the original feature space cannot target the latent shared information contained in domain-specific features, so that information cannot be exploited effectively. To overcome these limitations, this paper proposes a novel method named Dual-Space Transfer Learning based on an Indirect Mutual Promotion Strategy (DSTL). DSTL is formalized as an optimization problem based on non-negative matrix tri-factorization. It first extracts the common features between domains and constructs a common feature space. Then it integrates the learning of high-level concepts in the common feature space and the original feature space through an indirect promotion strategy, so that the two feature spaces help each other and improve the learning of both common and domain-specific features. Systematic tests on benchmark data sets show the superiority of the DSTL method. |
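The abstract formalizes DSTL as an optimization problem based on non-negative matrix tri-factorization (NMTF). As a rough illustration of that underlying machinery only, the sketch below implements plain NMTF (X ≈ F S Gᵀ with non-negative factors) via the standard multiplicative updates; it is not the paper's coupled dual-space objective, and the function name `nmtf` and its parameters are illustrative assumptions.

```python
import numpy as np

def nmtf(X, k_rows, k_cols, n_iter=200, eps=1e-9, seed=0):
    """Plain non-negative matrix tri-factorization X ~ F @ S @ G.T
    via multiplicative updates. A generic sketch of the technique the
    abstract names, not the DSTL objective (which couples two feature
    spaces through an indirect promotion strategy)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k_rows))       # row (e.g. document) cluster factor
    S = rng.random((k_rows, k_cols))  # association matrix between factors
    G = rng.random((n, k_cols))       # column (e.g. word/concept) factor
    for _ in range(n_iter):
        # Multiplicative updates derived from the gradient of
        # ||X - F S G^T||_F^2; eps guards against division by zero.
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
    return F, S, G
```

Because each update multiplies by a non-negative ratio, the factors stay non-negative throughout, which is what makes the learned high-level concepts interpretable as additive parts.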
first_indexed | 2024-04-12T04:26:29Z |
format | Article |
id | doaj.art-2b75869181d64935bbd1c5715005788d |
institution | Directory Open Access Journal |
issn | 1875-6883 |
language | English |
last_indexed | 2024-04-12T04:26:29Z |
publishDate | 2022-09-01 |
publisher | Springer |
record_format | Article |
series | International Journal of Computational Intelligence Systems |
spelling | doaj.art-2b75869181d64935bbd1c5715005788d | 2022-12-22T03:48:04Z | eng | Springer | International Journal of Computational Intelligence Systems | 1875-6883 | 2022-09-01 | Vol. 15, Iss. 1, pp. 1–18 | 10.1007/s44196-022-00132-2 | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy | Teng Cui, Jianhan Pan, Mingjing Du, Qingyang Zhang (all: School of Computer Science and Technology, Jiangsu Normal University) | https://doi.org/10.1007/s44196-022-00132-2 | Cross-domain text classification; Dual-space transfer learning; High-level concepts; Non-negative matrix tri-factorization |
spellingShingle | Teng Cui; Jianhan Pan; Mingjing Du; Qingyang Zhang; Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy; International Journal of Computational Intelligence Systems; Cross-domain text classification; Dual-space transfer learning; High-level concepts; Non-negative matrix tri-factorization |
title | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy |
title_full | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy |
title_fullStr | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy |
title_full_unstemmed | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy |
title_short | Dual-Space Transfer Learning Based on an Indirect Mutual Promotion Strategy |
title_sort | dual space transfer learning based on an indirect mutual promotion strategy |
topic | Cross-domain text classification; Dual-space transfer learning; High-level concepts; Non-negative matrix tri-factorization |
url | https://doi.org/10.1007/s44196-022-00132-2 |
work_keys_str_mv | AT tengcui dualspacetransferlearningbasedonanindirectmutualpromotionstrategy AT jianhanpan dualspacetransferlearningbasedonanindirectmutualpromotionstrategy AT mingjingdu dualspacetransferlearningbasedonanindirectmutualpromotionstrategy AT qingyangzhang dualspacetransferlearningbasedonanindirectmutualpromotionstrategy |