Self-Supervised and Few-Shot Contrastive Learning Frameworks for Text Clustering
Contrastive learning is a promising approach to unsupervised learning, as it inherits the advantages of well-studied deep models without requiring a dedicated and complex model design. In this paper, based on bidirectional encoder representations from transformers (BERT) and long short-term memory (LSTM) neur...
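The abstract above describes self-supervised contrastive learning over BERT representations for text clustering. For orientation only, the following is a minimal sketch of an InfoNCE-style in-batch contrastive objective over mean-pooled BERT sentence embeddings; the model name, pooling strategy, dropout-based augmentation, and temperature are illustrative assumptions, not the authors' exact framework.

```python
# Illustrative sketch: InfoNCE-style in-batch contrastive loss over BERT
# sentence embeddings. Model choice, mean pooling, and dropout-based
# "two views" (SimCSE-style) are assumptions, not the paper's design.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Mean-pool BERT token states into one vector per sentence."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

def info_nce(z1, z2, temperature=0.1):
    """Each sentence's two views are positives; other sentences in the
    batch act as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature      # (B, B) cosine similarities
    targets = torch.arange(z1.size(0))      # positives on the diagonal
    return F.cross_entropy(logits, targets)

texts = ["contrastive learning for clustering", "BERT sentence embeddings"]
encoder.train()  # keep dropout active so two passes give two stochastic views
loss = info_nce(embed(texts), embed(texts))
loss.backward()
```

In a clustering pipeline, the learned embeddings would typically be passed to a standard algorithm such as k-means; the paper's specific self-supervised and few-shot setups are detailed in the full text linked below.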
Main Authors: | Haoxiang Shi, Tetsuya Sakai |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | |
Online Access: | https://ieeexplore.ieee.org/document/10210342/ |
Similar Items
- MCC: Multi-Cluster Contrastive Semi-Supervised Segmentation Framework for Echocardiogram Videos
  by: Yu-Jen Chen, et al.
  Published: (2025-01-01)
- A Contrastive Model with Local Factor Clustering for Semi-Supervised Few-Shot Learning
  by: Hexiu Lin, et al.
  Published: (2023-08-01)
- Dual Learning-Based Safe Semi-Supervised Learning
  by: Haitao Gan, et al.
  Published: (2018-01-01)
- Self-Supervision and Self-Distillation with Multilayer Feature Contrast for Supervision Collapse in Few-Shot Remote Sensing Scene Classification
  by: Haonan Zhou, et al.
  Published: (2022-06-01)
- Beyond Supervised Learning in Remote Sensing: A Systematic Review of Deep Learning Approaches
  by: Benyamin Hosseiny, et al.
  Published: (2024-01-01)