A decoupled learning framework for contrastive learning


Full description

Bibliographic details
Main author: Xu, Yicheng
Other authors: Lin, Zhiping
Format: Thesis (Master by Coursework)
Language: English
Published: Nanyang Technological University, 2022
Subjects: Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Online access: https://hdl.handle.net/10356/163711
Description: Contrastive Learning (CL) has attracted much attention in recent years because various self-supervised models based on CL achieve performance comparable to supervised models. Nevertheless, most CL frameworks require a large batch size during training to take more negative samples into account and boost performance, while a large model size limits the training batch size under fixed device memory. To solve this problem, we propose a Decoupled Updating Contrastive Learning (DUCL) framework that 1) divides a single model into pieces to shrink the model size on each accelerator device and 2) decouples every batch in CL to save memory. The combination of the two approaches enables a larger negative sample space, allowing contrastive learning models to achieve better performance. As a result, we demonstrate the effectiveness of large batch sizes and reduce memory usage by up to 43% in our experiments. With our learning method, a contrastive learning model can be trained with a larger negative sample space, and thus improved performance, without any change to the model structure.
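The description's central point is that each sample in a contrastive batch serves as a negative for every other sample, so a larger batch enlarges the negative sample space. A minimal NumPy sketch of a standard InfoNCE loss illustrates this; it is a generic illustration, not the thesis's actual DUCL implementation, and the function name and temperature value are assumptions:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE loss for a batch of N positive pairs (z1[i], z2[i]).

    Every other sample in the batch acts as a negative, so each anchor
    sees N - 1 negatives: the negative sample space grows with batch
    size, which is why large batches (and memory-saving schemes that
    enable them) help contrastive learning.
    """
    # L2-normalise so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature              # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; minimise their negative log-prob.
    return float(-np.mean(np.diag(log_probs)))
```

With well-matched views of the same inputs the loss approaches zero, while mismatched views yield a loss near log N; the memory cost of the (N, N) logits matrix is also where batch decoupling pays off.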
Record ID: ntu-10356/163711
School: School of Electrical and Electronic Engineering
Supervisor contact: EZPLin@ntu.edu.sg
Subject: Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Degree: Master of Science (Computer Control and Automation)
Citation: Xu, Y. (2022). A decoupled learning framework for contrastive learning. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/163711