On-the-fly knowledge distillation model for sentence embedding
In this dissertation, we conduct an experimental study to investigate the performance of sentence embedding using an on-the-fly knowledge distillation model based on the DistillCSE framework. This model utilizes SimCSE as the initial teacher model. After a certain number of training steps, it caches an interm...
Main Author: | Zhu, Xuchun |
---|---|
Other Authors: | Lihui Chen |
Format: | Thesis-Master by Coursework |
Language: | English |
Published: | Nanyang Technological University, 2024 |
Subjects: | Computer and Information Science; On-the-fly model; Knowledge distillation; Sentence embeddings; SimCSE; DistillCSE |
Online Access: | https://hdl.handle.net/10356/174236 |
_version_ | 1826110780847685632 |
---|---|
author | Zhu, Xuchun |
author2 | Lihui Chen |
author_facet | Lihui Chen Zhu, Xuchun |
author_sort | Zhu, Xuchun |
collection | NTU |
description | In this dissertation, we conduct an experimental study to investigate the performance of sentence embedding using an on-the-fly knowledge distillation model based on the DistillCSE framework.
This model utilizes SimCSE as the initial teacher model. After a certain number of training steps, it caches an intermediate model and employs it as the new teacher for knowledge distillation. This process is repeated several times to obtain the desired on-the-fly knowledge-distilled student model. This approach to knowledge distillation potentially offers advantages such as reduced training time and performance close to that of the original teacher model. In some cases, after fine-tuning, the student may even surpass the original teacher on specific tasks. |
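The teacher-refresh schedule in the description can be illustrated with a toy sketch. This is an assumption-laden simplification, not the thesis's actual method: the real system distills a SimCSE sentence encoder trained on text, whereas here the "model" is just a dict of scalar parameters, and the names `distill_step`, `train_on_the_fly`, and `refresh_every` are hypothetical.

```python
import copy

def distill_step(student, teacher, lr=0.5):
    """Toy distillation: pull each student parameter toward the teacher's value."""
    for k in student:
        student[k] += lr * (teacher[k] - student[k])

def train_on_the_fly(initial_teacher, steps=9, refresh_every=3):
    """Every `refresh_every` steps, cache the student and promote it to teacher.

    `initial_teacher` plays the role SimCSE plays in the thesis.
    """
    teacher = copy.deepcopy(initial_teacher)
    student = {k: 0.0 for k in initial_teacher}
    snapshots = 0
    for step in range(1, steps + 1):
        distill_step(student, teacher)
        if step % refresh_every == 0:
            teacher = copy.deepcopy(student)  # cache intermediate model as new teacher
            snapshots += 1
    return student, snapshots

student, n = train_on_the_fly({"w": 1.0})
print(student, n)  # {'w': 0.875} 3
```

In this stripped-down form the student stops moving once it becomes its own teacher; in the actual setting the student keeps learning from training data between refreshes, so each cached teacher genuinely differs from the previous one.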
first_indexed | 2024-10-01T02:40:06Z |
format | Thesis-Master by Coursework |
id | ntu-10356/174236 |
institution | Nanyang Technological University |
language | English |
last_indexed | 2024-10-01T02:40:06Z |
publishDate | 2024 |
publisher | Nanyang Technological University |
record_format | dspace |
spelling | ntu-10356/1742362024-03-29T15:43:33Z On-the-fly knowledge distillation model for sentence embedding Zhu, Xuchun Lihui Chen School of Electrical and Electronic Engineering ELHCHEN@ntu.edu.sg Computer and Information Science On-the-fly model Knowledge distillation Sentence embeddings SimCSE DistillCSE In this dissertation, we conduct an experimental study to investigate the performance of sentence embedding using an on-the-fly knowledge distillation model based on the DistillCSE framework. This model utilizes SimCSE as the initial teacher model. After a certain number of training steps, it caches an intermediate model and employs it as the new teacher for knowledge distillation. This process is repeated several times to obtain the desired on-the-fly knowledge-distilled student model. This approach to knowledge distillation potentially offers advantages such as reduced training time and performance close to that of the original teacher model. In some cases, after fine-tuning, the student may even surpass the original teacher on specific tasks. Master's degree 2024-03-25T01:05:10Z 2024-03-25T01:05:10Z 2024 Thesis-Master by Coursework Zhu, X. (2024). On-the-fly knowledge distillation model for sentence embedding. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/174236 https://hdl.handle.net/10356/174236 en D-258-22231-05829 application/pdf Nanyang Technological University |
spellingShingle | Computer and Information Science On-the-fly model Knowledge distillation Sentence embeddings SimCSE DistillCSE Zhu, Xuchun On-the-fly knowledge distillation model for sentence embedding |
title | On-the-fly knowledge distillation model for sentence embedding |
title_full | On-the-fly knowledge distillation model for sentence embedding |
title_fullStr | On-the-fly knowledge distillation model for sentence embedding |
title_full_unstemmed | On-the-fly knowledge distillation model for sentence embedding |
title_short | On-the-fly knowledge distillation model for sentence embedding |
title_sort | on the fly knowledge distillation model for sentence embedding |
topic | Computer and Information Science On-the-fly model Knowledge distillation Sentence embeddings SimCSE DistillCSE |
url | https://hdl.handle.net/10356/174236 |
work_keys_str_mv | AT zhuxuchun ontheflyknowledgedistillationmodelforsentenceembedding |