Evaluating Adaptive Layer Freezing through Hyperparameter Optimization for Enhanced Fine-Tuning Performance of Language Models
Language models are initially trained on large datasets, enabling them to extract patterns and establish rich contextual connections. When task-specific data is scarce, transfer learning has become the standard way to apply these models to specialized downstream tasks via fine-tuning. However, fine-tuni...
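The abstract describes freezing subsets of a model's layers during fine-tuning, with the freezing configuration treated as a hyperparameter to optimize. A minimal, library-free sketch of that idea (the function name, the freeze-from-the-bottom policy, and the layer names are illustrative assumptions, not the thesis's actual method):

```python
def freeze_plan(layer_names, num_frozen):
    """Map each layer name to a trainable flag, freezing the first num_frozen.

    Freezing the bottom (earliest) layers first is a common heuristic and an
    assumption here; in an adaptive scheme, num_frozen would be selected by
    a hyperparameter search rather than fixed in advance.
    """
    return {name: i >= num_frozen for i, name in enumerate(layer_names)}

layers = ["embed", "block_0", "block_1", "block_2", "head"]
plan = freeze_plan(layers, num_frozen=2)
# "embed" and "block_0" are frozen; the remaining layers stay trainable.
```

In a framework such as PyTorch, the resulting plan would typically be applied by setting `requires_grad = False` on the parameters of each frozen layer before training.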
Main Author: Figueroa, Reinaldo
Other Authors: Murray, Fiona
Format: Thesis
Published: Massachusetts Institute of Technology, 2024
Online Access: https://hdl.handle.net/1721.1/157169
Similar Items
- Automatic hyperparameter tuning of topology optimization algorithms using surrogate optimization
  by: Ha, Dat, et al.
  Published: (2024)
- Refining malware analysis with enhanced machine learning algorithms using hyperparameter tuning
  by: El Mouhtadi, Walid, et al.
  Published: (2024)
- Investigating fine-tuning of large language models for text summarisation
  by: Khaliq, Usama, et al.
  Published: (2024)
- Investigating Fine-Tuning of Language Models for Multiple-Choice Questions
  by: Wang, Ivy A.
  Published: (2024)
- OSPC: Multimodal Harmful Content Detection using Fine-tuned Language Models
  by: Cai, Bill
  Published: (2024)