Heterogeneous Student Knowledge Distillation From BERT Using a Lightweight Ensemble Framework
Deep learning models have demonstrated their effectiveness in capturing complex relationships between input features and target outputs across many application domains. These models, however, often come with considerable memory and computational demands, posing challenges for deployment on...
| Main Authors: | Ching-Sheng Lin, Chung-Nan Tsai, Jung-Sing Jwo, Cheng-Hsiung Lee, Xin Wang |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/10458136/ |
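
The record above concerns distilling knowledge from BERT into lightweight student models. As a rough illustration of the general soft-label distillation objective such work typically builds on (not the authors' specific ensemble framework), the sketch below combines a KL-divergence loss against temperature-scaled teacher logits with the usual cross-entropy on ground-truth labels; all names, hyperparameters, and the toy tensors are illustrative assumptions.

```python
# Minimal, generic knowledge-distillation sketch in PyTorch.
# NOT the paper's framework; it only illustrates the standard
# soft-label distillation loss that BERT-to-student methods commonly use.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term with the hard-label cross-entropy term."""
    # Soft targets: teacher and student distributions at temperature T.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Hard targets: standard cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage: random tensors stand in for a frozen BERT teacher and a
# lightweight student; in practice the logits come from real models.
batch, num_classes = 8, 4
teacher_logits = torch.randn(batch, num_classes)
student_logits = torch.randn(batch, num_classes, requires_grad=True)
labels = torch.randint(0, num_classes, (batch,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```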
Similar Items

- Knowledge distillation of news text classification based on BERT-CNN
  by: Ye Rong, et al.
  Published: (2023-01-01)
- Emotion Classification of Indonesian Tweets using BERT Embedding
  by: Muhammad Habib Algifari, et al.
  Published: (2023-11-01)
- Towards Transfer Learning Techniques—BERT, DistilBERT, BERTimbau, and DistilBERTimbau for Automatic Text Classification from Different Languages: A Case Study
  by: Rafael Silva Barbon, et al.
  Published: (2022-10-01)
- Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models
  by: Kai Zhang, et al.
  Published: (2024-10-01)
- A Feasible and Explainable Network Traffic Classifier Utilizing DistilBERT
  by: Chang-Yui Shin, et al.
  Published: (2023-01-01)