DARE: Distill and Reinforce Ensemble Neural Networks for Climate-Domain Processing
Natural-language processing is well positioned to help stakeholders study the dynamics of ambiguous climate-change-related (CC) information. Recently, deep neural networks have achieved good results on a variety of NLP tasks, but they depend on high-quality training data and complex, carefully designed frameworks...
| Main Authors: | Kun Xiang, Akihiro Fujii |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2023-04-01 |
| Series: | Entropy |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1099-4300/25/4/643 |
Similar Items
- Unsupervised domain adaptation for lip reading based on cross-modal knowledge distillation
  by: Yuki Takashima, et al.
  Published: (2021-12-01)
- Transferable adversarial masked self-distillation for unsupervised domain adaptation
  by: Yuelong Xia, et al.
  Published: (2023-05-01)
- Multilayer Semantic Features Adaptive Distillation for Object Detectors
  by: Zhenchang Zhang, et al.
  Published: (2023-09-01)
- A New Knowledge-Distillation-Based Method for Detecting Conveyor Belt Defects
  by: Qi Yang, et al.
  Published: (2022-10-01)
- Layer-Level Knowledge Distillation for Deep Neural Network Learning
  by: Hao-Ting Li, et al.
  Published: (2019-05-01)