MTBERT-Attention: An Explainable BERT Model based on Multi-Task Learning for Cognitive Text Classification
In recent years, Bloom's taxonomy-based classification of e-learning materials has attracted considerable attention, with researchers employing a variety of methods and features. In our previous works, we addressed this problem through different techniques and algorithms. We started by boosting traditional machi...
Main Authors: Hanane Sebbaq, Nour-eddine El Faddouli
Format: Article
Language: English
Published: Elsevier, 2023-09-01
Series: Scientific African
Online Access: http://www.sciencedirect.com/science/article/pii/S2468227623002557
Similar Items

- Fine-tuned BERT Model for Large Scale and Cognitive Classification of MOOCs
  by: Hanane Sebbaq, et al.
  Published: (2022-05-01)
- Research on text emotion classification based on improved BERT-BiGRU model
  by: Li Yun, et al.
  Published: (2023-02-01)
- A Hybrid BERT Model That Incorporates Label Semantics via Adjustive Attention for Multi-Label Text Classification
  by: Linkun Cai, et al.
  Published: (2020-01-01)
- Joint Learning With BERT-GCN and Multi-Attention for Event Text Classification and Event Assignment
  by: Xiangrong She, et al.
  Published: (2022-01-01)
- Improving BERT With Self-Supervised Attention
  by: Yiren Chen, et al.
  Published: (2021-01-01)