Adapting to the Long Tail: A Meta-Analysis of Transfer Learning Research for Language Understanding Tasks
Abstract: Natural language understanding (NLU) has made massive progress driven by large benchmarks, but benchmarks often leave a long tail of infrequent phenomena underrepresented. We reflect on the question: Have transfer learning methods sufficiently addressed the poor performance o...
Main Authors: Aakanksha Naik, Jill Lehman, Carolyn Rosé
Format: Article
Language: English
Published: The MIT Press, 2022-01-01
Series: Transactions of the Association for Computational Linguistics
Online Access: https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00500/112917/Adapting-to-the-Long-Tail-A-Meta-Analysis-of
Similar Items
- Adaptive parsing: self-extending natural language interfaces
  by: Lehman, Jill Fain
  Published: (1992)
- Comparison between long-tailed macaques and humans, the strength of task specialization in relation to task complexity.
  by: Goh, Colleen
  Published: (2011)
- Mixed Mutual Transfer for Long-Tailed Image Classification
  by: Ning Ren, et al.
  Published: (2024-10-01)
- Multi-task learning approach for utilizing temporal relations in natural language understanding tasks
  by: Chae-Gyun Lim, et al.
  Published: (2023-05-01)
- Dynamic transfer learning with progressive meta-task scheduler
  by: Jun Wu, et al.
  Published: (2022-11-01)