Pretrained Transformers for Text Ranking: BERT and Beyond
| Main Author: | Suzan Verberne |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2023-03-01 |
| Series: | Computational Linguistics |
| Online Access: | http://dx.doi.org/10.1162/coli_r_00468 |
Similar Items
- SAPBERT: Speaker-Aware Pretrained BERT for Emotion Recognition in Conversation
  by: Seunguook Lim, et al.
  Published: (2022-12-01)
- Transforming the generative pretrained transformer into augmented business text writer
  by: Faisal Khalil, et al.
  Published: (2022-11-01)
- DagoBERT: generating derivational morphology with a pretrained language model
  by: V. Hofmann, et al.
  Published: (2020)
- Pretrained Natural Language Processing Model for Intent Recognition (BERT-IR)
  by: Vasima Khan, et al.
  Published: (2021-11-01)
- The classification of PubMed neurosurgical abstracts using pretrained BERT model
  by: G. Danilov, et al.
  Published: (2021-01-01)