Time Machine GPT

Large language models (LLMs) are often trained on extensive, temporally indiscriminate text corpora, reflecting the lack of datasets with temporal metadata. This approach is not aligned with the evolving nature of language. Conventional methods for creating temporally adapted language models often d...
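The abstract is cut off before the paper's method is described. Purely as an illustrative sketch (not the authors' procedure), temporal adaptation typically starts from a corpus whose documents carry timestamps, from which point-in-time training slices can be drawn; the corpus, field names, and cutoffs below are hypothetical.

```python
from datetime import date

# Hypothetical corpus: each record carries its text and a publication date.
corpus = [
    {"text": "Article written in 2019 ...", "date": date(2019, 6, 1)},
    {"text": "Article written in 2021 ...", "date": date(2021, 3, 15)},
    {"text": "Article written in 2023 ...", "date": date(2023, 11, 2)},
]

def point_in_time_slice(records, cutoff):
    """Keep only documents published on or before the cutoff date,
    so a model trained on the slice never sees later text."""
    return [r["text"] for r in records if r["date"] <= cutoff]

# One training set per yearly cutoff; each slice could be used to train
# or further pre-train a separate point-in-time model.
for year in (2019, 2021, 2023):
    texts = point_in_time_slice(corpus, date(year, 12, 31))
    print(year, len(texts), "documents")
```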


Bibliographic Details
Main Authors: Drinkall, F, Rahimikia, E, Pierrehumbert, J, Zohren, S
Format: Conference item
Language: English
Published: 2024