Improving language model predictions via prompts enriched with knowledge graphs
Despite advances in deep learning and knowledge graphs (KGs), using language models for natural language understanding and question answering remains a challenging task. Pre-trained language models (PLMs) have been shown to leverage contextual information to complete cloze prompts, next sente...
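The abstract refers to completing cloze prompts enriched with KG facts. A minimal, hypothetical sketch of the general idea (the triples, helper names, and verbalisation scheme below are illustrative assumptions, not taken from the paper):

```python
# Hypothetical sketch: prepending verbalised knowledge-graph triples to a
# cloze-style prompt before passing it to a masked language model.
# The triples and helper functions are illustrative, not the paper's method.

def triples_to_context(triples):
    """Verbalise (subject, predicate, object) triples as plain sentences."""
    return " ".join(f"{s} {p} {o}." for s, p, o in triples)

def build_enriched_prompt(cloze, triples):
    """Prepend verbalised KG facts as context for the cloze prompt."""
    return f"{triples_to_context(triples)} {cloze}"

# Example: assumed KG facts about the entity in the cloze question.
triples = [("Dante", "born in", "Florence"),
           ("Florence", "located in", "Italy")]
prompt = build_enriched_prompt("Dante was born in [MASK].", triples)
print(prompt)
# Dante born in Florence. Florence located in Italy. Dante was born in [MASK].
```

The enriched prompt would then be scored by a PLM's masked-token head, with the KG context steering the prediction for `[MASK]`.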
Main Authors: Brate, R, Dang, M-H, Hoppe, F, He, Y, Meroño-Peñuela, A, Sadashivaiah, V
Format: Conference item
Language: English
Published: CEUR Workshop Proceedings, 2023
Similar Items
- Acupuncture and tuina knowledge graph with prompt learning
  by: Xiaoran Li, et al.
  Published: (2024-04-01)
- ChoCo: a Chord Corpus and a Data Transformation Workflow for Musical Harmony Knowledge Graphs
  by: Jacopo de Berardinis, et al.
  Published: (2023-09-01)
- Rethinking visual prompting for multimodal large language models with external knowledge
  by: Lin, Y, et al.
  Published: (2024)
- Simple Knowledge Graph Completion Model Based on Differential Negative Sampling and Prompt Learning
  by: Li Duan, et al.
  Published: (2023-08-01)
- Improving task generalization via unified schema prompt
  by: Wanjun Zhong, et al.
  Published: (2023-01-01)