Fabrication and errors in the bibliographic citations generated by ChatGPT
Abstract: Although chatbots such as ChatGPT can facilitate cost-effective text generation and editing, factually incorrect responses (hallucinations) limit their utility. This study evaluates one particular type of hallucination: fabricated bibliographic citations that do not represent actual scholar...
Main Authors:
Format: Article
Language: English
Published: Nature Portfolio, 2023-09-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-023-41032-5