Generative Pre-Trained Transformer (GPT) in Research: A Systematic Review on Data Augmentation
GPT (Generative Pre-trained Transformer) refers to a family of advanced language models that have significantly reshaped the academic writing landscape. These sophisticated language models offer invaluable support throughout all phases of research work, facilitating idea generation, enhancing drafting processe...
Main Author: Fahim Sufi
Format: Article
Language: English
Published: MDPI AG, 2024-02-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/15/2/99
Similar works
- Universal skepticism of ChatGPT: a review of early literature on chat generative pre-trained transformer
  by: Casey Watters, et al.
  Published: (2023-08-01)
- Playing with words: Comparing the vocabulary and lexical diversity of ChatGPT and humans
  by: Pedro Reviriego, et al.
  Published: (2024-12-01)
- Evaluation of Chat Generative Pre-trained Transformer and Microsoft Copilot Performance on the American Society of Surgery of the Hand Self-Assessment Examinations
  by: Taylor R. Rakauskas, BS, et al.
  Published: (2025-01-01)
- From Language Models to Medical Diagnoses: Assessing the Potential of GPT-4 and GPT-3.5-Turbo in Digital Health
  by: Jonas Roos, et al.
  Published: (2024-12-01)
- A Mathematical Investigation of Hallucination and Creativity in GPT Models
  by: Minhyeok Lee
  Published: (2023-05-01)