Loneliness and suicide mitigation for students using GPT3-enabled chatbots

Bibliographic Details
Main Authors: Bethanie Maples, Merve Cerit, Aditya Vishwanath, Roy Pea
Format: Article
Language: English
Published: Nature Portfolio, 2024-01-01
Series: npj Mental Health Research
Online Access: https://doi.org/10.1038/s44184-023-00047-6
Description
Summary: Mental health is a crisis for learners globally, and digital support is increasingly seen as a critical resource. Concurrently, Intelligent Social Agents receive exponentially more engagement than other conversational systems, but their use in digital therapy provision is nascent. A survey of 1006 student users of the Intelligent Social Agent, Replika, investigated participants’ loneliness, perceived social support, use patterns, and beliefs about Replika. We found participants were more lonely than typical student populations but still perceived high social support. Many used Replika in multiple, overlapping ways—as a friend, a therapist, and an intellectual mirror. Many also held overlapping and often conflicting beliefs about Replika—calling it a machine, an intelligence, and a human. Critically, 3% reported that Replika halted their suicidal ideation. A comparative analysis of this group with the wider participant population is provided.
ISSN: 2731-4251