ChatGPT is not ready yet for use in providing mental health assessment and interventions
- PMID: 38239905
- PMCID: PMC10794665
- DOI: 10.3389/fpsyt.2023.1277756
Abstract
Background: Psychiatry is a specialized field of medicine that focuses on the diagnosis, treatment, and prevention of mental health disorders. With advancements in technology and the rise of artificial intelligence (AI), there has been a growing interest in exploring the potential of AI language models systems, such as Chat Generative Pre-training Transformer (ChatGPT), to assist in the field of psychiatry.
Objective: Our study aimed to evaluate the effectiveness, reliability, and safety of ChatGPT in assisting patients with mental health problems, and to assess its potential as a collaborative tool for mental health professionals, through simulated interactions with three distinct imaginary patients.
Methods: Three imaginary patient scenarios (cases A, B, and C) were created, representing different mental health problems. All three patients present with, and seek to eliminate, the same chief complaint (i.e., difficulty falling asleep and waking up frequently during the night in the last 2 weeks). ChatGPT was engaged as a virtual psychiatric assistant to provide responses and treatment recommendations.
Results: In case A, the recommendations were relatively appropriate (albeit non-specific) and could potentially be beneficial for both users and clinicians. However, as the complexity of the clinical cases increased (cases B and C), the information and recommendations generated by ChatGPT became inappropriate, even dangerous, and the limitations of the program became more glaring. The main strengths of ChatGPT lie in its ability to provide quick responses to user queries and to simulate empathy. One notable limitation is ChatGPT's inability to interact with users to collect further information relevant to the diagnosis and management of a patient's clinical condition. Another serious limitation is ChatGPT's inability to use critical thinking and clinical judgment to drive patient management.
Conclusion: As of July 2023, ChatGPT failed to provide simple medical advice in certain clinical scenarios. This supports the conclusion that the quality of ChatGPT-generated content is still far from being a reliable guide for users and professionals seeking accurate mental health information. It remains, therefore, premature to draw conclusions about the usefulness and safety of ChatGPT in mental health practice.
Keywords: anxiety; chatbots; depression; insomnia; language models; mental health; patient care; psychiatric disorders.
Copyright © 2024 Dergaa, Fekih-Romdhane, Hallit, Loch, Glenn, Fessi, Ben Aissa, Souissi, Guelmami, Swed, El Omri, Bragazzi and Ben Saad.
Conflict of interest statement
JG was employed by Neurotrack Technologies. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.