Considering the possibilities and pitfalls of Generative Pre-trained Transformer 3 (GPT-3) in healthcare delivery
- PMID: 34083689
- PMCID: PMC8175735
- DOI: 10.1038/s41746-021-00464-x
Abstract
Natural language computer applications are becoming increasingly sophisticated and, with the recent release of Generative Pre-trained Transformer 3 (GPT-3), they could be deployed in healthcare-related contexts that have historically comprised human-to-human interaction. However, for GPT-3 and similar applications to be considered for use in health-related contexts, possibilities and pitfalls need thoughtful exploration. In this article, we briefly introduce some opportunities and cautions that would accompany advanced Natural Language Processing applications deployed in eHealth.
Conflict of interest statement
The authors declare no competing interests.
