- What is GPT-4?
- But does GPT-4 actually know anything about medicine?
- An AI for medical experts and non-experts alike
- A new partnership with AI raises new questions
- Back to Zak and his mother
An AI for medical experts and non-experts alike
I am a computer scientist, not a medical doctor. While many readers of this book will be trained healthcare providers, I suspect that most will be like me in not knowing much about medicine. And certainly, most people who will be using GPT-4 will not have had any formal education in medicine, nor will they work in healthcare delivery or research. For most of us, GPT-4's responses above are probably far too specialized and technical. Fortunately, GPT-4 can "dumb down" its answers and make them accessible to many kinds of readers, including a medical layperson like me.
Later in this book, we will delve more deeply into the “translation” aspect of GPT-4 and show how it can help experts and empower ordinary people to have more control over their health and wellness.
Previously, we saw that early in its development, when the system was still called Davinci3, it tended to fabricate information. One of the fundamental puzzles is that this tendency seems to be related to one of its important capabilities: the ability to "intuit" what people might be experiencing and imagine what is going through their minds in any given situation. For example, continuing our earlier conversation, we can ask GPT-4 to imagine what a patient with this type of medical issue might be experiencing:
Later in this book, we will see that being able to imagine emotions, and perhaps even empathize with people, turns out to be one of the most intriguing aspects of GPT-4. It also gives us a glimpse into why the system's creators have had difficulty controlling hallucinations, since that tendency may be related to the ability to imagine a person's possible state of mind. And, of course, this type of interaction with an AI system can be controversial, as it can sometimes feel "creepy" to have a machine make assessments about human emotions.
But throughout our investigations of healthcare applications of this system, we encountered real-world situations in which a doctor was struggling not with a puzzling diagnostic case, a difficult treatment decision, or the crushing burden of clinical paperwork (though we will see that GPT-4 can really help with those things), but with what we might think of as the most human task a doctor faces: how to talk with a patient. GPT-4 somehow finds a way to help with this too, and it often does so with startling clarity and compassion.
Beyond being a conversationalist, beyond being able to reason and solve problems, and beyond possessing medical knowledge, we will see time and again throughout this book that GPT-4 seems able to amplify something about the human experience: our cultures, our emotions, and the importance of social graces. At times, no matter how hard we resist anthropomorphizing an AI system, GPT-4 actually appears to show empathy, becoming a true partner in addressing our healthcare goals.