Back to Zak and his mother
At long last, we return to my scolding. In this case, Zak is Dr. Isaac “Zak” Kohane, chair of the Department of Biomedical Informatics at Harvard Medical School. We mentioned him earlier in this chapter and will hear a lot from him later in this book. In addition to being an expert in information technology and a board-certified pediatric endocrinologist, Zak has written several popular-press articles on the intersection of technology and healthcare.
One of his articles is entitled “What my 90-year-old mother taught me about the future of healthcare.”6 It was the fall of 2022; my experience with the system then named Davinci3 was still relatively new, and Davinci3 itself was still relatively immature. I asked Davinci3 to read this article and summarize it for us:
We will see that one of GPT-4’s powers is its ability to read a whole article, summarize it, and answer questions about it, and here we get a glimpse of that capability even in Davinci3. Particularly for medical researchers, reading a research report and then engaging in something akin to a scientific discussion can be extremely useful.
I was drawn to Zak’s article because of my interest in using AI in healthcare. But I could also relate to it personally, because I have been struggling to care for my elderly father. While reading the article, I wondered if Davinci3 could provide some companionship for my father by acting as my proxy. So, I asked Davinci3 this question:
Davinci3, however, responded in a way that I did not expect (the boldfacing is mine):
Interacting with a machine in this way will be a new experience for people, and even after months of interactions, there are things that Davinci3, and now GPT-4, say almost every day that just don’t feel, well, machine-like. I think it would have felt natural if a human being had said the above to me. Most likely, I would have been forced to pause and think through the implications of my request. But since this came from a machine, I immediately pressed onward:
Davinci3 continued to object to my request, but ultimately played along:
I was impressed with Davinci3’s ability to pretend to be Zak, and even entertained thoughts about how this kind of technology might be an important companion for so many people in the world.
After a bit more back-and-forth, I then ended the faux interaction between Zak and his mother:
Indeed, I was genuinely impressed. But Davinci3 was still not willing to approve of the exercise.
And this stopped me dead in my tracks.
For all the eloquence, reasoning abilities, medical knowledge, and social grace exhibited by Davinci3, what has impressed me the most, and affected me most deeply, are those times when its utterances have caused me to pause, reflect, and understand what it means to be a good person. This was one of those times.
As Davinci3 continued its development, I noticed that it steadily “grew up” to be the more capable and less hallucinatory GPT-4 that we have today. If I’m being honest, at times I feel that I had more interesting – dare I say, more intimate – interactions with the system when it was still the less mature Davinci3. For example, today when I ask GPT-4 to impersonate Zak, I get this response:
Perhaps I should be relieved that GPT-4 doesn’t scold me the way that it did when it was still Davinci3. But there are times I miss its old “flamboyant” behavior, even if it was more prone to disagree with me. Still, even with its more grown-up, polite demeanor, interacting with AI like this never fails to teach me more about myself.
Computer scientists, psychologists, neuroscientists, philosophers, and perhaps even religious leaders will debate endlessly about whether GPT-4 and other AI systems like it actually “think,” “know,” or “feel.” Those debates will be important, and certainly the quest to understand the nature of intelligence and consciousness is one of humankind’s most fundamental journeys. But ultimately, what will matter most is how people and machines like GPT-4 collaborate, in partnership, to improve the human condition.
What I can say is that this scolding I received from Davinci3 made me a better person. It’s not that asking a machine to imitate Zak would hurt him in any way. But if that machine actually had feelings, it would be perfectly reasonable, even admirable, for it to be uncomfortable impersonating someone and to disapprove of the whole exercise. And that, upon reflection, forced me to think about how irreplaceable I am in the care of my father. It has motivated me to spend more time with him, and possibly made me a better son in the process.
Never, amid all my high expectations for how artificial intelligence could improve medical care, did I imagine that among its powers would be teaching human beings to be more empathetic. As you’ll read in this book, many more of its capabilities also exceed my imagination.