While I like to think of myself as a supportive and responsive doctor, a recently published study in JAMA Internal Medicine makes me think again. According to the findings, ChatGPT was more informative and understanding than a real doctor when answering patient questions. The study compared physicians' written responses with ChatGPT's responses to real-world health questions posted on a public online forum. A panel of licensed healthcare professionals preferred ChatGPT's responses 79% of the time, rating them as higher quality and more empathetic. Not only was ChatGPT better at providing information, it was also better at caring.
A few months before this study was published, ChatGPT passed all three steps of the medical licensing exam. I studied for weeks for those tests, but, in all fairness, the licensing exams cover facts and baseline medical knowledge, so it is not that surprising that a large language model did well. But empathy and bedside manner? That's a whole different story.
Many are already turning to ChatGPT to discuss their mental health. One Reddit user described ChatGPT as “better than my therapist.” She continued, “In a very scary way, I feel HEARD by ChatGPT.”
As good as ChatGPT might be, I am not ready to pass the baton just yet. (Hopefully my patients also feel heard.) That said, I am optimistic that ChatGPT can be used to enhance treatment. There is already evidence it can help assess suicide risk, prevent future depression, improve care, and increase access to treatment. Artificial intelligence can help researchers figure out what makes some therapists more effective than others and assist in training new clinicians. It can detect subtle changes in a patient's speech patterns that might indicate a worsening condition that would otherwise go unnoticed. It can also drive down clinician burnout by drafting clinical documentation and responding to endless emails. Harnessing artificial intelligence to cut the time clinicians spend on burdensome paperwork and increase the time they spend with patients would be a win-win for all involved.
Using artificially intelligent companion robots to reduce loneliness is an area I am particularly excited about. My colleagues Murali Doraiswamy and Elizabeth Broadbent and I just published a paper in Science Robotics highlighting how robots can support health in older adults and help them remain at home rather than move to assisted living. We argue that robots can enhance feelings of connectedness and reduce caregiver burden. Unlike human caregivers, robotic ones never judge, lose patience, or get frustrated. Robots will never be a substitute for humans, but they can amplify humanity.
At a conference a few years ago, I encountered Paro, a robot in the form of an adorable baby seal. Apparently the designers chose a seal because it wouldn't conjure memories of a beloved deceased cat or dog. It is undeniably cute: big eyes, batting black eyelashes, and a sunny personality. Paro coos, snuggles, and moves in response to touch and when it recognizes certain words. It learns to behave in ways that please the user, repeating actions that earn it strokes and avoiding behaviors the user dislikes. Paro has been shown to reduce loneliness, agitation, and blood pressure. For patients with dementia, Paro is a friendly companion and a reliable mood brightener. I am not proposing that robot companions replace human companions, but robots may provide a bridge between what many isolated individuals need and what society currently provides.
I would love to hear your thoughts on AI therapy and companions. Do you think there is a place for them, or do they creep you out?
In the spirit of fairness, I asked ChatGPT for its opinion on the subject.
Me:
Can ChatGPT be useful to assist psychiatrists in the treatment of patients?
ChatGPT:
Yes, GPT-powered chatbots can be useful tools to assist psychiatrists in the treatment of patients. These chatbots can provide support and guidance to patients by engaging in therapeutic conversations, offering coping strategies, answering questions about mental health conditions, and even providing psychoeducation. They can also be helpful in monitoring patients’ mental health between visits and providing reminders for medication or therapy sessions. However, it’s important to note that while chatbots can be useful supplements to traditional therapy, they should not be seen as a replacement for human interaction and personalized care provided by psychiatrists.
Not a bad answer…