Compassion from a Chatbot?

I recently began reading Compassionomics, a book citing extensive data on the role compassion plays in improving the doctor-patient relationship. While that seems obvious, there are large gaps between what we want to happen and what actually happens when it comes to how much empathy and compassion doctors and care teams show their patients.

It’s a good read so far, and I’m seeing how it will give me more ideas for continuing to improve patient experience. An article on compassion in medicine came across my news feed yesterday, and its title definitely caught my attention: When Doctors Use a Chatbot to Improve Their Bedside Manner

I don’t know much about artificial intelligence and ChatGPT, and I would consider myself both curious and a bit frightened by the whole thing (Terminator 2 comes to mind). We’ve seen dire predictions about the future. We’ve also heard that having computers take over certain difficult and mundane tasks could unleash a new era of innovation and productivity, similar to what we saw with the internet.

What struck me about this article is how some doctors are using ChatGPT to improve communication with patients. They are asking for help, which I take as a positive. Doctors are notorious for speaking in technical or medical terms, often losing the patient’s attention in the process. In one study of patients who over-consume alcohol, a script written at a fifth-grade reading level helped patients much more easily understand their treatment options. On the other hand, we also know that these chat programs can get the facts wrong, which can have dire consequences in medical care.

I’m not good at predicting the future, but I will say that if doctors can use ChatGPT as a “coach” to help them learn the words and phrases that increase both empathy and compassion, that could be a good thing. This likely won’t happen overnight, though, as doctors are already expected to possess these traits.

Admitting to using a chatbot goes against those expectations, even though the data in the above-mentioned book clearly indicate that while many doctors believe they show empathy and compassion, their patients largely disagree.

What the book shows is that greater compassion saves lives. This reminds me of a quote attributed to Theodore Roosevelt:

“People don’t care how much you know until they know how much you care.”

That’s a bold statement, and it’s what patient experience and the PX Movement are all about!
