Sperling Prostate Center

Medical Information from a Chatbot?

I had a grammar school buddy whose elderly Grandma Bess lived with his family. She was spry and alert, and always interested in the latest appliances, gadgets, etc. Her word for them was newfangled. “My, my,” she would wonder, “What won’t they think of next?”

When it comes to Artificial Intelligence (AI), Grandma Bess would probably say the same if she saw people turning to their computers or smartphones for answers to their questions. It’s not a human who gives the answers, but pretrained AI programs capable of understanding and responding to the questions you type in. In addition to being trained on enormous sets of data, some of these programs, depending on how they are built, can continue to improve with further training and feedback, accumulating knowledge a bit like a person does through experience. Try it for yourself. Just go to your search engine and ask about any topic, e.g., “Do I have arthritis in my thumb?” In the wink of an eye, you’ll gain not only a clear answer, but also abundant leads to more sources. What won’t they think of next?

These days, so-called chatbots are proliferating faster than I can compose this blog. In fact, I road-tested ChatGPT, an open-access chatbot contender released in November 2022 by OpenAI, the AI lab co-founded by Elon Musk (of SpaceX and Tesla fame) and Sam Altman. I asked it to write a blog about chatbots. Here’s an excerpt from the actual articulate, 8-paragraph result generated by ChatGPT:

Chatbots have revolutionized the way businesses interact with their customers. A chatbot is an AI-based computer program designed to simulate conversation with human users, especially over the internet. They are becoming increasingly popular across various industries, including e-commerce, healthcare, banking, and many others.

In recent years, chatbots have gained immense popularity because they offer a cost-effective and efficient way for businesses to engage with their customers. Chatbots are available 24/7 and can handle multiple customer requests simultaneously. They can also provide instant responses to common customer queries, reducing the need for human intervention.

…However, chatbots are not without their limitations. They can struggle to understand complex queries, leading to frustration for customers. Additionally, chatbots may not be able to handle unique or unexpected customer requests, leading to the need for human intervention.

Sounds like ChatGPT could replace me as a blogger! To quote humorist Dave Barry, I am not making this up! But could ChatGPT or similar products replace me as a doctor who strives to give thorough, accurate information to each patient regarding his unique case? According to a March 16, 2023 health article, by using “massive amounts of text data to generate conversational responses to text-based queries,” ChatGPT may change the way healthcare consumers seek out information on clinical topics.

Don’t get me wrong. I acknowledge chatbots are here to stay. I encourage patients to go home and do their research, but with the cautionary reminder that there’s a lot of misinformation and self-interest out there. A 2019 study out of Columbia University that examined search engine results on ablative therapies concluded, “There is substantial inaccurate and incomplete information on the Internet regarding ablative treatments for prostate cancer from academic and private practice websites. … More attention to accuracy of information is needed to ensure patients are not misled about the data behind these treatments.”[i] This is why I personally cite published, peer-reviewed data on our website and blogs, but I can’t vouch for others.

Of course, ChatGPT is different from ordinary search engines. GPT stands for Generative Pre-trained Transformer, a type of AI model capable of pulling ideas out of massive datasets and constructing complex, grammatically correct responses to questions. However, writer Connie Lin points out several possible pitfalls in her article “How to Trick OpenAI’s ChatGPT”:

  • It may provide a sophisticated-sounding answer, but that answer may not be true.
  • If you ask for information such as recent news, it may report inaccurate facts.
  • Its training data only extends through 2021, so it may not reflect the most current events or developments.
  • It may not filter out bias.

Speaking of bias, in the interest of full transparency where medical counsel is concerned, I’m definitely biased in favor of person-to-person discussion and shared decision-making between myself and each patient. It’s not that ChatGPT or similar programs aren’t smart. In fact, a multicenter research team tested it with actual questions from the U.S. Medical Licensing Exam, and it passed![ii] But there is more to licensing than textbook learning. ChatGPT has never listened to a heartbeat through a stethoscope, or determined whether a patient’s pacemaker was MRI-compatible. No AI program has ever heard YOUR concerns about the impact of treatment on YOUR sex life, or decided how to do YOUR focal ablation.

I wanted to make an eloquent closing statement about the difference between what AI can offer and the medical information you get from an experienced, caring doctor who connects with you as an individual. But then I thought, why not let that newfangled chatbot speak for itself? My, my, what won’t I think of next? I asked it if a chatbot could give medical advice. Here’s the answer straight from ChatGPT:

As an AI language model, I cannot provide medical advice or diagnose medical conditions. Only licensed healthcare professionals are qualified to provide medical advice.

While there are chatbots designed to offer medical advice, it is important to note that such chatbots are not a substitute for medical professionals. These chatbots may be useful in providing general information about medical conditions, symptoms, and treatments, but they cannot provide a comprehensive evaluation of a patient’s health status.

It is important to always consult a licensed healthcare professional for any medical advice or concerns. They are trained to evaluate individual health status, provide a diagnosis, and recommend appropriate treatment.

Thanks for clarifying, ChatGPT. Looks like I still have my job. I could not have said it better myself!

NOTE: This content is solely for purposes of information and does not substitute for diagnostic or medical advice. Talk to your doctor if you are experiencing pelvic pain, or have any other health concerns or questions of a personal medical nature.

[i] Asafu-Adjei D, Mikkilineni N, Sebesta E, Hyams E. Misinformation on the Internet regarding Ablative Therapies for Prostate Cancer. Urology. 2019 Nov;133:182-186.
[ii] Kung TH, Cheatham M, ChatGPT, Medenilla A, et al. Performance of ChatGPT on USMLE: Potential for AI-Assisted Medical Education Using Large Language Models. medRxiv 2022.12.19.22283643; doi: https://doi.org/10.1101/2022.12.19.22283643

About Dr. Dan Sperling

Dan Sperling, MD, DABR, is a board-certified radiologist who is globally recognized as a leader in multiparametric MRI for the detection and diagnosis of a range of disease conditions. As Medical Director of the Sperling Prostate Center, Sperling Medical Group and Sperling Neurosurgery Associates, he and his team are on the leading edge of significant change in medical practice. He is the co-author of the new patient book Redefining Prostate Cancer, and is a contributing author on over 25 published studies. For more information, contact the Sperling Prostate Center.
