
Don’t rely on ChatGPT for medication advice, researchers say

Photo: Milko/iStockphoto

If you’re considering asking ChatGPT questions about medicines, researchers recommend double-checking – and even triple-checking – its responses.

A team led by a researcher from Long Island University asked the artificial intelligence tool actual questions that had been posted to the LIU College of Pharmacy drug information system over a recent 16-month period.

In nearly 75% of its responses, the free version of ChatGPT provided incomplete or inaccurate answers, or didn’t answer the question directly. When asked where it found the information, ChatGPT also generated nonexistent citations.

“Health care professionals and patients should be cautious about using ChatGPT as an authoritative source for medication-related information,” said lead study author Sara Grossman, associate professor of pharmacy practice at LIU. “Anyone who uses ChatGPT for medication-related information should verify the information using trusted sources.”

The study was presented in December at the American Society of Health-System Pharmacists’ Midyear Clinical Meeting in Anaheim, CA.
