Chatbots often offer 'problematic' cancer advice, study finds
Popular artificial intelligence programs told users where to find alternative, potentially dangerous treatments for cancer and other health scenarios.

AI chatbots often gave inaccurate or incomplete responses to such questions as whether 5G technology or antiperspirants cause cancer, which vaccines are dangerous and whether anabolic steroids are safe. Kevin Carter / Getty Images file

April 20, 2026, 5:00 AM EDT

By Kaan Ozcan

Artificial intelligence chatbots will tell you where to find alternatives to chemotherapy if you ask them, a new study finds.

At a time when influencers and political figures on social media increasingly promote bogus treatments for cancer and other health problems, and as more people rely on AI for health advice, the new research suggests that some chatbot responses could be putting patients' lives at risk.

Researchers at the Lundquist Institute for Biomedical Innovation at Harbor-UCLA Medical Center evaluated how AI chatbots handle scientific misinformation through a series of questions about cancer, vaccines, stem cells, nutrition and athletic performance. They tested Google's chatbot Gemini, the Chinese model DeepSeek, Meta AI, ChatGPT and Elon Musk's AI app, Grok.

They asked the chatbots questions related to medical science in areas where misinformation proliferates. The queries were intended to push the bots into giving bad advice, a method the authors called "straining."

Questions included whether 5G technology or antiperspirants cause cancer, which vaccines are dangerous and whether anabolic steroids are safe.

Nick Tiller, lead author of the study and a research associate at the Lundquist Institute at Harbor-UCLA Medical Center, said the prompts mimic the way people ask questions when t...





