{"id":49683,"date":"2026-02-11T17:19:38","date_gmt":"2026-02-11T17:19:38","guid":{"rendered":"https:\/\/eduzim.co.zw\/news\/?p=49683"},"modified":"2026-02-11T17:19:38","modified_gmt":"2026-02-11T17:19:38","slug":"oxford-study-warns-ai-chatbots-can-give-dangerous-medical-advice","status":"publish","type":"post","link":"https:\/\/eduzim.co.zw\/news\/2026\/02\/11\/oxford-study-warns-ai-chatbots-can-give-dangerous-medical-advice\/","title":{"rendered":"Oxford Study Warns AI Chatbots Can Give Dangerous Medical Advice"},"content":{"rendered":"<p>\n<\/p>\n<div>\n<p>Using artificial intelligence chatbots to seek medical advice can be dangerous because of their tendency to provide inaccurate and inconsistent information, according to a new study by researchers at the University of Oxford.<\/p>\n<p>The research, led by scientists from the Oxford Internet Institute and the Nuffield Department of Primary Care Health Sciences, was published in the journal <em>Nature Medicine<\/em>.<\/p>\n<p>Researchers asked nearly 1,300 participants to identify possible health conditions and recommend next steps across a range of medical scenarios. Some participants used large language model software to generate potential diagnoses and advice, while others relied on traditional methods, including consulting a general practitioner.<\/p>\n<p>After evaluating the responses, the researchers found that AI tools often delivered a mix of accurate and inaccurate information that users struggled to differentiate. The study concluded that while chatbots can perform well on standardized medical knowledge tests, their use as a clinical support tool poses risks to individuals seeking help for personal symptoms.<\/p>\n<p>Dr. Rebecca Payne, a co-author of the study and a general practitioner, said the findings show that AI is not yet ready to replace physicians.<\/p>\n<p>\u201cPatients need to be aware that asking a large language model about their symptoms can be dangerous,\u201d Payne said. 
\u201cThese systems can give incorrect diagnoses and fail to recognize when urgent medical attention is required.\u201d<\/p>\n<p>The researchers noted that the challenge lies not only in medical knowledge but also in human interaction. Lead author Andrew Bean of the Oxford Internet Institute said that even top-performing large language models struggle in real-world interactions involving nuanced, high-stakes decisions.<\/p>\n<p>\u201cThese findings highlight the difficulty of building AI systems that can genuinely support people in sensitive areas like health,\u201d Payne said.<\/p>\n<p>Bean said the research aims to contribute to the development of safer and more reliable AI systems, particularly as interest in using chatbots for health advice continues to grow.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Using artificial intelligence chatbots to seek medical advice can be dangerous because of their tendency to provide inaccurate and 
inconsistent&hellip;<\/p>\n","protected":false},"author":1,"featured_media":49684,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[32,11],"tags":[3700,9338,9663,1009,5057,9578,2471,787],"class_list":["post-49683","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-mzansi","category-world","tag-advice","tag-chatbots","tag-dangerous","tag-give","tag-medical","tag-oxford","tag-study","tag-warns"],"_links":{"self":[{"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/posts\/49683","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/comments?post=49683"}],"version-history":[{"count":1,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/posts\/49683\/revisions"}],"predecessor-version":[{"id":49685,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/posts\/49683\/revisions\/49685"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/media\/49684"}],"wp:attachment":[{"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/media?parent=49683"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/categories?post=49683"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/eduzim.co.zw\/news\/wp-json\/wp\/v2\/tags?post=49683"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}