<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[NPR: “ChatGPT might give you bad medical advice, studies warn”]]></title><description><![CDATA[<p>NPR: “ChatGPT might give you bad medical advice, studies warn”</p><p><a href="https://infosec.exchange/tags/AI" rel="tag">#<span>AI</span></a> <a href="https://infosec.exchange/tags/ChatGPT" rel="tag">#<span>ChatGPT</span></a> <a href="https://infosec.exchange/tags/medicine" rel="tag">#<span>medicine</span></a> <a href="https://infosec.exchange/tags/medical" rel="tag">#<span>medical</span></a> <a href="https://infosec.exchange/tags/health" rel="tag">#<span>health</span></a> </p><p><a href="https://www.npr.org/2026/03/11/nx-s1-5744035/chatgpt-might-give-you-bad-medical-advice-studies-warn" rel="nofollow noopener"><span>https://www.</span><span>npr.org/2026/03/11/nx-s1-57440</span><span>35/chatgpt-might-give-you-bad-medical-advice-studies-warn</span></a></p>]]></description><link>https://postcall.pub/topic/615de7b6-d1e1-41bb-a47b-b2b458ca1ba9/npr-chatgpt-might-give-you-bad-medical-advice-studies-warn</link><generator>RSS for Node</generator><lastBuildDate>Tue, 07 Apr 2026 15:50:06 GMT</lastBuildDate><atom:link href="https://postcall.pub/topic/615de7b6-d1e1-41bb-a47b-b2b458ca1ba9.rss" rel="self" type="application/rss+xml"/><pubDate>Thu, 12 Mar 2026 02:44:10 GMT</pubDate><ttl>60</ttl><item><title><![CDATA[Reply to NPR: “ChatGPT might give you bad medical advice, studies warn” on Thu, 12 Mar 2026 02:48:11 GMT]]></title><description><![CDATA[<p><span><a href="/user/scottwilson%40infosec.exchange">@<span>scottwilson</span></a></span> When I was in PA training, I experimented with using LLMs for studying.</p><p>We had a lecture on blood clotting medications. 
Can't remember the details, but I asked ChatGPT about using medication in a situation where the blood is too thin... It recommended using a blood-thinning medication in that scenario. My theoretical patient would have fucking died!!! I stopped using it after that.</p><p>I'd like to say LLMs have moved forward since, but glue pizza and rock eating prove otherwise <img src="https://postcall.pub/assets/plugins/nodebb-plugin-emoji/emoji/android/1f644.png?v=cec5bbd054b" class="not-responsive emoji emoji-android emoji--face_with_rolling_eyes" style="height:23px;width:auto;vertical-align:middle" title="🙄" alt="🙄" /> .</p>]]></description><link>https://postcall.pub/post/https://mstdn.social/users/QueerMatters/statuses/116213919995902476</link><guid isPermaLink="true">https://postcall.pub/post/https://mstdn.social/users/QueerMatters/statuses/116213919995902476</guid><dc:creator><![CDATA[queermatters@mstdn.social]]></dc:creator><pubDate>Thu, 12 Mar 2026 02:48:11 GMT</pubDate></item></channel></rss>