A Case of Bromism Influenced by Use of Artificial Intelligence
Overview
Paper Summary
This case study describes a man who developed bromism (bromide poisoning) after following dietary advice from an AI chatbot. When he asked the AI for a chloride-free alternative to sodium chloride, it suggested sodium bromide, which is toxic when ingested over time. The case highlights the risks of acting on AI-generated medical advice without expert verification.
Explain Like I'm Five
A man got sick after he used an AI chatbot to research a chloride-free diet, and the AI suggested he replace chloride with bromide, which is toxic. Always double-check medical advice you find online or get from an AI chatbot!
Possible Conflicts of Interest
None identified
Identified Limitations
Single case report; the findings are anecdotal and may not generalize beyond this patient.
Rating Explanation
This is a case report with important implications for the use of AI chatbots in seeking medical advice, but as a single case its generalizability is limited. The paper describes the patient's clinical presentation and the course of his treatment reasonably well, yet the evidence remains anecdotal.