Paper Summary
Paperzilla title
Man Poisons Himself Following ChatGPT's Bad Diet Advice (n=1)
This case report describes a man who developed bromism (bromide poisoning) after following dietary advice he reportedly received from an AI chatbot. He had asked the AI for a chloride-free alternative to sodium chloride, and it suggested sodium bromide, which is toxic when ingested over time. The case highlights the risks of acting on AI-generated medical advice without expert verification.
Possible Conflicts of Interest
None identified
Identified Weaknesses
Case study, limited generalizability
As a single case report, its findings can't be generalized to other patients or circumstances, and there is no way to know whether similar cases have occurred but gone unreported.
No record of the patient's interaction with the AI
The authors had no record of the patient's conversation with ChatGPT, so exactly what the AI recommended cannot be determined with certainty. This weakens the causal link between the AI advice and the patient's bromism.
Unclear how representative the authors' AI chatbot test was
The authors did test ChatGPT themselves, but it's unclear how closely their queries reproduced what the patient actually asked; different prompts, and even repeated identical prompts, can yield different responses.
Rating Explanation
This is a case report. It raises important implications about the use of AI chatbots for medical advice, but it describes a single patient, which limits generalizability. The paper documents the patient's clinical presentation and treatment course reasonably well, yet the evidence remains anecdotal.
Good to know
This is our free standard analysis. Paperzilla Pro fact-checks every citation, researches author backgrounds and funding sources, and uses advanced AI reasoning for more thorough insights.
File Information
Original Title:
A Case of Bromism Influenced by Use of Artificial Intelligence
Uploaded:
August 11, 2025 at 06:36 PM
© 2025 Paperzilla. All rights reserved.