A 60-year-old man with no psychiatric history replaced his table salt with sodium bromide for 3 months on the advice of ChatGPT.
He developed hallucinations, paranoia, and confusion, which led to a 3-week involuntary psychiatric hold.
Physicians diagnosed bromism, a now-rare syndrome of chronic bromide toxicity. After antipsychotic treatment and electrolyte therapy, he recovered and was discharged.
Doctors replicated the query on ChatGPT 3.5 and found that it also suggested bromide, with no health warning and no request for context about why the substitution was wanted.
The incident highlights the serious risks of acting on AI-generated medical advice without consulting a healthcare professional.
Source: Wainberg, Z. et al. (2025). Annals of Internal Medicine. As cited in Unwaste the Planet.


