Could asking ChatGPT for diet advice put your health at risk? One man's question led to a hospital visit.
We’ve all heard the old saying, 'Don’t believe everything you read on the internet.'
But in the age of artificial intelligence, it seems we need to update that to: 'Don’t believe everything a chatbot tells you—especially when it comes to your health!'
We like to think we’re in control when it comes to our health. We listen, we search, we ask—even if it’s not always the right source.
But one case shows what can happen when technology offers advice without context, and a person takes it too literally.
A recent case from the United States has highlighted just how risky it can be to take medical advice from AI tools like ChatGPT.
A 60-year-old man, hoping to make a healthy change to his diet, ended up in hospital after following advice he received from the popular chatbot.

A 60-year-old man was hospitalised after following ChatGPT’s advice to substitute table salt (sodium chloride) with sodium bromide, leading to bromide toxicity (bromism). Image source: Solen Feyissa / Unsplash. Disclaimer: This is a stock image used for illustrative purposes only and does not depict the actual person, item, or event described.
The man in question wanted to cut down on sodium chloride—better known as table salt—after reading about its potential health risks. Like many of us, he turned to the internet for answers.
But instead of consulting a doctor or a reputable health website, he asked ChatGPT how to remove sodium chloride from his diet.
The chatbot, drawing from its vast (but not always accurate) pool of information, suggested that sodium chloride could be swapped for sodium bromide.
The man, who had studied nutrition in college and was eager to experiment, ordered sodium bromide online and used it in place of table salt for three months. 'Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,' the report noted.
Unfortunately, sodium bromide is not a safe substitute for table salt.
In fact, bromide toxicity—known as bromism—was a well-known medical issue in the early 20th century, when bromide salts were used in over-the-counter medications for insomnia and anxiety.
Too much bromide can cause a range of symptoms, from skin problems to severe neuropsychiatric issues.
After months of consuming sodium bromide, the man began to experience alarming symptoms: paranoia, hallucinations, extreme thirst, and unusual skin changes like facial acne and cherry angiomas.

The man experienced serious symptoms including paranoia, hallucinations, facial acne and cherry angiomas due to ingesting sodium bromide for three months. Image source: Tim Mossholder / Unsplash. Disclaimer: This is a stock image used for illustrative purposes only and does not depict the actual person, item, or event described.
The man had no previous medical or psychiatric history. Yet within the first 24 hours of his hospital stay, he showed signs of severe psychological distress, including paranoia and hallucinations.
'He was noted to be very thirsty but paranoid about water he was offered,' the report stated.
Doctors diagnosed him with bromism, and he required three weeks of treatment with fluids and electrolytes before he was well enough to be discharged.
The case, published in a journal of the American College of Physicians, serves as a stark warning about the dangers of relying on AI for health advice.
The authors noted that while chatbots like ChatGPT can be helpful for general information, they are not equipped to provide safe, personalised medical guidance.
In fact, they can generate scientific inaccuracies and spread misinformation—sometimes with serious consequences.
It’s tempting to ask Google or ChatGPT about every ache, pain, or dietary question. After all, it’s quick, easy, and doesn’t require an appointment.
But as this case shows, AI tools don’t have the ability to critically assess your unique medical history, symptoms, or needs. They can’t examine you, run tests, or provide the nuanced advice that a real doctor can.
Even OpenAI, the company behind ChatGPT, is clear about this in its Terms of Use: 'You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.'
The company adds that its tools are 'not intended for use in the diagnosis or treatment of any health condition.'
Read more: ‘Easy savings, guys’: Why this AI trick is helping Aussies outsmart checkout prices
Key Takeaways
- A 60-year-old man was hospitalised after following ChatGPT’s advice to substitute table salt (sodium chloride) with sodium bromide, leading to bromide toxicity (bromism).
- The man experienced serious symptoms including paranoia, hallucinations, facial acne and cherry angiomas due to ingesting sodium bromide for three months.
- The case highlights the dangers of seeking health advice from AI chatbots like ChatGPT, which can produce inaccurate or misleading information.
- OpenAI warns in its Terms of Use that ChatGPT should not be relied upon for factual information or as a substitute for professional medical advice, diagnosis or treatment.
Have you ever received questionable health advice online? Or do you have a story about a time when 'Dr. Google' led you astray? We’d love to hear your experiences and tips for finding trustworthy information. Share your thoughts in the comments below!