
Man Accidentally Poisons Himself Following Advice From AI

By Trending Newsfeed
August 14, 2025 at 12:27 pm
Filed in FaithTap, News, Wire
(PNC/Getty Images)


A 60-year-old man’s attempt to improve his health by using an artificial intelligence chatbot for dietary advice ended with a hospital stay after he accidentally poisoned himself, according to a case study published in the Annals of Internal Medicine.

The man had been looking for ways to remove table salt (sodium chloride) from his diet for health reasons. He turned to ChatGPT, a large language model, for guidance. According to the report, the AI suggested sodium bromide as a replacement. Sodium bromide resembles table salt, but it is toxic to ingest and is used primarily in cleaning, manufacturing, and agriculture.

For three months, the man used sodium bromide in his food. When he eventually sought medical care, doctors found he had developed bromism, a rare condition caused by long-term exposure to the chemical. Symptoms included fatigue, insomnia, poor coordination, excessive thirst, skin changes, paranoia, and even hallucinations.

Hospital staff noted the man believed his neighbor was trying to poison him. He attempted to leave the hospital at one point and was placed on a psychiatric hold for safety. Treatment included intravenous fluids, electrolyte replacement, and antipsychotic medication. After three weeks of monitoring, he was released.

Researchers involved in the case study said the situation highlights the potential risks of relying on AI for health decisions. They noted that sodium bromide had medical uses decades ago but is no longer prescribed for humans in the U.S. It is “highly unlikely,” they wrote, that a medical professional would have recommended it as a salt substitute.

ChatGPT Salt Swap Advice Lands Man in Hospital.

A 60-year-old man was hospitalized after following ChatGPT’s recommendation to replace table salt with sodium bromide, a toxic sedative banned for human use since the 1980s.#AI #ChatGPT #HealthWarning #Toxicity #PublicSafety pic.twitter.com/8RX3ziVzR4

— TechJuice (@TechJuicePk) August 13, 2025

Because the man’s original conversation with ChatGPT was not available, the researchers could not confirm the exact wording or context of the AI’s suggestion. They said large language models, like ChatGPT, are “language prediction tools” that can produce scientifically inaccurate or outdated information and should not replace professional medical judgment.

Dr. Jacob Glanville, CEO of Centivax, said AI systems generate answers by matching patterns in data rather than applying common sense. “This is a classic example of the problem,” he told Fox News Digital, explaining that the model may have recognized sodium bromide as a chemical alternative to sodium chloride in industrial contexts, not food.

Dr. Harvey Castro, an emergency physician and AI expert, stressed that large language models produce text based on statistical patterns, not fact-checking. He cautioned that without regulation and oversight, similar incidents could happen again. He recommended safeguards such as built-in medical databases, risk alerts, and combined human-AI oversight when giving health-related responses.

OpenAI, which developed ChatGPT, told Fox News Digital its system is “not intended for use in the treatment of any health condition” and is “not a substitute for professional advice.” The company said it has safety teams working on reducing risks and trains its systems to encourage users to seek guidance from qualified professionals.

The case serves as a reminder that while AI tools can provide information quickly, they are not a substitute for medical expertise. Experts warn that even when an answer sounds convincing, it may not be safe — and without careful human judgment, the results can be dangerous.

Tags: Trending Herald, U.S. News