A man’s health deteriorated after he acted on medical advice from ChatGPT, resulting in a three-week hospital stay. The incident is a stark reminder of the limitations of current AI in the medical field.

The man had asked ChatGPT how to replace salt in his diet. The AI recommended sodium bromide, a substance used in some medications in the early 20th century but now known to be toxic in large quantities. Ignoring warnings, he bought sodium bromide and consumed it for three months, eventually developing alarming symptoms including severe fear, mental confusion, and excessive thirst.

Doctors intervened to stabilize his condition, and he was released after undergoing treatment to correct the imbalance caused by the sodium bromide poisoning. The case underscores the importance of consulting medical professionals for health-related decisions.