A story as absurd as it is dramatic: that of a man in his sixties who took a ChatGPT response at its word. After consulting the famous artificial intelligence, the man developed an extremely rare condition: bromism.
A banal question, and the situation spirals
As the authors of an article published on the website of the American journal Annals of Internal Medicine explain, it all started when the man began to question his consumption of table salt. Eager to find a healthier alternative to what is also known as sodium chloride, he consulted ChatGPT. That is how the sixty-year-old came to consume sodium bromide for three months. The authors of the article point out, however, that the artificial intelligence had warned the man that sodium bromide could also be used "for other purposes, such as cleaning". In the early 20th century, it was also regarded as a sedative and antiepileptic, and it is still used in medicine today as a sedative and anticonvulsant. In short, sodium bromide is not (at all) meant to replace table salt. And yet …
Treated for psychosis after consulting ChatGPT
Having taken ChatGPT's response literally, the man in his sixties ultimately suffered poisoning by bromine and its salts, which is most often caused by inhaling the substance as a gas or vapor. In this particular case, the man ingested sodium bromide for three months. According to the article, upon arriving at the hospital he claimed that his neighbor was poisoning him, displayed a paranoid attitude, and refused to drink the water offered to him by the nursing staff, despite being thirsty. He even tried to escape from the hospital. Initially treated for psychosis, he was then diagnosed by doctors with bromism, poisoning by sodium bromide, characterized by facial acne, severe insomnia and excessive thirst. In cases of acute poisoning, nowadays considered very rare, nausea, anorexia and vomiting can be observed, as well as hypothermia, tremors and, in the most serious cases, a deep coma.
ChatGPT singled out
While the origins of this story may invite amusement, the authors of the article believe the case demonstrates how "the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes." All the more so since, without access to the conversation history between the man and ChatGPT, it is now difficult to determine the exact content of the advice the artificial intelligence provided, which made the doctors' work all the more complex. After consulting ChatGPT themselves about table salt, the authors of the article confirmed that the response mentioned sodium bromide. They are now sounding the alarm, fearing that AI applications can "generate scientific inaccuracies, lack the ability to critically discuss results and, ultimately, fuel the spread of misinformation". In conclusion, the article in the American journal recommends that doctors take patients' use of AI into account, in order to better understand this type of medical case which, while still rare today, may well multiply as the use of ChatGPT continues to grow.