ChatGPT Sent Me to the ER

21 points | posted 12 hours ago
by tedsanders

14 Comments

mwetzler

11 hours ago

My dad has a similar story. The voice of reason can be very helpful for people who take pride in telling themselves “it’s fine”. Thanks, Chat!

boopity2025

12 hours ago

Wild to think we’ve reached the point where “my AI told me to go to the ER” is a plausible sentence and not the setup to a Black Mirror episode. Pre-ChatGPT, you’d Google “droopy eyelid” and get a mix of WebMD hypochondria and SEO-bait wellness blogs. Now you get a differential diagnosis, a list of red flags, and a gentle shove toward not dying.

AI had carotid dissection in mind from the first message, just quietly waiting for the plot to thicken.

Sure, there’s a lot to worry about with AI, but in this case it basically played the role of the one friend who says “you look weird, go to the doctor” and turns out to be right. Which is both comforting and slightly terrifying.

zahlman

11 hours ago

> AI had carotid dissection in mind from the first message

This does not follow from the evidence presented, even if we disregard questions of what "mind" means in this context. It's entirely plausible that the possibility of carotid dissection only made sense to consider partway through the conversation.

mcphage

6 hours ago

I agree about the issue of “mind”, but the rest is mentioned directly in the article:

> When I scrolled to the top of the chat where I’d first reported my neck pain, I saw that ChatGPT had mentioned a possible carotid dissection in its very first response to me. It started as a low probability guess that increased in certainty as my symptoms progressed. I find it striking that it was considering that possibility from my very first message.

hn_throw2025

10 hours ago

I can add to this. About a month ago, a friend had abdominal pains but was reluctant to go to A&E (Emergency Room).

I had my suspicions, but checked them with ChatGPT. The LLM said it was highly likely appendicitis, that he should seek urgent medical attention, and that he should not eat or drink (other than water), as they might need to operate quite soon.

I passed it on, he went to A&E, and it all played out that way.

I’ve since switched my subscription to Gemini for work-related reasons, but it has also been very helpful in my gastritis recovery as I try to avoid flare-ups from dietary choices.

A typical HN stance is to wait for this fad to go away, but it certainly does have uses for me (I’m currently being briefed by Gemini on an unfamiliar DIY task).

satisfice

9 hours ago

Anecdotal evidence: the gold standard!

anovikov

11 hours ago

[flagged]

zahlman

11 hours ago

It's late night in North America; you said this less than an hour after the post went up; and plenty of posts get little traction on HN (including submissions of links that later become very popular on a separate submission or from the curated "second chance" queue).

zahlman

11 hours ago

This title is clickbait. The implication ("following ChatGPT advice caused an emergency requiring an ER visit") is nearly the opposite of the central claim made ("ChatGPT encouraged me to go to the ER, and it turned out to be a life-saving decision").

wiseowise

10 hours ago

That’s your interpretation.

When I read the title, I thought of the positive case [ChatGPT saved my life], not the negative one.

mrjay42

5 hours ago

Asking about the logic here: this is a form of fallacy... kinda?

His “interpretation”, or guess, wasn’t better or worse than yours.

Therefore, his statement about a misleading title is not invalidated just because you guessed in the opposite direction.

Let's say:

Hypothesis 1: the article is negative (ChatGPT gave bad medical advice, and that led to the E.R.). Then:

* His guess -> is correct

* Your guess -> is wrong

Hypothesis 2: the article is positive. Then:

* His guess -> is wrong

* Your guess -> is correct

Conclusion: either way, you had NO way to know beforehand.

So, in what way does pointing out that this is “his interpretation” invalidate anything he said?

Of course it is his guess, but based on the title alone it's at least as valid a guess as yours.

I say “at least” because it's not unreasonable to think that an LLM might have hallucinated some medical advice, leading someone into an unhealthy practice that sent them to the E.R.

zahlman

2 hours ago

> That’s your interpretation.

It's the ordinary understanding of the idiom.

weikju

7 hours ago

Same here. It's very similar to a lot of the "Apple Watch saved my life" stories/headlines.