Google's healthcare AI made up a body part – what if doctors don't notice?

22 points, posted 11 hours ago
by tessierashpool9

10 Comments

koito17

9 hours ago

This reminds me of a story where an OCR error[1] likely contaminated training data (and the English language) with the term "vegetative electron microscopy". The article I linked also shows that some journals defended the legitimacy of the terminology.

I'm not sure if this class of error really counts as a hallucination, but it nonetheless has similar consequences when people fail to validate model outputs.

[1] https://news.ycombinator.com/item?id=43858655

ksaj

8 hours ago

I think the same will happen over time with the AI voice-over slop that people don't bother correcting: weird pronunciations, missing punctuation that produces oddly intoned run-on sentences, abbreviations pronounced as words like "ickbmm" instead of "I-C-B-M", or the opposite, "kay emm ess" spelled out instead of "kilometers", and so on.

mring33621

11 hours ago

You're right!

The correct diagnosis for your stated symptoms is that you have a Cloomie in your left Glompus.

A daily megadose of Ivermectin, over a 7-day period, should resolve your condition.

collingreen

10 hours ago

This is a common symptom of consuming the wrong news media or voting for the wrong party. Here are three suggestions that are better ideologically aligned to help you improve your health.

kazinator

10 hours ago

> Now imagine your doctor is using an AI model to do the reading. The model says you have a problem with your “basilar ganglia,” [basal meaning at the base, ganglia meaning clusters of neuron cells: neuron clusters at the base of the brain] conflating the two names into an area of the brain that [D]oes [N]ot [E]xist[!] [Dramatic, serious stare into the camera.] You’d hope your doctor would catch the mistake and double-check the scan. But there’s a chance they don’t. [And that brings us to the emergency room, where you are now, a forty-nine-year-old software developer presenting with a psychotic obsession for fact-checking everything you read on the Internet.]

erelong

9 hours ago

Sounds like just a typo, not "making up a body part"

poulpy123

2 hours ago

Computers don't have the biological parts that make typos

ksaj

8 hours ago

They are the kind of "typo" that blew up the space shuttle, to give one example among many.

What if you mix up Ilium and Ileum? How about Malleus and Malleolus? They sound pretty similar, but they are very different things.

canyp

10 hours ago

The arrogance of calling it a "simple misspelling". We get it; you have commands from above to deploy AI and you're too pathetic to morally question the directive, but at least let's not pretend that LLMs make typos now. "Oh, oopsie, it was just a typo."