10 comments

  • koito17 14 hours ago

    This reminds me of a story where an OCR error[1] likely contaminated training data (and the English language) with the term "vegetative electron microscopy". The article I linked also shows that some journals defended the legitimacy of the terminology.

    I'm not sure if this class of error really counts as a hallucination, but it nonetheless has similar consequences when people fail to validate model outputs.

    [1] https://news.ycombinator.com/item?id=43858655

    • ksaj 13 hours ago

      I think the same will happen over time with the AI voice-over slop that people don't bother correcting. This includes weird pronunciations, missing punctuation that leads to strangely intoned run-on sentences, and pronounced abbreviations like "ickbmm" instead of "I-C-B-M", or the opposite, "kay emm ess" instead of "kilometers", and so on.

  • mring33621 15 hours ago

    You're right!

    The correct diagnosis for your stated symptoms is that you have a Cloomie in your left Glompus.

    A daily megadose of Ivermectin, over a 7 day period, should resolve your condition.

    • collingreen 14 hours ago

      This is a common symptom of consuming the wrong news media or voting for the wrong party. Here are three suggestions that are better ideologically aligned to help you improve your health.

  • kazinator 14 hours ago

    > Now imagine your doctor is using an AI model to do the reading. The model says you have a problem with your “basilar ganglia,” [basal meaning at the base, ganglia meaning clusters of neuron cells: neuron clusters at the base of the brain] conflating the two names into an area of the brain that [D]oes [N]ot [E]xist[!] [Dramatic, serious stare into the camera.] You’d hope your doctor would catch the mistake and double-check the scan. But there’s a chance they don’t. [And that brings us to the emergency room, where you are now, a forty-nine software developer presenting with a psychotic obsession for fact-checking everything you read on the Internet.]

  • erelong 14 hours ago

    Sounds like just a typo, not "making up a body part"

    • poulpy123 7 hours ago

      Computers don't have the biological parts that make typos

    • ksaj 13 hours ago

      These are the kinds of "typos" that have blown up spacecraft, to give one example of many.

      What if you mix up Ilium and Ileum? How about Malleus and Malleolus? They sound pretty similar, but they are not the same thing.

  • canyp 14 hours ago

    The arrogance of calling it a "simple misspelling". We get it; you have orders from above to deploy AI and you're too pathetic to morally question the directive, but at least let's not pretend that LLMs make typos now. "Oh, oopsie, it was just a typo."