Addressing AI Hallucinations for Improved Business Performance
InMoment XI
NOVEMBER 19, 2024
Factual contradiction occurs when an LLM presents an incorrect answer as fact — for example, crediting the wrong telescope with the first photos of an exoplanet, when it was actually the European Southern Observatory's Very Large Telescope (VLT) that captured them back in 2004. Random hallucination occurs when the model's output has no connection to the prompt at all.