Developers building RAG applications want to overcome the inherent problem of hallucinations in LLMs. In this talk, I'll explain why hallucinations occur, how RAG mitigates them, and how the HHEM (Hallucination Evaluation Model) can help detect them.