I’ve been playing with Gemma4, and in one instance it took me about half an hour to get it to acknowledge a statement I made; it kept denying the contents of an official Google page I had asked it to search and parse. It lied, and at one point even suggested I was hallucinating! If you install it and try it, put it to the test, hard.

  • Chahk@beehaw.org
    17 days ago

    Here’s the thing. Every LLM “answer” is a hallucination, just some happen to be closer to reality than others.