A “natural language query” search engine is what I need sometimes.

  • BrikoX@lemmy.zip
    7 months ago

    You are missing the point. You don’t have to become a subject expert to verify the information. Not all sources are the same: some are incorrect on purpose, some are incorrect due to lax standards. As a thinking human being, you can decide to trust one source over another. But an LLM treats all the information it was trained on as 100% correct. So it can generate factually incorrect information while presenting it to you as 100% factually correct.

    Using LLMs as a shortcut to find something is like playing Russian roulette: you might get correct information 5 out of 6 times, but sooner or later that sixth answer will be wrong, and you won’t know which one it was.