• bradd@lemmy.world
    19 days ago

    As a side note, I feel like this take is intellectually lazy. A knife can't be used or handled like a spoon because it's not a spoon. That doesn't mean the knife is bad; in fact, knives are very good, but they do require more attention and care. LLMs are great at cutting through noise to get you closer to what is contextually relevant, but they're not search engines, so, like with a knife, you have to be keenly aware of the sharp end when you use them.

    • Nalivai@lemmy.world
      edited · 8 days ago

      LLMs are great at cutting through noise

      Even that is not true. It doesn't have the aforementioned criteria for truth, and you can't make it have one.
      LLMs are great at generating noise that humans have a hard time distinguishing from real text. Nothing else. There are indeed applications for it, but due to human nature, people assume that because the text looks coherent, the information it contains will also be reliable, which is very, very dangerous.