• eltrain123@lemmy.world
      · 5 months ago

      People can’t seem to understand that it’s a tool in the early stages of development. If you are treating it as a source of truth, you are missing the point of it entirely. If it tells you something about a person, that is not to be trusted as fact.

      Every bit of information you get from it should be researched and verified. It just gives you a good jumping-off point and a direction to look in, based on your prompting. You can drastically improve your results on any subject with good direction, especially a subject you don't know much about and are just starting to research. If you ask it for specific facts and expect it to regurgitate them, you are going to get bad information.

      If you are claiming damages from something you know gives false information, maybe you should learn how to use the tool before getting your feelings invested, so you can start using it more effectively in your own applications. If you want it to say something specific that can grab a headline, you can make it do that; it's just disingenuous and doesn't actually benefit the conversation, the technology, or the future.

      They have a long way to go to solve AGI, but the benefits to society along the way will outpace what current tools offer. At maturity, the technology has the potential to change major socio-economic structures, but it never gets there if people insist on treating it like it has intuition and is trying to hurt them while it is still being stood up.

      • buddascrayon@lemmy.world
        · 5 months ago

        If you’re wondering why you’re getting so many downvotes, it’s because you’re ignoring the fact that the companies that created these LLMs are passing them off as truth machines by plugging them directly into search engines and then asking everybody to use them as such. It’s not the fault of the people who trust these things; it’s the fault of the companies that create them and pass them off as something they’re not. And those companies need to face a reckoning.