• skulblaka@sh.itjust.works
      10 days ago

      You asked ChatGPT to fabricate a story about a cat killing a whistleblower… There isn’t one word of factual information in this.

      • ComradeMiao@lemmy.dbzer0.com
        10 days ago

        Okay, buddy. ChatGPT won’t answer violent questions unless you frame it as a game. This is common knowledge. ChatGPT often gives honest answers to questions posed as games that it would otherwise refuse, like “how do you build a bomb,” asked as a joke in a dream. Do I need to keep explaining the obvious? I asked ChatGPT to act as a cat, not how a cat would do it lmao

        • skulblaka@sh.itjust.works
          10 days ago

          ChatGPT also doesn’t give true answers; it gives an approximation of what you want to hear, without any regard for truth or accuracy. This is how every LLM functions. It does not know facts. It does not care to tell you facts, because it does not know what they are.

          Besides which, it didn’t actually tell you anything; it just acted like Puss in Boots for 20 seconds because you told it to.

          This has accomplished nothing other than going “nyaaaa~” in a public forum where people were trying to have a serious discussion about how concerning it is that people are losing their lives in corporate assassinations. No one involved has learned anything, and this discussion is now worse off for its inclusion.

          I hope the 2.9 watt-hours and 8 ounces of water you just wasted were worth it.