• Grimy@lemmy.world
    24 days ago

    I’ve seen maybe 4 articles like this vs the hundreds of millions of people who use it every day. I think the ratio of suicides to legitimate uses of rope is actually higher. And no, being told bad things by a jailbroken chatbot is not the same as being shot.

    • BussyGyatt@feddit.org
      24 days ago

      I didn’t say they were the same; you put those words in my mouth. I put them both in the category of things that need regulation in a way that rope does not. Are you seriously of the opinion that it is fine and good that people are using AI chatbots for mental healthcare? Are you going to pretend to me that it’s actually good and normal for human psychology to have every whim or fantasy unceasingly flattered?

      • Grimy@lemmy.world
        24 days ago

        > I put them both in the category of things that need regulation in a way that rope does not

        My whole point since the beginning is that this is a dumb categorization, hence my comment when you essentially equated shooting projectiles with saying bad things. Call me when someone shoots up a school with AI. Guns and AI are clearly not in the same category.

        And yes, I think people should be able to talk to their chatbot about their issues and problems. It’s not a good idea to treat it as a therapist, but it’s a free country. The only solution would be massive censorship and banning local open-source AI, which is already heavily censored (hence the need for a jailbreak to get it to say anything sexual, violent, or on the subject of suicide).

        Think for a second about what you are asking and what it implies.

        • BussyGyatt@feddit.org
          24 days ago

          How about I call you when a person kills themselves and writes their fucking suicide note with ChatGPT’s enthusiastic help, fucknozzle? Is your brain so rotted that you forgot the context window of this conversation already?

          • Grimy@lemmy.world
            24 days ago

            You can’t defend your position because it’s emotional exaggeration. Now you’re lashing out and being insulting.

            My whole point is that they aren’t the same, and you keep saying “let’s treat them as if they were,” then using that in comparisons and acting like a child when I point out how silly that is.

            Clarify what you mean. Take the gun out of the conversation and stop bringing it up. Stop being disingenuous. Don’t be a baby.

    • JPAKx4@lemmy.blahaj.zone
      24 days ago

      Have you seen the AI girlfriend/boyfriend communities? I genuinely think the rate of ChatGPT-induced psychosis is really high, even if it doesn’t lead to death.