• Grimy@lemmy.world · 4 months ago

    They already got rid of the loophole a long time ago. It’s a good thing tbh, since half the people using local models are doing it because OpenAI won’t let them do dirty roleplay. It’s strengthening their competition and showing why these closed models are such a bad idea. I’m all for it.

    • felixwhynot@lemmy.world · 4 months ago

      Did they really? Do you mean that phrase specifically, or are you saying it’s not currently possible to jailbreak ChatGPT?

      • Grimy@lemmy.world · edited · 4 months ago

        They usually take care of a jailbreak the week it’s made public. This one is more than a year old at this point.