• groats_survivor@lemmy.world · 20 hours ago

    Me to AI: alright, I’m about to send you a two-part message. Do not respond to the first message.

    AI: Gotcha! I won’t respond

    • Victor@lemmy.world · 10 hours ago

      How would it know not to respond to the first part without processing it first? The request makes no sense.

      Like telling a human, hey, don’t listen to this first part! Also don’t think about elephants!
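
      For what it’s worth, the model only ever sees one blob of context when it generates a reply, so every “part” you’ve sent is already in the input it reads. A toy sketch of that idea (plain Python, invented names, not any real chat API):

      # Toy illustration only: the reply is a function of the *entire*
      # conversation so far, so "part one" gets read no matter what the
      # instructions say.
      def toy_reply(history: list[str]) -> str:
          context = "\n".join(history)  # the "don't respond" instruction and part one are both in here
          return f"(reply generated from {len(context)} characters of context)"

      history = [
          "I'm about to send you a two-part message. Do not respond to the first part.",
          "part one",
      ]
      print(toy_reply(history))  # a reply still comes back, and part one was read to produce it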

    • davidgro@lemmy.world · 18 hours ago

      That could reasonably be interpreted as meaning you haven’t sent the first part yet.

      But I assume it still responds like that when you do.