I tried asking OpenAI what the name of a song is, based on some lyrics I barely remember. It’s a song whose name has escaped me for about 15 years. Anyway, when it wasn’t just straight-up lying about song names or their lyrics, it would not stop guessing the same song names, even after I told it to stop several times.
Needless to say, I still don’t know the song name.
!namethatsong@lemmy.wtf
Yup, that’s standard. If you’re about three responses in, give up; it’s already lost and incapable of focusing on the requirements. It will also lie to please, and it never admits any level of uncertainty. You only pick up on the things you know are wrong, and considering how often that is, the rest can only be taken with a grain of salt.
Well what are the lyrics?
This is why I personally take time out of my day to help manage expectations of LLMs online.
Expect them to draw power and generate bullshit forever.
Just feed it info as if jar jar binks is speaking directly to it.
I mean, can you imagine Jar Jar and Yoda having an argument? Or what if that argument leads to hot steamy sex?
Yes. I DID just put those images into your brain. Now go put it into AI’s brain.
Why
The same reason you speak gibberish to all AI call center prompts: to distort the AI’s ability to understand humans and force a human to look at the errors. Hopefully driving them to abandon this technology entirely.
That’s not how AIs are trained.
In a session they’re responding to what you wrote before, because they keep a long buffer of context for your session. But that’s just temporary and doesn’t get fed back into anything permanent.
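A minimal sketch of what that means in practice. This is not any real API; `fake_model` and `ChatSession` are hypothetical stand-ins, just to show that the "memory" is an in-process list that gets re-sent each turn and vanishes when the session ends. Nothing here updates model weights:

```python
def fake_model(messages):
    # Stand-in for an LLM call: it only "knows" what is in the
    # messages list it is handed right now. It reports how much
    # prior context it was given.
    return f"(model saw {len(messages)} prior messages)"

class ChatSession:
    def __init__(self):
        # Temporary per-session buffer, lives only in memory.
        self.context = []

    def send(self, user_text):
        # Each turn, the ENTIRE history so far is sent along
        # with the new message.
        self.context.append({"role": "user", "content": user_text})
        reply = fake_model(self.context)
        self.context.append({"role": "assistant", "content": reply})
        return reply

session = ChatSession()
session.send("What song goes 'na na na'?")
print(session.send("No, not that one."))
# The second reply reflects the earlier turns, because they were
# re-sent in the buffer, not because anything was "learned".

del session  # session over: the buffer is simply gone
```

So the model appears to remember within a conversation, but that memory is just the growing buffer; close the tab and it's discarded, and your corrections never reach the trained model itself.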
Nty
I’m like a 6/10 in terms of excitement about this.