I heard them described as bullshitting machines. They have no concept of, or regard for, truth or lies, and just spout whatever sounds good. Much of the time it’s true. Too often it’s not. Sometimes it’s hard to tell the difference.
Yeah, but the point of the post is to highlight bias - and if there’s one thing an LLM has, it’s bias. I mean that literally: given their probabilistic nature, it could be said that the only thing an LLM consists of is a bias toward certain words given other words (the weights, to oversimplify).
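To make that "bias toward certain words given other words" concrete, here’s a toy bigram model - a deliberately crude sketch, not how a real LLM works (those learn biases over subword tokens via billions of trained weights), but it shows the same shape of thing: the model literally contains nothing except conditional word preferences.

```python
from collections import defaultdict, Counter

# Toy corpus; the "model" is nothing but counts of which word follows which.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_bias(word):
    """Return the model's bias: P(next word | this word)."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_bias("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

There’s no notion of truth anywhere in there - only "given 'the', 'cat' is the most biased-toward continuation." Scale that idea up enormously and you get something much more capable, but still built entirely out of biases.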
I mean, LLMs can and will produce completely nonsensical outputs. It’s less AI and more like bad text prediction.
“Regurgitation machine prone to hallucinations” is my go-to for explaining what LLMs really are.
Will they still be like that in ten years?