Even blenders have safeguards, though: if the pitcher isn't installed, most won't run. I don't think it's unreasonable to require some sort of safety with LLMs.
I think the metaphor is that fine-tuning an LLM for 'safety' is like trying to engineer the blades to be "finger safe," when the better approach is to guard against fingers getting into an active blender in the first place.
Fine-tuning LLMs to be safe just isn't going to work, but building stricter usage structures around them, like tools, will.
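Concretely, I mean something like this toy sketch: the model can only reach an explicit allow-list of tools, so the guard lives outside the model instead of in its weights. All the names here (run_tool, TOOL_REGISTRY, the example tools) are made up for illustration.

```python
from typing import Any, Callable

# Hypothetical tool implementations: safe, narrow capabilities only.
def search_docs(query: str) -> str:
    return f"results for {query!r}"

def get_weather(city: str) -> str:
    return f"weather for {city}"

# The allow-list: the model can only reach what is registered here.
TOOL_REGISTRY: dict[str, Callable[..., str]] = {
    "search_docs": search_docs,
    "get_weather": get_weather,
}

def run_tool(name: str, args: dict[str, Any]) -> str:
    """Execute a model-requested tool call, enforcing the allow-list."""
    if name not in TOOL_REGISTRY:
        # The guard, not the model's fine-tuning, blocks the call.
        raise PermissionError(f"tool {name!r} is not allow-listed")
    return TOOL_REGISTRY[name](**args)

# A model asking for "delete_all_files" simply can't get it:
print(run_tool("search_docs", {"query": "blender safety"}))
try:
    run_tool("delete_all_files", {"path": "/"})
except PermissionError as e:
    print(e)
```

The point is that the refusal lives in run_tool, not in the model's weights. Like the blender interlock, it's structural, so it doesn't matter how cleverly you prompt the model.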
But this goes against Altman's assertion that they're magic AGI crystal balls (in progress), which would burst the bubble he's propping up.
The pitcher doesn't stop you from sticking your fingers in if you try; it just makes accidents less likely. Same thing here.