Pro@programming.dev to Technology@lemmy.world · English · 2 months ago
The Collapse of GPT: Will future artificial intelligence systems perform increasingly poorly due to AI-generated material in their training data? (cacm.acm.org)
53 comments
Grandwolf319@sh.itjust.works · 2 months ago
Maybe, but even if that's not an issue, there is a bigger one: the law of diminishing returns. To double performance, it takes much more than double the data. Right now LLMs aren't profitable even though they're more efficient than just using more data.
All this AI craze has taught me is that the human brain is super advanced, given its performance even though it runs on the energy of a light bulb.
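A quick back-of-the-envelope sketch of that diminishing-returns claim, assuming a Chinchilla-style power law for the data-dependent term of the loss. The constants below are illustrative assumptions, not fitted values; only the exponent is in the ballpark of published scaling-law estimates.

```python
# Rough illustration of diminishing returns in data scaling.
# Assumes the data term of the loss follows a power law:
#   L(D) = E + B / D**beta
# All constants are hypothetical, chosen for illustration only.

E = 1.69      # irreducible loss (hypothetical)
B = 410.0     # data-term coefficient (hypothetical)
beta = 0.28   # data scaling exponent (roughly the Chinchilla-paper estimate)

def reducible_loss(tokens: float) -> float:
    """Reducible part of the loss at a given number of training tokens."""
    return B / tokens ** beta

D = 1e12                         # start from a trillion tokens
target = reducible_loss(D) / 2   # aim to halve the reducible loss

# Solve B / D_new**beta = target  =>  D_new = (B / target) ** (1 / beta)
D_needed = (B / target) ** (1 / beta)

print(f"data multiplier to halve reducible loss: {D_needed / D:.1f}x")
# With beta ~ 0.28 this works out to 2**(1/0.28) ~ 12x the data,
# far more than the 2x a linear relationship would suggest.
```

Under these assumptions, halving the remaining loss takes roughly an order of magnitude more data, which is the sense in which "double the performance" costs much more than double the data.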
rottingleaf@lemmy.world · 2 months ago
"All this AI craze has taught me is that the human brain is super advanced, given its performance even though it runs on the energy of a light bulb."
That seemed superficially obvious. The human brain is a system whose optimization took the energy of evolution since the start of life on Earth; that is, an incomparably larger amount of data. It's like comparing a barrel of oil to a barrel of soured milk.
AItoothbrush@lemmy.zip · 2 months ago
It's very efficient specifically at what it does. When you do math in your brain it's very inefficient, the same way doing brain stuff on a math machine is.
RaptorBenn@lemmy.world · 2 months ago
If it weren't a fledgling technology with a lot more advancements yet to be made, I'd worry about that.