Hello, recent Reddit convert here, and I’m loving it. You even inspired me to figure out how to fully dump Windows and install LineageOS.
One thing I can’t understand is the level of acrimony toward LLMs. I see things like “stochastic parrot”, “glorified autocomplete”, etc. If you need an example, the comments section for the post on Apple saying LLMs don’t reason is a doozy of angry people: https://infosec.pub/post/29574988
While I didn’t expect a community of vibecoders, I am genuinely curious why LLMs provoke such an emotional response from this crowd. It’s a tool that has gone from interesting (GPT-3) to terrifying (Veo 3) in a few years, and I am personally concerned about many of the safety/control issues in the future.
So I ask: what is the real reason this is such an emotional topic for you in particular? My personal guess is that the claims about replacing software engineers are the biggest issue, but help me understand.
To me, it’s not the tech itself, it’s the fact that it’s being pushed as something it most definitely isn’t. They’re grifting hard to stuff an incomplete feature down everyone’s throats, while using it to datamine the everloving spit out of us.
Truth be told, I’m genuinely excited about the concept of AGI and the potential of what we’re seeing now. I’m also one who believes AGI will ultimately be our progeny and should be treated as such, as a being in itself. While we aren’t capable of creating that yet, we should still keep it in mind and mould our R&D around that principle. So, in addition to being disgusted by the current-day grift, I’m also deeply disappointed to see these people behaving this way, like madmen and cultists.
The people who own/drive the development of AI/LLM/what-have-you (the main ones, at least) are the kind of people who would cause the AI apocalypse. That’s my problem.
Agreed, the last people in the world who should be making AGI are the ones making it. Rabid techbro nazi capitalist fucktards who feel slighted that they missed out on (absolute, not wage) slaves and want to make some. Do you want terminators? Because that’s how you get terminators. Something with so much positive potential that is also an existential threat needs to be treated with far more respect.
Said it better than I did, this is exactly it!
Right now, it’s like watching everyone cheer on the obvious villain as he develops nuclear weapons.
I’ll just say I won’t grant any machine even the most basic human rights until every last person on the planet has access to enough clean water, food, shelter, adequate education, state-of-the-art health care, peace, democracy, and enough freedom to not limit the freedom of others. That’s the lowest bar, and if I can think of other essential things every person on the planet needs, I’ll add them.
I don’t want to live in a world where we treat machines like celebrities while we don’t look after our own. That would be an express ticket to disaster, as we’ve seen in many science fiction novels before.
Research towards AGI for AGI’s sake should be strictly prohibited until the tech bros figure out how to feed the planet, so to speak. Let’s give them an incentive to use their disruptive powers for something good before they play god.
While I disagree with your hardline stance on prioritisation of rights (I believe any conscious/sentient being should be treated as such at all times, which implies full rights and freedoms), I do agree that we should learn to take care of ourselves before we take on the incomprehensible responsibility of developing AGI, yes.