If that’s true, how come there isn’t a single serious project written exclusively or mostly by an LLM? There isn’t a single library or remotely original application made with Claude or Gemini. Not one.
My last employer had many internal tools that were fine.
They had only a moderate amount of oversight.
I had to find a new job, I’m actually thinking of walking away from software development now that there are so few jobs :(
It sucks but there’s no sense pretending this won’t have a large impact on the job landscape.
What did these tools do? I don’t see any LLM being used to create anything working from scratch, without the human prompter doing most of the heavy lifting.
Mostly internal data-cleaning stuff, close, etc., which I accept is less in scope than your original comment.
The things you are describing sound like if-statement levels of automation, GitHub Actions with preprogrammed responses rather than LLM whatever.
If you’re worrying about being replaced by that… Go find the code, read it, and feel better.
The code was non-trivial and relatively sophisticated. It performed statistical analysis on ingested data, and the approach taken was statistically sound.
I was replaced by that. So was my colleague.
The job market is exceptionally tough right now, and a large part of that is certainly LLMs.
I think taking people with statistical training out of the equation is quite dangerous, but it’s happening. In my area, everybody doing applied mathematics, statistics or analysis has been laid off.
In saying that, the produced program was quite good.
Certainly sounds more interesting than my original read of it! Sorry about that, I was grumpy.
Let’s wait for any LLM to do a single successful MR on GitHub first before starting a project on its own. I’m not aware of any.
there isn’t a single serious project written exclusively or mostly by an LLM? There isn’t a single library or remotely original application
IMHO “original” here is the key. Yet another clone of a web framework ported from one language to another in order to push a basic CMS online slightly faster, that I can imagine. In fact I’d even bet that LLMs, because they manipulate words in languages and because code can be safely (even though not cheaply) tested within containers, could be an interesting solution for that.
… but that is NOT really creating value for anyone, unless that person is technically very savvy and thus able to understand why a framework in one language over another creates new opportunities (say safety, performance, etc.). So for somebody who is not that savvy, “just” relying on the numerous already-existing open-source projects providing exactly the value they expect, there is no incentive to re-invent.
For anything that is genuinely original, i.e. something that is not a port to another architecture, a translation to another language, or a slight optimization, but rather something that needs actual reasoning and evaluation against the value created, I’m very skeptical, EVEN with a radical drop in costs.
If you go forward 12 months the AI bubble will have burst. If not sooner.
Most companies that bought into the hype are now (or will soon be) realizing it’s nowhere near the ROI they hoped for, that the projects they’ve been financing are not working out, and that forcing their people to use Copilot did not bring significant efficiency gains. More and more are also realizing they’ve been exchanging private and/or confidential data with Microsoft, and boy, there’s a shitstorm gathering on that front.
If you have the ability to build an AI app in house, holy shit, that can improve productivity. Copilot itself for office use… meh, so far.
The most successful ML in-house projects I’ve seen took at least three times as long as initially projected to become usable, and the results were underwhelming.
You have to keep in mind that most of the corporate ML undertakings are fundamentally flawed because they don’t use ML specialists. They use eager beavers who are enthusiastic about ML and entirely self-taught and will move on in 1 year and want to have “AI” on their resume when they leave.
Meanwhile, any software architect worth their salt will diplomatically avoid giving you any clear estimate for anything having to do with ML, because it’s basically a black box full of hopes and dreams. They’ll happily give you estimates and build infrastructure around the box, but refuse to touch the actual thing with a ten-foot pole.
But coding never was the difficult part. It’s understanding a concept, identifying a problem, and solving it with the available methods. AI just makes the coding part faster and gives me options to identify a possible solution more quickly. Thankfully there’s a never-ending pile of projects, issues, todos, and stakeholder wants, so I don’t see how we’d need fewer programmers. Maybe we need more to deal with AI, since now people can do a lot more in house instead of outsourcing, but as soon as that threshold is reached, companies will again contact large software companies. If people want to put AI into everything, you need people feeding the AI with company-specific data and instructing people to use it.
All I see is middle management getting replaced, because instead of a boring meeting, I could just ask an AI.
It’s been said before, but the whiter your collar, the more likely you are to be replaced by AI, simply because the grunts tend to do more varied, less plebeian things.
Middle managers tend to write a lot of documents and emails which is something AI excels at. The programmers meanwhile have to come up with creative solutions to problems, and AI is less good at being creative, it basically just copy pastes known solutions from the web.
Realises devs have always joked about their jobs just being about copy-pasting solutions from StackOverflow 80% of the time
Oh God…
I dread meetings and I can’t wait for AIs to replace those managers. Or perhaps we’ll have even more meetings, because management wants to know why we’re so late despite the AI happily churning out meaningless code that looks as awesome as all that CSI “GUI in Visual Basic” crap.
That’s when you write an AI auto reply cron. Let the snake eat its tail. Hehe
Guys selling something claim it will make you taller and thinner, your dick bigger, your mother in law stop calling, and work as advertised.
I guess the programmers should start learning how to mine coal…
We will all be given old-school Casio calculators and sent to crunch numbers in the bitcoin mines.
I seem to recall about 13 years ago when “the cloud” was going to put everyone in IT Ops out of a job. At least according to people who have no idea what the IT department actually does.
“The cloud” certainly had an impact but the one thing it definitely did NOT do was send every system and network admin to the unemployment office. If anything it increased the demand for those kinds of jobs.
I remain unconcerned about my future career prospects.
Yes… because there will be users who will always refuse to fix their own computer issues. Even if there’s an easy solution at their fingertips. Many don’t even try to reboot. They just tell IT to fix it… then go get coffee for a half hour.
It’ll replace brain dead CEOs before it replaces programmers.
I’m pretty sure I could write a bot right now that just regurgitates pop science bullshit and how it relates to Line Go Up business philosophy.
ChatJippity
I’ll start using that!
if lineGoUp {
    CollectUnearnedBonus()
} else {
    FireSomePeople()
    CollectUnearnedBonus()
}
I think we need to start a company and commence enshittification, pronto.
This company - employee owned, right?
I’m just going to need you to sign this Contributor License Agreement assigning me all your contributions and we’ll see about shares, maybe.
Yay! I finally made it, I’m calling my mom.
I love how even here there’s line metric coding going on
Says the person who is primarily paid in Amazon stock, wants to see that stock price rise for their own benefit, won’t be in that job two years from now to be held accountable, and has never written any kind of code. Yeah… OK. 🤮
As a software developer, I am not scared that AI will take away our jobs. What scares me is that by that point, AI will be good enough to do most jobs out there.
All it really needs to do is replace a large chunk of the service industry to wreak massive havoc in our society.
If enshittification isn’t stopped, the job market could devolve to the point where everyone who isn’t an “elite” will be living in a medieval-like society, and the only way to get food will be by using a barter system to trade with other destitute poor people. The second hyperinflation hits, the rich and the poor will practically be living in different worlds. Learn either a medieval skill or a skill that would be beneficial in such a society. I’m doing machining and blacksmithing. Might start dabbling in chemistry too. If I can’t be successful in modern society, maybe I can be highly skilled and successful in whatever secondhand society emerges.
Tying human existence to labour is a design defect.
Everybody talks about AI killing programming jobs, but any developer who has had to use it knows it can’t do anything complex in programming. What it’s really going to replace is program managers and customer reps; it makes most of HR, finance analysts, legal teams, and middle management obsolete. These people have very structured, rule-based day-to-days. Getting an AI to write a very customized queuing system in Rust to suit your very specific business needs is nearly impossible. Getting AI to summarize Jira boards, analyze candidates’ experience, highlight key points of meetings (and make most of them obsolete altogether), and gather data on outstanding patents is more in its wheelhouse.
I am starting to see a major uptick in recruiters reaching out to me, because companies are starting to realize it was a mistake to stop hiring Software Engineers in the hope that AI would replace them. But now my skills are going to come at a premium, just like everyone else in Software Engineering with skills beyond “put a React app together.”
Copilot can’t even suggest a single Ansible or Terraform task without suggesting invalid/unsupported options. I can’t imagine how bad it is at doing anything actually complex with an actual programming language.
It also doesn’t know what’s going on a couple of lines before it. Say I’m in a language that has options for functional styling using maps, and I want to keep that flow going; it will start throwing for loops at you, so you end up rewriting it all anyway. I find I end up spending more time writing the prompts and validating it did what I wanted (normally not) than just looking at the docs and doing it myself, the bonus being I don’t have to reprompt it again later, because now I know how to do it.
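A toy Python sketch of the style mismatch described above (the price data and names are hypothetical, not from any real codebase):

```python
prices = [9.99, 19.99, 4.50]

# The surrounding code keeps a functional, map-based flow:
with_tax = list(map(lambda p: round(p * 1.2, 2), prices))

# An assistant that ignores the lines just above it tends to suggest
# the equivalent imperative loop instead, breaking that flow:
with_tax_loop = []
for p in prices:
    with_tax_loop.append(round(p * 1.2, 2))

assert with_tax == with_tax_loop  # same result, clashing style
```

Both versions compute the same thing, which is exactly why the suggestion is so annoying: it’s not wrong, it just fights the codebase.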
Trouble is, you’re basing all that on now, not a year from now, or 6 months from now. It’s too easy to look at its weaknesses today and extrapolate. I think people need to get real about coding and AI. Coding is language and rules. Machines can learn that enormously faster and more accurately than humans. The ones who survive will be those who can wield it as a tool for creativity. But if you think it won’t be capable of all the things it’s currently weak at, you’re just kidding yourself, unfortunately. It’ll be like anything else: a tool for an operator. Middlemen will be wiped out of the process, of course, but those with money remain those without time or expertise, and there will always be a place for people willing to step in at that point. But they won’t be coding. They’ll be designing and solving problems.
An inherent flaw in the transformer architecture (what all LLMs use under the hood) is the quadratic memory cost with respect to context length. The model needs four times as much memory to remember its last 1000 output tokens as it needed to remember the last 500. When coding anything complex, the amount of code one has to consider quickly grows beyond these limits. At least, if you want it to work.
This is a fundamental flaw of transformer-based LLMs, an inherent limit on the complexity of task they can ‘understand’. It isn’t feasible to just keep throwing memory at the problem; a fundamental change in the underlying model structure is required. This is a subject of intense research, but nothing has emerged yet.
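The quadratic scaling above can be sketched in a couple of lines (a toy illustration of standard self-attention; real models shard this across heads and layers, but the scaling is the same):

```python
def attention_matrix_entries(context_len: int) -> int:
    """Self-attention scores every token against every other token,
    so the score matrix has context_len * context_len entries."""
    return context_len * context_len

# Doubling the context from 500 to 1000 tokens quadruples the memory
# needed for the attention scores alone.
assert attention_matrix_entries(1000) == 4 * attention_matrix_entries(500)
```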
Transformers themselves were old hat and well studied long before these models broke into the mainstream with DallE and ChatGPT.
The real work of software engineering isn’t the coding. That is like saying that being a doctor is all about reading health charts. Planning, designing, testing, and maintaining software is the hard part, and it is often much more a political challenge than a technical one. I’m not worried about getting replaced by AI. In fact, LLMs’ ability to generate high volumes of code only makes the skills to understand it more in demand.
We are 18 months into AI replacing me in 6 months. I mean… the CEO of OpenAI as well as many researchers have already said LLMs have mostly reached their limit. They are “generalizers” and if you ask them to do anything new they hallucinate quite frequently. Trying to get AI to replace developers when it hasn’t even replaced other menial office jobs is like saying “we taught AI to drive, it will replace all F1 drivers in 6 months”.
McDonald’s tried to get AI to take over order taking. And gave up.
Yeah, it’s not going to be coming for programmer jobs anytime soon. Well, except maybe a certain class of folks that are mostly warming seats that at most get asked to prep a file for compatibility with a new Java version, mostly there to feed management ego about ‘number of developers’ and serve as a bragging point to clients.
It’s based on the last few years of messaging. They’ve consistently said AI will do X, Y, and Z, and it ends up doing each of those so poorly that you need pretty much the same staff to babysit the AI. I think it’s actually a net negative in terms of productivity for technical work, because you end up having to go over the output extremely carefully to make sure it’s correct, whereas you’d have some level of trust with a human employee.
AI certainly has a place in a technical workflow, but it’s nowhere close to replacing human workers, at least not right now. It’ll keep eating at the fringes for the next 5 years minimum, if not indefinitely, and I think the net result will be making human workers more productive, not replacing human workers. And the more productive we are per person, the more valuable that person is, and the more work gets generated.
It’s tons easier to replace CEOs, HR, managers and so on than coders. Coders need to be creative; HR or managers, not so much. Are they leaving three months from now, you think?
I’ll start worrying when they are all gone.
Today’s news: rich assholes in suits are idiots and don’t know how their own companies work. Make sure to share what they’re saying.
While I highly doubt that’ll become true for at least a decade, we can already replace CEOs with AI, you know? (:
https://www.independent.co.uk/tech/ai-ceo-artificial-intelligence-b2302091.html
Most middle managers could be replaced by a simple script already.
while True:
    staffNumbers -= 1
    staffWorkload *= 1.1
    staffWages *= 0.95
    executiveWages *= 1.2
CEOs without a clue how things work think they know how things work.
I swear, if we had no CEOs from today on, the only impact would be that we’d have less gibberish being spoken.
If AI could replace anyone… it’s those dingbats. I mean, what would you say, in this given example, the CEO does… exactly? Make up random bullshit? AI does that. Write a speech? AI does that. I love how these overpaid people think they can replace the talent, but they… they are absolutely required and couldn’t possibly be replaced! Talent and AI can’t buy and enjoy the extra-big yacht, or private jets, or overpriced cars, or a giant oversized mansion… no, you need people for that.
It’s the same claim as when tools like Integromat, WayScript, PureData, vvvv and other VPLs (Visual Programming Languages) started to get some hype. I once worked for a company that strongly believed they’d “retire the need for coding,” and my ex-boss was so confident and happy about that… Although VPLs were a practical thing, time is the ruler of truth, and every dev-related job vacancy I see asks for a programming language, the written ones (JS, PHP, Python, Ruby, Lua, and so on).
Because if you look closely, deep inside, voila, there’s code in anything that is claimed to be no-code! Wow, could anyone imagine that? 🤯 /sarcasm
I made this meme a while back and I think it’s relevant
Looking at your examples, I have to object to putting Scratch in there.
My kids use it in clubs, and it’s great for getting algorithmic basics down before the keyboard proficiency is there for real coding.
that’s just how the code is rendered. There’s still all the usual constructs
It’s still code. What makes scratch special is that it structurally rules out syntax errors while still looking quite like ordinary code. Node editors – I have a love and hate relationship with them. When you’re in e.g. Blender throwing together a shader it’s very very nice to have easy visualisation of literally everything, but then you know you want to compute
abs(a) + sin(b) + c^2
and yep, that’s five nodes right there, because apparently even the possibility of typing in a formula is too confusing for artists. Never mind that Blender allows you to input formulas (without variables, though) into any field that accepts a number.
Nonsense. But then CEOs rarely know what the hell they’re talking about.