- cross-posted to:
- technology@lemmy.world
To be fair, most never could. I’ve been hiring junior devs for decades now, and all the ones straight out of university barely had any coding skills.
It’s why I stopped looking at where they studied; I always check their hobbies first. If one of their hobbies is something nerdy and useless, like tinkering with a Raspberry Pi, that indicates to me it’s someone who loves coding and is probably already reasonably good at it.
I am not a professional coder, just a hobbyist, but I am increasingly digging into cybersecurity concepts.
And even as an “amateur cybersecurity” person, everything about what you describe, and LLM coders, terrifies me, because that shit is never going to have any proper security methodology implemented.
On the bright side, you might be able to cash in on some bug bounties.
I’m in uni learning to code right now, but since I’m a boomer I only spin up oligarch bots every once in a while to check an issue I would otherwise have to ask the teacher about. It’s far more important for me to understand fundies than it is to get a working program. But that’s only because I’ve gotten good at many other skills and realize that fundies are fundamental for a reason.
This isn’t a new thing. Dilution of “programmer” and “computer” education has been going on for a long time. Everyone with an IT certificate is an engineer these days.
For millennials, a “dev” was pretty much anyone with reasonable intelligence who wanted to write code - it is actually very easy to learn the basics and fake your way into it with no formal education. Now we are even moving on from that to where a “dev” is anyone who can use an AI. “Prompt Engineering.”
“Prompt Engineer” makes a little vomit appear in the back of my mouth.
I could have been a junior dev that could code. I learned to do it before ChatGPT. I just never got the job.
No wonder open source software ends up more efficient than proprietary software.
This post is literally an ad for AI tools.
No, thanks. Call me when they actually get good. As it stands, they only offer marginally better autocomplete.
I should probably start collecting dumb AI suggestions and gaslighting answers to show the next time I encounter this topic…
It’s actually complaining about AI, tho.
There are at least four links leading to AI tools in this page. Why would you link something when you complain about it?
To play devil’s advocate: this can be done to exemplify what you’re complaining about, as opposed to complaining about an abstract concept.
Oh lol I thought it was a text post, I didn’t even click the link and just read the post description.
The “about” page indicates that the author is a freelance frontend UI/UX dev who recently switched to “helping developers get better with AI” (paraphrased). Nothing about credentials/education related to AI development, only some hobby projects using preexisting AI solutions from what I saw. The post itself doesn’t have any sources/links to research about junior devs either; it’s all anecdotes and personal opinion. Sure looks like an AI grifter trying to grab attention by ranting about AI, with some pretty lukewarm criticism.
That’s the point of being a junior. Then problems show up and force them to learn to solve them.
All I hear is “I’m bad at mentoring”
There is only so much mentoring can do though. You can have the best math prof. You still need to put in the exercise to solve your differential equations to get good at it.
You get out of education what you put into it. You won’t be an artist from the best art school if you do the bare minimum to pass. You can end up as a legend of the industry coming from a noname school.
And some sort of “no one wants to work anymore”.
I know brilliant young people; maybe they just have to be paid properly?
The problem is not only the coding but the thinking. The AI revolution will give birth to a lot more people without critical thinking and problem solving capabilities.
Apart from that, learning programming went from something one does out of a calling to something one does to get a job. The percentage of programmers who actually like coding is going down, so on average they’re going to be worse.
This is true for all of IT. I love IT - I’ve been into computers for 30+ years. I run a small homelab, and it’ll always be a hobby and a career. But yeah, for more and more people it’s just a job.
That’s the point.
Along with censorship.
I could barely code when I landed my job and now I’m a senior dev. It’s like saying a plumber’s apprentice can’t plumb - you learn on the job.
You’re not learning anything if Copilot is doing it for you. That’s the point.
That’s true, it can only get you so far. I’m sure we all started by Frankenstein-ing Stack Overflow answers together until we had to actually learn the “why”.
100% agree.
I’m not saying there’s no place for AI as an aid to help you find the solution, but I don’t think it’s going to help you learn if you just ask it for the answers.
For example, yesterday I was trying to find out why a policy map on a Cisco switch wasn’t re-activating after my RADIUS server came back up. Instead of throwing my map at the AI and asking what’s wrong, I asked it for details about how a policy map is activated, what mechanism the switch uses to determine the status of the RADIUS server, and how a policy map can leverage that to kick into gear again.
Ultimately, the AI didn’t have the answer, but it put me on the right track, and I believe I solved the issue. It seems the switch didn’t count me adding the RADIUS server to the running config as a server coming back alive, but if I put in a fake server and then altered its IP to the real server’s address, the switch saw that as the server coming back alive and authentication started again.
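To illustrate what I mean (a rough sketch with hypothetical names and documentation addresses, not my actual config, and the exact commands vary by IOS version):

```
! Rough sketch only - hypothetical names/addresses, commands vary by IOS version.
! Dead/alive detection criteria for RADIUS servers:
radius-server dead-criteria time 5 tries 3
radius-server deadtime 10

! Defining the server entry with a placeholder address first...
radius server LAB-RADIUS
 address ipv4 192.0.2.99 auth-port 1812 acct-port 1813
 key LAB-KEY

! ...then re-pointing the same entry at the real server was what the
! switch finally registered as the server "coming back alive":
radius server LAB-RADIUS
 address ipv4 203.0.113.10 auth-port 1812 acct-port 1813
```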
In fact, some of the info it gave me along the way was wrong, like when it tried to give me CLI commands that I already knew wouldn’t work because I was using the newer C3PL AAA commands, but it was mixing them up with the legacy commands and combining them together. Even after I told it that was a made-up command and why it wouldn’t work, it still tried to give me the same command again later.
So, I don’t think it’s a good tool for producing actual work, but it can be a good tool to help us learn things if it’s used that way: to ask “why” and “how” instead of “what.”
Of course they don’t. Hiring junior devs for their hard skills is a dumb proposition. Hire for their soft skills, intellectual curiosity, and willingness to work hard and learn. There is no substitute for good training and experience.
As someone who has interviewed candidates for developer jobs for over a decade: this sounds like “in my day everything was better”.
Yes, there are plenty of candidates who can’t explain the piece of code they copied from Copilot. But guess what? A few years ago there were plenty of candidates who couldn’t explain the code they copied from StackOverflow. And before that, there were those who failed at the basic programming test we gave them.
We don’t hire those people. We hire the ones who use the tools at their disposal and also show they understand what they’re doing. The tools change, the requirements do not.
But how do you find those people based solely on a short interview, where they can use AI tools to perform better if the interview isn’t held in person?
And mind you, SO was better because you needed to read a lot of answers there and try to understand what would work in your particular case. Learn how to ask smartly. Do your homework and explain the question properly so as not to get gaslit, etc. This is all gone now.
I think LLMs just made it easier for people who want answers without the learning. Reading all those posts all over the internet required you to understand what you pasted together if you wanted it to work (not always, but the bar was higher). With ChatGPT, you can just throw errors at it until you have the code you want.
While the requirements never changed, the tools sure did and they made it a lot easier to not understand.
Have you actually found that to be the case in anything complex, though? I find it just forgets parts of what it’s generating, stuck in an infuriating loop of fucking up.
It took us around 2 hours to run our coding questions through ChatGPT and see what it gives. It gave complete shit for most of them; we only had to replace one or two questions.
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
And then you get people like OP, blaming the generation, when if anything it’s them and their company to blame… for falling behind. Got to keep up, folks. Our field moves fast.
I find ChatGPT to sometimes be excellent at giving me a direction, if not outright solving the problem, when I paste errors I’m too lazy to search for. I say sometimes because other times it is just dead wrong.
All the code I ask ChatGPT to write is usually along the lines of “I have these values that I need to verify, write code that verifies that nothing is empty and saves an error message for each that is”, and then I work with the code it gives me from there. I never take it at face value.
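For instance, something like this minimal sketch is the level I mean (field names and messages invented for the example):

```python
# Minimal sketch of the kind of validation code described above.
# Field names and messages are invented for the example.
def collect_empty_errors(values: dict) -> list:
    """Return an error message for each value that is empty or whitespace."""
    errors = []
    for name, value in values.items():
        if value is None or not str(value).strip():
            errors.append(f"'{name}' must not be empty")
    return errors

# Usage: 'email' and 'password' should each produce an error.
form = {"username": "alice", "email": "", "password": "   "}
for message in collect_empty_errors(form):
    print(message)
```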
Have you actually found that to be the case in anything complex though?
I think that using LLMs to create complex code is the wrong use of the tool. They are better at providing structure to work from rather than writing the code itself (unless it is something simple as above) in my opinion.
If a company cannot invest even a day to go through their hiring process and AI proof it, then they have a shitty hiring process. And with a shitty hiring process, you get shitty devs.
I agree with you on that.
My rule of thumb: use ChatGPT only for questions whose answer I already know.
Otherwise it hallucinates and tries hard to convince me of a wrong answer.
Has anyone else clicked the chat.com URL in the article …
To me, this feels like a problem perpetuated by management. I see it on the system administration side as well – they don’t care if people understand why a tool works; they just want someone who can run it. If there’s no free thought, the people are interchangeable and easily replaced.
I often see it farmed out to vendors when actual thought is required, and it’s maddening.
I always found this upsetting as an IT tech at a former company - when a network or server had an issue and I was sent to resolve it, it was a “just reboot it” fix, which never kept the problem from recurring and bringing the server down at 07:00 the next Monday.
The limitations on the questions I could ask hurt that SLA more than any network switch’s memory leak ever did, and I felt as if my expertise meant nothing as a result.