Do ChatGPT or other language models help you code faster and more efficiently? Is it worth spending your money on them?

  • Skull giver@popplesburger.hilciferous.nl
    9 months ago

I like Copilot, but I’ve found that uploading code to Microsoft is not exactly popular among employers. Local models aren’t very good in comparison; their output feels more like generic ChatGPT responses.

    For code and config samples, talking to Bing and Gemini is like talking to a junior dev that’s somewhere on the spectrum. I ask for a simple example or a quick function, and I get a load of documentation that I already know, followed by a sample that doesn’t really work. When I indicate the problem and ask for a fix, it confabulates a fix using some non-existent methods or config that it just assumes are present, and then infodumps some more related documentation.

By the fourth time I’ve asked it to fix the obvious flaws in its approach or code, it has forgotten the original question and answers, and it produces either snippets that have very little to do with the original task or the same solution I flagged as broken the first time.

    Talking to AI is like talking to a child with mild special needs. You need to be precise, you need to anticipate its reactions and head off the potential distractions that could derail it halfway through its thought process, and it’s best to ask a completely different question if it shows the slightest sign of misinterpretation. Sometimes you need to manipulate it by promising things you’ll never follow through on, like a $100 tip, because AI is not a real person and it’s okay to lie to and deceive it even if it acts like a weird child.

    When it comes to writing, it’s like a kid that thinks long words make you sound smart. I went through that phase at some point and I bet I was insufferable. It takes basic sentences and turns them into empty paragraphs. Tell it to be concise, and it repeats what you said with fancy-sounding synonyms. Tell it to use simpler words, and it starts changing the meaning so that it can find simple words. And God forbid you ask a question in a language other than English.

    Oh, and it’ll still lie about basic things. After the latest updates, neither Google nor Microsoft will claim that the word ‘hound’ contains the letter F. However, Gemini claims there are no vowels in “hound” because “ou” is a diphthong (which is a type of vowel) and therefore the word supposedly contains no vowels. It still can’t do basic counting: asked to “add the number of vowels to the number of consonants in the word”, it claims “nd” is a single consonant to make the letter count work out to five. I, too, had a phase like that in primary school, where I figured out the answer to something and worked backwards from there instead of reasoning forward, and I was hilariously wrong about all kinds of things.
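    For reference, the counting task the model flubs is trivial to verify in a few lines of Python (a minimal sketch; the word and vowel set are the ones from the example above):

    ```python
    # Count vowels and consonants in "hound" to check the model's claims.
    word = "hound"
    vowels = [c for c in word if c in "aeiou"]
    consonants = [c for c in word if c not in "aeiou"]

    print(vowels)      # ['o', 'u'] -- "ou" being a diphthong doesn't erase the vowels
    print(consonants)  # ['h', 'n', 'd'] -- "nd" is two consonants, not one
    print(len(vowels) + len(consonants))  # 5, matching the word's length
    ```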

    Everything, and I mean everything, needs to be hand-checked as if you asked an intelligent primary school kid to write your blog. Assume every word is a lie and validate what it’s saying, and you can save yourself the effort of writing stuff, but in my experience you end up spending more time editing down generated fluff than you would spend writing. Perhaps others aren’t as good at getting started on a topic and use AI to generate a jumping-off point; I can see the value in that, but you have to be very careful not to trust the output or you’ll get sent on wild goose chases trying to expand on the points it generates.

    These models are great for shitting out boilerplate, but if you need anything remotely custom, I find that getting the AI to do its job is more time-consuming than figuring out the solution manually. Because of that, I suppose it may seem like a great tool to people without much experience, but I’m not so sure the chat bot AIs are very useful beyond their limited basic understanding.

    However, AI is good at two things: actually finding web results based on a non-exact query (Google/DDG/Bing are ruined by SEO spammers, but AI seems to cut through that) and generating cool icons and header images for my projects’ README.md because I can’t be bothered to copy/paste clipart from the internet. AI is still terrible at actually quoting things from the internet (it’ll happily invert the meaning of sources or confabulate a correlation between two articles) but if you ignore the text and just click the links, it can be quite effective.

    Oh, and one thing AI does seem to excel at is taking your text as an English-as-a-second-language speaker, or perhaps someone from a certain disadvantaged neighbourhood, and making you sound like a white man in his 50s. This can be a real benefit to immigrants and people with frowned-upon dialects trying to do things like apply for a job or express themselves to higher-ups in a way those higher-ups respect. Not because AI is all that useful, but because people suck and attach arbitrary opinions and values to certain expressions and uses of language that aren’t racist per se, but do end up disadvantaging people who are not like them.