There once was a programmer
There once was a programmer
There once was a programmer
ChatGPT just makes me feel like I'm doing code review for junior developers who don't understand the task... wait...
For the love of God, if you're a junior programmer you're overestimating your understanding if you keep relying on chatGPT thinking 'of course I'll spot the errors'. You will until you won't and you end up dropping the company database or deleting everything in root.
All ChatGPT is doing is guessing the next word. And it's trained on a bunch of bullshit coding blogs that litter the internet, half of which are now chatGPT written (without any validation of course).
If you can't take 10 - 30 minutes to search for, read, and comprehend information on stack overflow or docs then programming (or problem solving) just isn't for you. The junior end of this field is really getting clogged with people who want to get rich quick without doing any of the legwork behind learning how to be good at this job, and ChatGPT is really exacerbating the problem.
If you can’t take 10 - 30 minutes to search for, read, and comprehend information on stack overflow or docs
A lot of the time this is just looking for syntax though; you know what you want to do, and it's simple, but it is gated behind busywork. This is to me the most useful part about ChatGPT, it knows all the syntax and will write it out for you and answer clarifying questions so you can remain in a mental state of thinking about the actual problem instead of digging through piles of junk for a bit of information.
Somehow you hit an unpopular opinion landmine with the greybeard devs.
For the greybeard devs: Try asking ChatGPT to write you some Arduino code to do a specific task. Even if you don't know how to write code for an Arduino, ChatGPT will get you 95% of the way there with the proper libraries and syntax.
No way in hell I'm digging through forums and code repos for hours to blink an led and send out a notification through a web hook when a sensor gets triggered if AI can do it for me in 30 seconds. AI obviously can't do everything for you if you've never coded anything before, but it can do a damn good job of translating your knowledge of one programming language into every other programming language available.
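The glue logic meant here is genuinely small; sketched in Python for illustration (the webhook URL is a hypothetical placeholder, and the `post` hook is injectable so the network call can be stubbed, not part of any real Arduino API):

```python
import json
import urllib.request

WEBHOOK_URL = "https://example.com/hook"  # hypothetical endpoint, stand-in only

def rising_edge(prev: bool, curr: bool) -> bool:
    """Notify only when the sensor goes from untriggered to triggered,
    so a held-down sensor doesn't spam the webhook."""
    return curr and not prev

def notify(message: str, url: str = WEBHOOK_URL, post=urllib.request.urlopen):
    """POST a small JSON payload to the webhook. `post` is injectable so
    tests can capture the request instead of hitting the network."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"text": message}).encode(),
        headers={"Content-Type": "application/json"},
    )
    return post(req)
```

The point stands either way: the edge detection and the HTTP call are busywork, and a tool that drafts them in 30 seconds beats an hour of forum digging.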
Just a few days ago I read an article on the newest features of Kotlin 1.9. Zero of it was true.
Internet is littered with stuff like this.
If the model is correct, you are correct. If the model is not correct, you are working on false assumptions.
The more you grow in experience, the more you're going to realize that syntax and organization are the majority of programming work.
When you first start out, it feels like the hardest part is figuring out how to get from a to b on a conceptual level. Eventually that will become far easier.
You break the big problem down into discrete steps, then figure out the best way to do each step. It takes little skill to say "the computer just needs to do this". The trick is knowing how to speak to the computer in a way that can make sense to the computer, to you, and to the others who will eventually have to work with your code.
You're doing the equivalent of a painter saying "I've done the hard part of envisioning it in my head! I'm just going to pay some guy on Fiverr to move the brush for me"
This is difficult to put into words, as it's also not about memorization of every language-specific syntax pattern. But there's a difference between looking up documentation or at previous code for syntax, and trying to have chatGPT turn your pseudocode notes into working code.
Never ask ChatGPT to write code that you plan to actually use, and never take it as a source of truth. I use it to put me on a possible right path when I'm totally lost and lack the vocabulary to accurately describe what I need. Sometimes I'll ask it for an example of how something works so that I can learn it myself. It's an incredibly useful tool, but you're out of your damn mind if you're just regularly copying code it spits out. You need to error check everything it does, and if you don't know the syntax well enough to write it yourself, how the hell do you plan to reliably error check it?
I write a lot of bash and I still have to check syntax every day, but the answer to that is not chatGPT but a proper linter like ShellCheck that you can trust because it's based on a rigid set of rules, not the black box of an LLM.
I can understand the syntax justification for obscure languages that don't have a well-written linter, but if anything that gives me less confidence about ChatGPT, because its training material for an obscure language is likely smaller.
ChatGPT cannot explain, because it doesn't understand. It will simply string together a likely sequence of characters. I've tried to use it multiple times for programming tasks and found each time that it doesn't save much time, compared to an IDE. ChatGPT regularly makes up methods or entire libraries. I do like it for creating longer texts that I then manually polish, but any LLM is awful for factual information.
you can remain in a mental state of thinking about the actual problem
more like you'll end up wasting a significant amount of time debugging not only the problem, but also chatGPT, trying to correct the bullshit it spews out, often ignoring parts of your prompt
All ChatGPT is doing is guessing the next word.
You are saying that as if it's a small feat. Accurately guessing the next word requires understanding of what the words and sentences mean in a specific context.
Don't get me wrong, it's incredible. But it's still a variation of the Chinese room experiment, it's not a real intelligence, but really good at pretending to be one. I might trust it more if there were variants based on strictly controlled datasets.
Yup. Accurately guessing the next thought (or action) is all brains need to do so I don't see what the alleged "magic" is supposed to solve.
It's okay old man. There is a middle there where folks understand the task but aren't familiar with the implementation.
ChatGPT is banned by my employer, because they don't want trade secrets being leaked, which IMO is fair enough. We work on ML stuff anyway.
Anyway, we have a junior engineer that has been caught using ChatGPT several times, whether it's IT flagging its use, seeing a tab open in their browser during a demo, or simply just seeing code they obviously didn't write in code I'm reviewing.
I recently tried to help them out on a project that uses React, and it is clear as day that this engineer cannot write code without ChatGPT. The library use is all over the place, they'll just "invent" certain APIs, or they'll use things that are deprecated or simply don't work if you've even attempted to think about the problem. IMO, reliance on ChatGPT is much worse than how juniors used to be reliant on Stack Overflow to find answers to copy paste.
I’m surprised these people can pass a technical interview. I imagine the employer doesn’t test candidates properly for something like this to happen.
One of the dirty secrets at FAANG companies is that lots of people join from internships, and can get all the way to senior and above without ever needing to go through a standard, full technical loop. If you have a formal apprenticeship scheme, sometimes you'll join through a non-tech loop.
Tbf some technical interviews are bs
The underlying problem is the same, it just became more accessible to copy code you don't understand (you don't even need to come up with a search query that leads you to some kind of answer, ChatGPT will interpret your words and come up with something). Proper use of ChatGPT can boost productivity, but people (both critics of ChatGPT and people who don't actually know how to code) misuse it, looking at it as a "magic solution box" instead of a tool that can assist development and lead you to solutions.
.....who wrote code without stack overflow
I've got no issues with people using stackoverflow or chatGPT as a reference. The problem has always been when anyone just skims what they found and just paste it in without understanding it. Without looking at the rest of the comments, further discussion, or looking at any other search results for further insight and context.
I think chatGPT makes this sort of "carelessness" (as opposed to carefulness) even easier to do, as it appears to be responding with an answer to your exact question and not just something the search algorithm thinks is related.
In days of yore, before Google or even Altavista, you could tell the quality of a team by how many O'Reilly books they had on the shelves.
I should sell mine. Maybe I'll keep the crab book and the white book, but the latter's not even an O'Reilly.
I find it to be surprisingly useless compared to the classic approach. But in my case it might be because of the language I work with (ABAP).
I strongly advise not to do that. As others pointed out, it really is just predicting the next word. It is worth learning how to problem solve and recognizing that the only way to become a better programmer is with practice. It's better to get programming advice from real people online and read the documentation for the functions and languages you are trying to use.
If the internet has succeeded in anything, it's that the illusion of competence is worth more than the thing itself. Until someone calls you out, that is.
Sage wisdom.
Notepad++ is perfectly fine to code in. With the wealth of plugins it has, it's pretty similar to vscode in how you can trick it out with all sorts of things it can't do by default.
ChatGPT is next level Rubber Duck. Tell it to talk to you like Socrates.
I can code a feature faster than I can debug ChatGPT's attempt. So long as it's in JS
ChatGPT is better at bash than me though
The good thing about ChatGPT is that it gives you a starting point for languages you're not familiar / rusty with.
I've always, always been a documentation-only guy. Meaning I almost never use anything other than the documentation for the languages and libraries I use. I genuinely don't feel that I'm missing out on anything, I already write code faster than my peers and I don't feel the need to try to be some sort of 10x developer.
Sometimes there are better methods to implement something, and we can learn from others’ mistakes without having to make them ourselves
I've always, always been an intuition-only guy. Meaning I almost never use anything other than blind guessing on how languages and libraries work. I genuinely don't feel I'm missing out on anything, my farts already smell better than the rest of my peers' and I just don't feel the need to learn the modern tools of my trade.
There was once a programmer that wrote his own code
Of course the first programmer did, but everyone who came after just copied her work and tweaked it a bit to suit their needs.
Basically, yeah. Dennis Ritchie wrote the C compiler because he knew exactly what he wanted to use it for and the kinds of code that he wanted to write. Then he went on to write the book that everyone used to learn the language.
This is true of probably every language, library, framework, etc. The original designer writes it because he knows what he wants to do with it and does so. Then everyone else follows. People then add more features and provide demonstrations of how to use them, and others copy them. It is extremely hard to just look at an API and use that to figure out exactly which calls should be made and in what order. Everyone just reads from the examples and adapts them as needed.
I literally cannot comprehend coding with ChatGPT. How can I expect something to work if I don't understand it, and how can I understand it if I don't code and debug it myself? How can you expect to troubleshoot any issues afterwards if you don't understand the code? I wouldn't trust GPT for anything more complex than Hello World.
Just yesterday, I wrote a first version of a fairly complex method, then pasted it into GPT-4. It explained my code to me clearly, I was able to have a conversation with it about the code, and when I asked it to write a better version, that version ended up having a couple significant logical simplifications. (And a silly defect that I corrected it on.)
The damn thing hallucinates sometimes (especially with more obscure/deep topics) and occasionally makes stupid mistakes, so it keeps you on your toes a bit, but it is nevertheless a very valuable tool.
That only really works if the method is self-contained and written in a language that GPT has seen often (such as Python). I stopped using it, because for every 1 successful try in 10, I waste time on the other 9...
You shouldn't use code that you don't understand. ChatGPT outputs quite readable and understandable code, makes sure to explain a lot of it, and you can ask questions about it.
It can save quite a lot of effort, especially for tasks that are more tedious than hard. Even more if you have a general idea of what you want to do but you're not familiar with the specific tools and libraries that you want to use for the task.
It's also wrong a lot. Hence the requirement for understanding. It can be helpful to get through a stretch but it will fuck up before too long and relying on it entirely is a bad idea.
This.
If I'm writing something even slightly more complex, ChatGPT (4) mostly fails.
If I'm writing complex code, I don't even get the idea of using ChatGPT, because I'm only getting disappointed, and in the end waste more time trying to "engineer" the prompt, only to get disappointed again.
I currently cannot imagine using ChatGPT for coding, I was excited in the beginning, and it's sometimes useful, but mostly not really for coding...
If you're already knee deep in existing code and looking for bugs, or need to write quite specific algorithms, it seems not very useful. But if you for some reason need to write stuff that has the slightest feeling of boilerplate, like "how do I interact with well-established framework or service X while doing A, B, C", it can be really useful.
I haven't been in web development in over 20 years; thanks to ChatGPT, I was able to get up-to-speed and start building websites again, when in the past I would have never been able to do so.
GPT is a powerful tool that can allow anyone to do anything if they're willing to put in the effort. We should be praising it, not making fun of it. It's as revolutionary as the internet itself.
Often the code is self-explanatory. I understand the code very often, but I still couldn't write it correctly from scratch. You never feel like that?
This is how code examples in books work too. You get some code to look at and try to understand it. Otherwise it's like you would ignore code examples while learning programming.
I use it to give me prototypes for ansible because Ansible is junk. Then I build my stuff from the mishmash and have GPT check it. Cuts a lot of time down that I'd rather be doing any-bloody-thing else with.
If you're doing something extremely skillfully, chat gpt will make the dumbest suggestions ever...
Chatgpt is good for learning ideas and new things as an aggregate of what everyone thinks about it. But as a coding tool it cannot reason properly and has rubber stamp solutions for everything.
Well yes, its responses are based on what the average of the internet would say.
I'm surprised it doesn't constantly tell you to format windows and reinstall no matter what you ask
What about a programmer that doesn't use stack overflow.
Today we have chatbots. Yesterday we had search engines and stack overflow. Before that we had books. And before that? Well what do you know... software programming is a relatively novel field. It's almost as if nobody has perfected how it should be learned.
The most valuable knowledge comes from experience. I copied plenty of code around during my learning days as well, and I still do it today. The most important part however is trying to understand the code you're working with. If you can understand it, know when it fails, test it in the right way, etc., then sure, you could probably learn to code from chatbots. They provide the information, and you're at liberty to do what you want with it. If you just copy it and forget, you'll be a bad programmer. But it's not like you couldn't do that before either with the other sources that were available - there were plenty of bad programmers before we had these tools available too.
That said, there is a risk that these chatbots do not provide any useful context around the code that they produce. When you learned from a book or stack overflow, you were reading from a reasonably authoritative source that could explain the code that was produced. But the authority behind the code from chatbots is probably much weaker than what we have from stack overflow, which in turn was probably also weaker than what we have from books. Does it have an effect on learning? I have no clue. But I still think you can learn from chatbots if you use the output that they provide in the right way. (Disclaimer: I have never used one of them and have no experience with them.)
As someone who is learning, I think it's imperative to understand that chatgpt has limitations that cannot be overlooked. It's pretty good if I make some silly syntax or formatting errors, but at the core I have to understand what I'm working with if I want to be a better programmer. I love the conversational nature because I often have a hard time wording questions, so it helps me in that regard as well. Idk if you want to be truly good at something you have to be more reliant on yourself than external tools.
The thing is, in some fields like devops, there are so many tools that you can't remember or know all of them very well. So asking ChatGPT how to do something saves a lot of time. It can write Ansible playbooks, Dockerfiles, web server configurations etc. They almost never work perfectly, but they give a very good starting point to modify.
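The kind of starting point meant here might look like this minimal playbook sketch (the `webservers` host group is a placeholder, and like any generated config it would still need checking against the Ansible docs before use):

```yaml
# Install nginx and make sure it's running on a group of web hosts
- hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Even when the generated version gets a module name or parameter slightly wrong, fixing a skeleton like this is faster than writing it cold.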
It used to be that you could be very good at specific languages or tools but today, there isn't enough time. Everyone is always in a hurry to get something out as quickly as possible too.
ChatGPT was never made for programming and is horrible at generating code. It is nice for a peer-programming kinda setup tho, because it can quickly point you towards tools, libraries, APIs etc. to use
It generated a GUI OCR tool in Qt5, customized to my needs. I don't know a single bit of Qt5 and went from zero to working tool in half an hour.
The tool takes a screenshot, lets me select an area on the screen, OCRs it and displays the text in a window.
If ChatGPT isn't made for programming then I'm looking forward for a product that is.
I can't use ChatGPT with Godot :(
Why not? I haven't had the best time with it, but it certainly can write gdscript
Godot 4.0, which was released in 2023, changed a lot of gdscript
Chatgpt only knows gdscript from godot 3.x
Your phrasing at the top there started a sea shanty, but with your lyrics, in my head.