ChatGPT is hilariously incompetent... but on a serious note, I still firmly reject tools like Copilot outside demos and the like, because they drastically reduce code quality for short-term acceleration. That's a terrible trade-off in terms of cost.
Programmer Humor
Post funny things about programming here! (Or just rant about your favourite programming language.)
I enjoy using copilot, but it is not made to think for you. It's a better autocomplete, but don't ever let it do more than a line at once.
Yup, AI is a tool, not a complete solution.
As a software engineer, the number of people I encounter in a given week who either refuse to or are incapable of understanding that distinction baffles and concerns me.
The problem I have with it is that all the time it saves me goes into reading the code it generates. I probably spend more time overall, because once in a while the code it produces is broken in some subtle way.
I see some people swearing by it, which is the opposite of my experience. I suspect that if your coding workflow was copying code from Stack Overflow, then it did improve your experience, since that process is now streamlined.
Biggest problem with it is that it lies with the exact same confidence it tells the truth. Or, put another way, it's confidently incorrect as often as it is confidently correct - and there's no way to tell the difference unless you already know the answer.
it's kinda hilarious to me because one of the FIRST things AI researchers did was get models to identify things and output answers together with a confidence score for each potential ID, and now we've somehow regressed from that point
did we really regress back from that?
I mean, giving a confidence score for recognizing a certain object in a picture is relatively straightforward.
But LLMs put words together by how likely they are to follow one another given your input (terribly oversimplified). The confidence behind that has no direct relation to how likely the statements are to be true. I remember an example where someone got ChatGPT to say that 2+2 equals 5 because his wife said so. ChatGPT was confident that something is right when the wife says it, simply because it considers those words likely to belong together.
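A toy illustration of that point, a deliberately tiny next-word frequency model (not a real LLM; the corpus here is made up): the model ends up fully "confident" that "equals" is followed by "five", purely because that's what the training text says, not because it's true.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus containing a confidently wrong claim.
corpus = ("my wife says two plus two equals five . "
          "my wife says two plus two equals five . trust my wife")
tokens = corpus.split()

# Count which word follows which (a crude stand-in for next-token modeling).
follows = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    follows[prev][nxt] += 1

def next_token_probs(word):
    """Probability distribution over the next word, given the previous one."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# High "confidence" in fluency, zero relation to arithmetic truth.
print(next_token_probs("equals"))  # {'five': 1.0}
```

The probability 1.0 here measures only how consistently those words co-occurred in the corpus; nothing in the model checks whether two plus two is actually five.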
they drastically reduce code quality for short term acceleration.
Oh boy, do I have news for you: that's basically the only thing middle managers care about, short-term acceleration.
But LinkedIn bros and corporate people are gonna gobble it up anyways because it has the right buzzwords (including “AI”) and they can squeeze more (low quality) work from devs to brag about how many things they (the corporate owners) are doing.
It's just a fad. Only a small bit of it will stay after the hype is gone. You know, like blockchain, AR, the metaverse, NFTs, and whatever came before that. In a few years there will be another breakthrough and we'll hear about it again for a short while, but for now it's a one-trick pony.
Yeah, they think it can turn a beginner dev into an advanced dev, but really it's more like having a team of beginner devs.
I'm still convinced that GitHub copilot is actively violating copyleft licenses. If not in word, then in the spirit.
they drastically reduce ... quality for short term acceleration
Western society is built on this principle
I predict that, within the year, AI will be doing 100% of the development work that isn't total and utter bullshit pain-in-the-ass complexity, layered on obfuscations, composed of needlessly complex bullshit.
That's right, within a year, AI will be doing .001% of programming tasks.
Can we just get it to attend meetings for us?
Legitimately could be a use case
"Attend this meeting for me. If anyone asks, claim that your camera and microphone aren't working. After the meeting, condense the important information into one paragraph and email it to me."
Here is a summary of the most important information from that meeting. Since there were two major topics, I've separated them into two paragraphs.
- It is a good morning today.
- Everyone is thanked for their time. Richard is looking forward to next week's meeting.
The rest of the information was deemed irrelevant to you and your position.
Big companies will take 5 years just to get there.
"look i registered my own domain name all by myself!"
the domain: "localhost"
I'm an elite hacker and I grabbed your IP address from this post. It's 192.168.0.1 just so you know I'm not bluffing.
Heheh I'm ddossing them right now. Unfortunately the computer I'm doing it on is having a few connection issues
Haha punk it's actually 192.168.1.1. you dun goofed
Engineering is about trust. In all other and generally more formalized engineering disciplines, the actual job of an engineer is to provide confidence that something works. Software engineering may employ fewer people because the tools are better and make people much more productive, but until everyone else trusts the computer more, the job will exist.
If the world trusts AI over engineers then the fact that you don't have a job will be moot.
I just used Copilot for the first time. It generated a ton of call-to-action text and website page copy for various service pages I was creating for a home builder. It was surprisingly useful; of course I modified the output a bit, but overall it saved me a ton of time.
Copilot has cut my workload by about 40%, freeing me up for personal projects
Copilot is only dangerous in the hands of people who couldn't program otherwise. I love it, it's helped a ton on tedious tasks and really is like a pair programmer
Yeah it's perfect for if you can distinguish between good and bad generations. Sometimes it tries to run on an empty text file in vscode and it just outputs an ingredients list lol
Copilot has cut my personal projects by about 40%, freeing me up for work
I think the correct response is "Wow. Has your mom seen it? Send her the link."
AI is only as good as the person using it, like literally any other tool in human existence.
It's meant to amplify the workload of the professional, not replace them with a layman armed with an LLM.
This AI thing will certainly replace my MD to HTML converter and definitely not misplace my CSS and JS headers
You don't need to convince the devs, you need to convince the managers.
Tbf I don't really wanna do ops work. I barely even wanna do DevOps. Let me just dev
Wow, there is a lot of pearl-clutching and gatekeeping ITT. It's delicious!
On a more serious note, ChatGPT, ironically, does suck at webdev frontend. The one task that pretty much everyone agrees could be done by a monkey (given enough time) is the one it doesn't understand at all.
Don't forget that GPT4 was getting dumber the more it learned from people.
These morons are probably going to train AI wrong so job security for the next 100 years.
The only thing ChatGPT and the like are useful for, in any language, is getting ideas on how to solve a problem in an area you don't know anything about.
ChatGPT, how can I do xy in C++?
You can use the library ab, like ...
Then I search for the library and check the docs to see whether it's actually possible to do it that way. And often, it's not.