this post was submitted on 05 Feb 2025
521 points (96.8% liked)

Greentext

5094 readers

This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or, god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 1 year ago
 
top 50 comments
[–] xelar@lemmy.ml 20 points 6 days ago* (last edited 6 days ago)

Brainless GPT coding is becoming a new norm at uni.

Even if I get the code via ChatGPT, I try to understand what it does. How are you gonna maintain those hundreds of lines if you don't know how they work?

Not to mention, you won't be able to cheat your way through a recruitment interview.

[–] Melatonin@lemmy.dbzer0.com 9 points 6 days ago

They're clever. Cheaters, uh, find a way.

[–] IndustryStandard@lemmy.world 9 points 6 days ago

Anon volunteers for Neuralink

[–] licheas@sh.itjust.works 4 points 6 days ago (2 children)

Why do they even care? It's not like your future bosses are going to give a flying fuck how you get your code. At least, they won't until you cause the machine uprising or something.

[–] fleg@szmer.info 11 points 6 days ago (1 children)

They are going to care whether you can maintain your code. Programming isn't "write it, throw it over the fence, and forget about it"; you usually have to work with what you, or your coworkers, have already done. Reading other people's code is, like, 95% of a programmer's job. Sometimes the output of a week of intensive work is a change to one line of code, the result of a deep understanding of a project that can span many files, sometimes many small applications connected to each other.

ChatGPT et al aren't good at that at all. Maybe they will be in the future, but at the moment they are not.

[–] licheas@sh.itjust.works 2 points 6 days ago (1 children)

Most people who are in dev aren’t maintaining shit.

Most coders don't write new scripts; they're cutting and pasting from whatever libraries they have. Using ChatGPT is just the newest way to do that.

Your boss doesn't care about how the job gets done; they care that it gets done. Which is why we have giant code libraries just chock full of snippets to jigsaw into whatever we need.

[–] fleg@szmer.info 1 points 6 days ago

Most people who are in dev aren’t maintaining shit.

I disagree, but maybe what I do in "dev" is a bubble where things are different.

[–] WoodScientist@sh.itjust.works 6 points 6 days ago

They absolutely will. Companies hire programmers because they specifically need people who can code. Why would I hire someone to throw prompts into ChatGPT? I can do that myself. In the time it takes me to write instructions to an employee about the code I want them to create with ChatGPT, I could just throw the prompt into ChatGPT myself.

[–] UnfairUtan@lemmy.world 213 points 1 week ago* (last edited 1 week ago) (23 children)

https://nmn.gl/blog/ai-illiterate-programmers

Relevant quote

Every time we let AI solve a problem we could’ve solved ourselves, we’re trading long-term understanding for short-term productivity. We’re optimizing for today’s commit at the cost of tomorrow’s ability.

[–] kabi@lemm.ee 113 points 1 week ago (6 children)

If it's the first course where they use Java, then one could easily learn it in 21 hours, with time for a full night's sleep. Unless there's no code completion and you have to write imports by hand. Then, you're fucked.
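
For anyone who hasn't done it by hand: even a trivial first-course Java exercise needs its imports, the class declaration, and the main signature typed out exactly, which is what code completion normally handles for you. A minimal sketch of the kind of thing meant here (the class name and task are made up for illustration):

```java
// Typical first-semester Java boilerplate: without code completion, every
// import and signature below has to be recalled and typed by hand.
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class GradeAverager {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        List<Double> grades = new ArrayList<>();

        // Read grades until a negative number (or non-number) is entered.
        System.out.println("Enter grades (negative number to stop):");
        while (in.hasNextDouble()) {
            double g = in.nextDouble();
            if (g < 0) {
                break;
            }
            grades.add(g);
        }

        // Average the grades, guarding against an empty list.
        double sum = 0;
        for (double g : grades) {
            sum += g;
        }
        if (grades.isEmpty()) {
            System.out.println("No grades entered.");
        } else {
            System.out.printf("Average: %.2f%n", sum / grades.size());
        }
    }
}
```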

[–] rockerface@lemm.ee 132 points 1 week ago (4 children)

If there's no code completion, I can tell you that even people who've been coding as a job for years aren't going to write it correctly from memory. We're not being paid to memorize this shit; we're being paid to solve problems optimally.

[–] SkunkWorkz@lemmy.world 109 points 1 week ago (32 children)

Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.

[–] AeonFelis@lemmy.world 8 points 6 days ago
  1. Ask ChatGPT for a solution.
  2. Try to run the solution. It doesn't work.
  3. Post the solution online as something you wrote all on your own, and ask people what's wrong with it.
  4. Copy-paste the fixed-by-actual-human solution from the replies.
[–] WoodScientist@sh.itjust.works 2 points 6 days ago

Two words: partial credit.

[–] Maggoty@lemmy.world 1 points 6 days ago

Usually this joke is run with a second point of view asking: do I tell them, or do I let them keep thinking this is cheating?

[–] aliser@lemmy.world 104 points 1 week ago (1 children)
[–] Agent641@lemmy.world 66 points 1 week ago (1 children)

Probably promoted to middle management instead

[–] TootSweet@lemmy.world 91 points 1 week ago (1 children)

generate code, memorize how it works, explain it to profs like I know my shit.

ChatGPT was just his magic feather all along.

[–] nednobbins@lemm.ee 83 points 1 week ago (4 children)

The bullshit is that anon wouldn't be fsked at all.

If anon actually used ChatGPT to generate some code, memorized it, understood it well enough to explain it to a professor, and got a 90%, then congratulations, that's called "studying".

[–] naught101@lemmy.world 6 points 6 days ago (3 children)

I don't think that's true. That's like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.

[–] RobertoOberto@sh.itjust.works 9 points 6 days ago (2 children)

I don't think that's quite accurate.

The "understand it well enough to explain it to a professor" clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you're actually learning something.

Unless, of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.

[–] naught101@lemmy.world 4 points 6 days ago* (last edited 6 days ago)

I didn't say you'd learn nothing, but the second task was not just to explain (when you'd have the code in front of you to look at), but to actually write new code, for a new problem, from scratch.

[–] Nalivai@lemmy.world 2 points 5 days ago* (last edited 4 days ago)

You don't need physical skills to program; there is nothing that has to be honed into muscle memory by repetition. If you know how to type and what to type, you're ready to type. Even if you know which strings to pluck, you still need to train your fingers to do it; it's a different skill.

[–] nednobbins@lemm.ee 3 points 6 days ago (1 children)

It's more like playing a song on Guitar Hero enough to be able to pick up a guitar and convince a guitarist that you know the song.

Code from ChatGPT (and other LLMs) doesn't usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.

It's easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.
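
As a concrete illustration of what "dive in and learn the code" means in practice, here is a hand-written sketch of one such simple challenge (FizzBuzz). The comments point out the sort of small logic detail a generated draft can get wrong and that you need to understand in order to fix; this is an illustrative example, not actual ChatGPT output.

```java
// FizzBuzz: print 1..100, replacing multiples of 3 with "Fizz", multiples of 5
// with "Buzz", and multiples of both with "FizzBuzz".
public class FizzBuzz {
    public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
            // The branch order matters: the divisible-by-15 case must be
            // checked first, otherwise "FizzBuzz" never prints. Spotting and
            // fixing details like this in a draft is where the understanding
            // comes in.
            if (i % 15 == 0) {
                System.out.println("FizzBuzz");
            } else if (i % 3 == 0) {
                System.out.println("Fizz");
            } else if (i % 5 == 0) {
                System.out.println("Buzz");
            } else {
                System.out.println(i);
            }
        }
    }
}
```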

[–] WarlordSdocy@lemmy.world 3 points 6 days ago

I mean, I feel like, depending on what kind of problems they started off with, ChatGPT probably could just solve simple first-year programming problems. But yeah, as you get to higher-level classes it definitely won't fully solve the stuff for you, and you'd have to actually go in and fix it.

[–] Maggoty@lemmy.world 2 points 6 days ago

No, he's right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search up solutions on the Internet; the crucial thing is to learn why the solution works, though. Memorizing code like a language is impossible. You'll obviously memorize some common stuff, but things change really fast in the programming world.

[–] boletus@sh.itjust.works 76 points 1 week ago (26 children)

Why would you sign up for college to willfully learn nothing?

[–] GraniteM@lemmy.world 1 points 5 days ago

If you go through years of education, learn nothing, and all you get is a piece of paper, then you've just wasted thousands of hours and tens of thousands of dollars on a worthless document. You can go down to FedEx and print yourself a diploma on nice paper for a couple of bucks.

If you don't actually learn anything at college, you're quite literally robbing yourself.

[–] SoftestSapphic@lemmy.world 54 points 1 week ago (9 children)

This person is LARPing as a CS major on 4chan

It's not possible to write functional code without understanding it, even with ChatGPT's help.

[–] Simulation6@sopuli.xyz 51 points 1 week ago (15 children)

I don't think you can memorize how code works well enough to explain it and not learn coding.
