this post was submitted on 11 Feb 2025
479 points (96.3% liked)

[–] DrFistington@lemmy.world 25 points 22 hours ago (4 children)

What most people forget is that as a programmer/designer/etc., your job is to listen to what your client/customer tells you they want, then give them what they ACTUALLY NEED, and that distinction deserves highlighting. Most people making requests to programmers don't really know what they want, or why they want it. They had some meeting where people decided, 'Yes, we need the program to do X!' without realizing that what they're asking for won't actually get them the result they want.

AI will be great at giving people exactly what they ask for... but that doesn't mean it's what they actually needed...

[–] heavydust@sh.itjust.works 1 points 4 hours ago* (last edited 4 hours ago)

Yesterday the test team asked me for 3 new features to help them. I thought about it for a few minutes and realized these features are mutually incompatible: you can have one, and only one. Good luck finding an AI that understands that.

[–] RedSeries@lemmy.blahaj.zone 8 points 20 hours ago (1 children)

Great points. Also:

... AI will be great at giving people exactly what they ask for ...

Honestly, I'm not even sure about this. With hallucinations, and increasingly complex prompts that it fails to handle, it's just as likely to regurgitate crap. I don't even know if AI will reach a better state before all of this dev-firing starts to backfire and sours most companies' appetite for touching AI for development at all.

Humans talk with humans and do their best to come up with solutions. AI takes prompts and looks at historical human datasets to try and determine what a human would do. It's bound to run into something novel eventually, especially if there aren't more datasets to pull in because human-generated development solutions become scarce.

[–] homesweethomeMrL@lemmy.world 5 points 18 hours ago

AI will never not-require a human to hand-hold it, because AI can never know what's true.

It doesn't "know" anything. It only has ratios of usage maps between connected entities we call "words".

Sure, you can run it and hope for the best. But that will fail sooner or later.
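To make the "ratios of usage maps" point concrete, here's a deliberately tiny sketch (a toy bigram counter I'm inventing for illustration, nowhere near a real transformer): the model stores only how often words follow other words, so it can rank continuations by frequency but has no mechanism at all for checking whether any of them is true.

```python
from collections import Counter, defaultdict

# Toy corpus; the "model" is just counts of which word follows which.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    """Ratios of observed continuations after `word` - usage, not knowledge."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# After "the", it splits between "cat", "mat", and "fish" purely by
# frequency; nothing anywhere evaluates whether a continuation is correct.
print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Real LLMs replace the raw counts with learned weights over far longer contexts, but the output is still a probability over next tokens, which is why a human has to stay in the loop.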

[–] andrew_bidlaw@sh.itjust.works 3 points 19 hours ago

Also, an LLM doesn't usually have memory or experience. It's the first page of a Google search every time you put in your tokens: a forever trainee who never leaves that stage of their career.

Human abilities like pattern recognition, intuition, and the accumulation of proven knowledge combine to make us more and more effective at finding the right solution to anything.

The LLM bubble can't replace that, and it actively hurts it, as people get distanced from actual knowledge by the closed door of the LLM. They learn how to formulate their requests instead of learning how to do the stuff they actually need. This outsourcing makes sense when you need a cookie recipe once a year; it doesn't when you work in a bakery. What makes the dough behave each way? You never need to ask, so you'll never know.

And the difference between asking somewhere like Lemmy and asking a chatbot is the utterly convincing manner in which the chatbot tells you things, while forums, Q&A boards, and blogs run by people usually have some of those humane qualities behind the replies, plus the option for someone else to throw a bag of dicks at the suggestion of formatting your system partition or turning it off and on again.

[–] HubertManne@moist.catsweat.com 2 points 21 hours ago (1 children)

that stuff should really get worked out in the agile process, as the customer reacts to each phase of the project.

[–] BombOmOm@lemmy.world 2 points 12 hours ago* (last edited 12 hours ago) (1 children)

Getting the real requirements nailed down from the start is critical, not just doing the work the customer asked for. Otherwise you get six months into a project and realize you have to scrap 90% of the completed work because the requirements were bad from the get-go. The customer never fundamentally understood the problem, and you never bothered to ask. Everyone is mad, and you've lost a repeat customer.

[–] HubertManne@moist.catsweat.com 2 points 11 hours ago

yeah, but with agile they should be checking the product out when it's a barely working PoC, to determine if the basic idea is what they expect, and as it advances they should be seeing each stage. You'll never get the proper requirements by second-guessing what they say.