[–] SlopppyEngineer@lemmy.world 13 points 3 months ago (3 children)

I've seen a junior use ChatGPT to do the job without really understanding what was going on, and in the end it was a big mess that didn't work. After I told him to read a "for dummies" book and he started to think for himself, he got something decent out of it. It's no replacement for skill and thinking.

[–] melroy@kbin.melroy.org 4 points 3 months ago

Exactly what I expected. It will only get worse, since those juniors don't know what good or bad code looks like, for example. So they just assume whatever ChatGPT says is correct. They have no benchmark to compare against.

[–] Alphane_Moon@lemmy.world 3 points 3 months ago

Had a very similar experience in pretty niche use cases. LLMs are great if you understand what you are dealing with, but they are no magical automation tool (at least in somewhat niche, semi-technical use cases where seemingly small errors can have serious consequences).

[–] jj4211@lemmy.world 2 points 3 months ago

That's been my experience so far: it's largely useless for knowledge-based stuff.

In programming, you can have it take "pseudocode" and output actionable code in more tedious languages, but you have to audit it. Ultimately I find traditional autocompletion just as useful.
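
A minimal sketch of what I mean (made up for illustration, not real model output): hand it pseudocode like "sum the even numbers in a list" and it will happily spit out Java along these lines, and it's still on you to read and test it.

```java
import java.util.List;

public class EvenSum {
    // Pseudocode given to the model: "sum the even numbers in a list"
    static int sumEvens(List<Integer> numbers) {
        int total = 0;
        for (int n : numbers) {
            if (n % 2 == 0) { // the kind of line you audit: is this the even/odd rule you actually wanted?
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // quick sanity check -- prints 12
        System.out.println(sumEvens(List.of(1, 2, 3, 4, 5, 6)));
    }
}
```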

I definitely see how it helps with cheating on homework, though, and how it extends "stock photography" to the point of really limiting the market for photographers or artists producing bland business assets.

I see how people find it useful for their "professional" communications, but I hate it, because people who used to be nice and to the point are starting to explode their communication into a big LLM mess.