this post was submitted on 21 Oct 2024
507 points (97.7% liked)

[–] GeneralInterest@lemmy.world 11 points 2 hours ago (1 children)

Maybe it's like the dotcom bubble: there is genuinely useful tech that has recently emerged, but too many companies are trying to jump on the bandwagon.

LLMs do seem genuinely useful to me, but of course they have limitations.

[–] datelmd5sum@lemmy.world 4 points 1 hour ago (2 children)

We're hitting logarithmic returns on model training: GPT-5 is going to cost 10x more than GPT-4 to train, but are people going to pay $200/month for a GPT-5 subscription?
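
A quick back-of-the-envelope sketch of what "logarithmic returns" looks like, assuming a Chinchilla-style power-law loss curve (the constants below are made up purely for illustration, not OpenAI's numbers):

```python
# Illustrative only: hypothetical power-law scaling of loss with training compute.
def loss(compute, a=10.0, b=0.05):
    """Hypothetical training loss, assuming loss ~ a * compute**(-b)."""
    return a * compute ** -b

base = 1.0  # arbitrary compute unit, think "one GPT-4-scale training run"
for mult in (1, 10, 100):
    print(f"{mult:>4}x compute -> loss {loss(base * mult):.3f}")
# Each 10x in compute only shaves a few percent off the loss,
# so training cost grows much faster than measurable capability.
```

With an exponent around 0.05, a run that costs 10x more only improves the loss by roughly 11%, which is the worry behind the $200/month question.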

[–] madis@lemm.ee 0 points 43 minutes ago (1 children)

But it would use less energy afterwards, right? At least that's what was claimed for the 4o model, for example.

4o is also not really much better than 4; they likely optimized it, among other things, by reducing the model size. IME the "intelligence" has somewhat degraded over time. Also, a bigger model (which in the past was the deciding factor for better "intelligence") needs more energy, and GPT-5 will likely be much bigger than 4 unless they somehow make a breakthrough in training/optimizing the model...
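
For a rough sense of why model size drives energy use: dense-transformer inference takes on the order of 2 × N FLOPs per generated token, with N the parameter count. The sizes below are pure guesses for illustration; OpenAI hasn't published parameter counts for 4 or 4o.

```python
# Back-of-the-envelope: inference compute (and thus energy) scales with model size.
def flops_per_token(n_params):
    # Rough rule of thumb for a dense transformer: ~2 FLOPs per parameter per token.
    return 2 * n_params

# Hypothetical sizes, NOT published figures for any OpenAI model.
hypothetical = {"smaller optimized model": 200e9, "much bigger model": 1.8e12}
for name, n in hypothetical.items():
    print(f"{name}: ~{flops_per_token(n):.1e} FLOPs per token")
# A ~9x larger model needs ~9x the FLOPs per token, so serving it costs
# roughly that much more energy unless the architecture changes.
```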

[–] GeneralInterest@lemmy.world 0 points 1 hour ago

Businesses might pay big money for LLMs to do specific tasks. And if chip makers invest more in NPUs then maybe LLMs will become cheaper to train. But I am just speculating because I don't have any special knowledge of this area whatsoever.