this post was submitted on 01 Oct 2023
1134 points (97.6% liked)

Technology

[–] tonytins@pawb.social 3 points 1 year ago (3 children)

While I appreciate them going a greener route, if these chat AIs are still this inefficient simply to train, maybe it's best to return them to the research phase.

[–] fleabs@lemmy.world 20 points 1 year ago* (last edited 1 year ago) (1 children)

You say "simply train," but really, the training of these models is the most intensive part. Once they are trained, they require (relatively) less power to actually run for inference.
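As a rough back-of-the-envelope (using the standard scaling heuristics of ~6·N·D FLOPs to train a transformer with N parameters on D tokens, and ~2·N FLOPs per generated token at inference; the parameter and token counts below are illustrative, not figures from the article):

```python
# Back-of-the-envelope training vs. inference cost for a transformer.
# Heuristics: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs per token.
N = 175e9  # parameters (GPT-3 scale, for illustration)
D = 300e9  # training tokens (illustrative)

train_flops = 6 * N * D          # total compute to train once
flops_per_token = 2 * N          # compute to generate one token

# How many inference tokens would match the one-time training cost?
tokens_equal = train_flops / flops_per_token  # = 3 * D

print(f"training:  {train_flops:.2e} FLOPs total")
print(f"inference: {flops_per_token:.2e} FLOPs per token")
print(f"training cost ~= generating {tokens_equal:.2e} tokens")
```

So the one-time training run costs as much as serving on the order of a trillion tokens, which is why the per-query cost looks small by comparison even though the total is huge.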

[–] Corkyskog@sh.itjust.works 0 points 1 year ago (1 children)

So it sounds like they need a shitload of GPU power. You know what else costs a shitload of GPU power? Crypto mining. Could they not outsource the work to all those GPUs that stopped mining crypto once it plummeted?

I am surprised this hasn't become a community project already. I assume there is some limitation that I am unaware of.

[–] pivot_root@lemmy.world 3 points 1 year ago (1 children)

The limitation is intellectual property. You need the model to train it, and no for-profit company is going to just give that away.

[–] Corkyskog@sh.itjust.works 1 points 1 year ago

But they (MS) are planning on doing it either way, so why not crowdsource it and even pay a small pittance for the GPU power? I think it would be popular... there are a lot of sad people with extra GPUs sitting around not being used for much.

[–] Fidelity9373@artemis.camp 4 points 1 year ago

There are tradeoffs. If training LLMs (and similar systems that feed on pure physics data) can improve nuclear processes, then overall it could be a net benefit. Fusion energy research takes a huge amount of power to trigger every test ignition, and we do them all the time, learning little by little.

The real question is if the LLMs are even capable of revealing those kinds of insights to us. If they are, nuclear is hardly the worst path to go down.

[–] mojo@lemm.ee 3 points 1 year ago

That makes zero sense