this post was submitted on 23 Nov 2024
447 points (96.5% liked)

Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 x 500ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
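
A quick back-of-the-envelope check of the headline's internal consistency (a minimal sketch in Python; the battery-capacity figure is my assumption, not something stated in the post):

```python
# Sanity-check the headline: 140 Wh claimed to equal 7 full phone charges.
# Assumption (not from the post): a large-phone battery holds roughly 17-18 Wh.

headline_energy_wh = 140   # Wh claimed for one 100-word email
claimed_charges = 7        # full iPhone Pro Max charges claimed

implied_wh_per_charge = headline_energy_wh / claimed_charges
print(f"Implied energy per charge: {implied_wh_per_charge:.1f} Wh")  # 20.0 Wh

# A ~17-18 Wh battery plus charging losses lands in that ballpark, so the
# two halves of the headline are at least internally consistent.
```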

[–] Naz@sh.itjust.works 7 points 17 hours ago* (last edited 15 minutes ago) (2 children)

Datacenter LLM tranches are 7-8 H100s per user at full load, which is around 4 kW.

Multiply that by generation time and you get your energy used. Say it takes 62 seconds to write an essay (a highly conservative figure).

That's about 68.9 Wh, so you're right.

Source: I'm an AI enthusiast
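
For reference, the arithmetic above works out like this (a minimal sketch in Python; the 4 kW and 62 s figures are the ones given in the comment):

```python
# Reproduce the back-of-the-envelope estimate from the comment above.
# Figures taken from the comment: ~4 kW sustained draw, 62 s generation time.

power_kw = 4.0         # kW drawn by 7-8 H100s serving one user at full load
generation_s = 62.0    # seconds to generate the essay

energy_wh = power_kw * 1000 * generation_s / 3600  # kW -> W, W*s -> J, J/3600 -> Wh
print(f"{energy_wh:.1f} Wh")  # ~68.9 Wh, roughly half the 140 Wh headline figure
```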

[–] oyo@lemm.ee 1 points 3 hours ago (1 children)

kW is a unit of instantaneous power; kW/s makes no sense. Note that multiplying kW/s by seconds would cancel the time out and give you power back, not energy. You got there in the end, though.
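
To spell the units out with the figures from the parent comment: 4 kW × 62 s = 248 kJ ≈ 68.9 Wh, which is an energy, whereas 4 kW/s × 62 s would come out to 248 kW, which is still a power.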

[–] Naz@sh.itjust.works 1 points 16 minutes ago

Woop, noted, thanks

[–] bandwidthcrisis@lemmy.world 6 points 17 hours ago (1 children)

Well that's of the same order of magnitude as the quoted figure. I was suggesting that it sounded vastly larger than it should be.

[–] Naz@sh.itjust.works 6 points 17 hours ago

They're probably factoring in cooling costs and a bunch of other overhead, I dunno
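
One way to see how that kind of overhead could bridge the gap between the ~68.9 Wh compute-only estimate above and the 140 Wh headline figure (a minimal sketch; the overhead multiplier is a hypothetical value, not something stated in the thread):

```python
# Hypothetical illustration: scale the compute-only estimate by a datacenter
# overhead multiplier (cooling, power conversion, networking). The multiplier
# is an assumption for illustration, not a sourced figure.

compute_only_wh = 68.9     # from the estimate earlier in the thread
overhead_multiplier = 2.0  # hypothetical; it would take roughly 2x to reach 140 Wh

total_wh = compute_only_wh * overhead_multiplier
print(f"{total_wh:.0f} Wh")  # ~138 Wh, close to the 140 Wh headline figure
```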